Atul Kumar, PhD1, Shih-Wei Huang, MD1, Yen-Yu Wang, MS1, Kai-Che Liu, PhD1, Wan-Chi Hung, MS2, Yi-Chun Lee, PhD2, Jungle Chi-hsiang Wu1, Jian Hua Lin1, Min Chang Hung, MD1. 1Chang Bing Show Chwan Memorial Hospital, 2IRCAD-Taiwan
Objective: An augmented reality (AR) approach in which a pre-surgical CT/MRI model is superimposed on the surgical scene makes it possible to visualize the structures behind the visible surface. A major challenge in such AR is the registration of the real and the virtual objects. This study presents an augmented reality system for endoscopic surgery in which the real and virtual objects are registered with the help of a Kinect® depth camera.
Method: A CT-scan 3D virtual model (Pctvm) of a phantom (a torso phantom (IOUSFAN®) enclosed in a plastic box) was reconstructed. The AR system comprised a computer, a software system, an NDI® 3D tracking system, a Kinect®, and an endoscope system. The NDI® tracking system’s reference frame served as the global reference frame (GlobalRF) for the AR, and all objects to be tracked during AR were brought into the GlobalRF. 1) Kinect® to GlobalRF: the Kinect® depth camera was fixed with respect to the NDI® tracking system, and their positional relationship was calculated by a landmark registration technique. 2) Endoscope to GlobalRF: an NDI® tracking tool was mounted on the endoscope camera, and a perspective-n-point (PnP) solution was applied to find the camera’s location with respect to the tracking tool. 3) Phantom to GlobalRF: a 3D virtual model of the phantom (Pkvm) was reconstructed using the Kinect®, and Pkvm was registered with Pctvm using the iterative closest point (ICP) algorithm. The system was applied to the phantom and the AR was visualized on a computer monitor. The superimposition error between the real and the virtual scene was measured as the root mean square of the distances (in mm) between the edges of the real scene and the virtual scene in their rendered images.
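The ICP registration of Pkvm with Pctvm can be illustrated with a heavily simplified, translation-only sketch: full ICP also estimates a rotation at each iteration (typically via an SVD-based rigid fit), whereas the loop below only recovers a translation and reports the root-mean-square correspondence distance. All names and data here are illustrative assumptions, not taken from the authors' software.

```cpp
#include <array>
#include <cmath>
#include <limits>
#include <vector>

using Point = std::array<double, 3>;

// Brute-force nearest neighbour of p in 'target' (compares squared distances).
static const Point& nearest(const Point& p, const std::vector<Point>& target) {
    double best = std::numeric_limits<double>::max();
    const Point* bestPt = &target.front();
    for (const Point& q : target) {
        double d2 = 0;
        for (int k = 0; k < 3; ++k) d2 += (p[k] - q[k]) * (p[k] - q[k]);
        if (d2 < best) { best = d2; bestPt = &q; }
    }
    return *bestPt;
}

// Translation-only ICP sketch: repeatedly match each source point to its
// nearest target point, shift the source by the mean residual, and track
// the RMS distance of the correspondences. Returns the final RMS (in the
// same units as the input clouds, e.g. mm).
double icpTranslationOnly(std::vector<Point>& source,
                          const std::vector<Point>& target, int iters) {
    double rms = 0.0;
    for (int it = 0; it < iters; ++it) {
        Point shift{0.0, 0.0, 0.0};
        double sumSq = 0.0;
        for (const Point& p : source) {
            const Point& q = nearest(p, target);
            for (int k = 0; k < 3; ++k) shift[k] += q[k] - p[k];
            for (int k = 0; k < 3; ++k) sumSq += (q[k] - p[k]) * (q[k] - p[k]);
        }
        for (int k = 0; k < 3; ++k) shift[k] /= static_cast<double>(source.size());
        for (Point& p : source)
            for (int k = 0; k < 3; ++k) p[k] += shift[k];  // apply the update
        rms = std::sqrt(sumSq / static_cast<double>(source.size()));
    }
    return rms;
}
```

The RMS value computed inside the loop mirrors the abstract's superimposition-error metric (root mean square of point-to-point distances); in the actual system that metric was evaluated between edges of the rendered real and virtual images rather than between 3D point clouds.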
Results: The complete AR software was developed in C++ and ran on a computer with an Intel® Core™ i7 960 @ 3.20 GHz, 6.00 GB RAM, and 64-bit Windows 7; it used several open-source libraries. The complete system was applied to the phantom at 30 different positions. The superimposition error fell between 5 and 15 mm.
Conclusions: The current study presents an augmented reality system for endoscopic surgery in which the real and virtual objects are registered by aligning the 3D model from the Kinect® with the 3D model from the CT scan.
Presented at the SAGES 2017 Annual Meeting in Houston, TX.
Abstract ID: 79261
Program Number: P592
Presentation Session: Poster (Non CME)
Presentation Type: Poster