Maki Sugimoto, MD, IUHW Graduate School
We developed an integrated surgical navigation system for laparoscopic surgery that combines patient-based immersive virtual reality, holographic augmented reality, and mixed reality.
First, by reconstructing patient-specific 3D surface models of each organ from the patient's MDCT images, we developed an immersive VR navigation system using side-by-side volume rendering. Second, we developed a holographic augmented reality system that senses the user's hand and arm position with a motion sensor and 3D glasses. We built a new spatial imaging system that interactively superimposes 3D holograms onto 3D-printed organ models by tracking the user's head and hand/arm position. This allowed the user to manipulate the spatial attributes of both the virtual and the printed organs, enhancing spatial reasoning and augmenting tangibility.
We also integrated Google Tango, a technology that uses computer vision to let mobile devices determine their position relative to the surrounding world without GPS or other external signals. This enables application developers to create user experiences that include indoor navigation, 3D mapping, physical space measurement, environmental recognition, augmented reality, and windows into a virtual world.
We will report illustrative benefits of immersive VR/AR/MR devices (VR-HMD, 3D holographic tablet, VIVE, Oculus, Google Tango, zSpace, and HoloLens) for surgical planning, simulation, and image-guided navigation.
Presented at the SAGES 2017 Annual Meeting in Houston, TX.
Abstract ID: 84479
Program Number: ET008
Presentation Session: Emerging Technology Session
Presentation Type: Podium