Stereoscopic Augmented Reality for Laparoscopic Surgery

Xin Kang, Wilson Emmanuel, Kyle Wu, Aaron Martin, Timothy Kane, Craig A Peters, Kevin Cleary, Raj Shekhar

Sheikh Zayed Institute for Pediatric Surgical Innovation, Children’s National Medical Center, Washington DC

Objective: Visual information is critical to safe and effective surgical outcomes, particularly in laparoscopic procedures, where haptic feedback is limited. Traditional laparoscopes provide a flat representation of the three-dimensional (3D) operative field and cannot visualize internal structures beneath visible organ surfaces. Although computed tomography and magnetic resonance imaging can define internal anatomy, this information is difficult to fuse into the surgeon’s visual field in real time because the anatomy deforms during surgery.

Using real-time stereoscopic camera technology now available for conventional laparoscopic surgery, we have developed a novel visualization capability called stereoscopic augmented reality (AR). Designed and developed by a team of engineers and minimally invasive surgeons, the stereoscopic AR system merges live laparoscopic ultrasound images with stereoscopic laparoscopic video. Stereoscopic AR visualization provides minimally invasive surgeons with two new visual cues: (1) perception of true depth and an improved understanding of the 3D spatial relationships among anatomical structures, and (2) visualization of critical internal structures, such as blood vessels and bile ducts, and of surgical targets, such as tumors, along with a more comprehensive visualization of the operative field.

Methods: The stereoscopic AR system was designed with clinical translation as a near-term goal and integrates seamlessly into the existing surgical workflow. The system consists of a 5-mm diameter laparoscopic stereoscopic vision system (VSII, Visionsense, New York, NY) and a 10-mm diameter laparoscopic ultrasound system (flexFocus 700, BK Medical, Herlev, Denmark). Both imaging devices are FDA approved. Spatial registration between the two devices is achieved through an optical tracker (Polaris, Northern Digital, Waterloo, Canada). Purpose-built fixtures, on which reflective spheres are mounted for optical tracking, are attached near the external tips of the two devices. Specialized software processes the streaming imaging data from the devices and registers them using the optical tracking data in real time. The result is two ultrasound-augmented video streams (one each for the left and right eyes), which, when viewed on a 3D monitor, give the operator a live stereoscopic AR view of the operative field. Under an Institutional Animal Care and Use Committee-approved protocol, the team conducted a series of stereoscopic AR interrogations of the liver, gallbladder and biliary system, kidneys, and lungs in two swine.
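The registration step described above amounts to chaining the live tracker poses of the two fixtures with fixed calibration transforms, then projecting each ultrasound point into the laparoscope image. The sketch below, which is not part of the abstract, illustrates one conventional way such a transform chain is composed; all matrix names, the calibration convention, and the pinhole intrinsics are illustrative assumptions, not details of the authors' implementation.

```python
import numpy as np

def to_homogeneous(p):
    """Append 1 to a 3D point to obtain homogeneous coordinates."""
    return np.append(p, 1.0)

def map_us_point_to_pixel(p_us, T_tracker_usFixture, T_tracker_camFixture,
                          T_usFixture_usImage, T_cam_camFixture, K):
    """Map a 3D point in ultrasound-image coordinates to a laparoscope pixel.

    Convention (assumed): T_a_b is a 4x4 transform taking coordinates
    expressed in frame b into frame a.
      T_tracker_usFixture, T_tracker_camFixture : live poses of the tracked
                                                  fixtures reported by the
                                                  optical tracker
      T_usFixture_usImage, T_cam_camFixture     : fixed calibration transforms
                                                  determined offline
      K                                         : 3x3 pinhole camera intrinsics
    """
    # Chain: camera <- camera fixture <- tracker <- ultrasound fixture <- image
    T_cam_usImage = (T_cam_camFixture
                     @ np.linalg.inv(T_tracker_camFixture)
                     @ T_tracker_usFixture
                     @ T_usFixture_usImage)
    p_cam = T_cam_usImage @ to_homogeneous(p_us)
    uv = K @ p_cam[:3]          # perspective projection
    return uv[:2] / uv[2]       # pixel coordinates in the laparoscope image
```

In practice this mapping would be applied (as a single warp, not point by point) to every ultrasound frame for each of the two stereoscopic video channels, so overlay accuracy depends on both the tracker poses and the quality of the two calibration transforms.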

Results: The preclinical studies demonstrated the feasibility of stereoscopic AR during in vivo procedures. The system recorded images from the individual devices as well as the AR video. The figure shows representative images from stereoscopic AR interrogation of the liver: the AR image (right) is the overlay of the camera image (left) and the time-matched ultrasound image (center). Note that our system produces two-channel stereoscopic video; only single-channel images are shown here. The system exhibited no perceptible latency and acceptable overlay accuracy.

Conclusions: We have presented, to our knowledge, the first in vivo use of a complete system with stereoscopic AR visualization capability. This new capability introduces new visual cues and thus enhances visualization of the surgical anatomy within the existing clinical framework. Additional development and testing are necessary, but the system shows promise to improve the precision and expand the capability of minimally invasive laparoscopic surgery.

Figure: Fused image

Session: Podium Presentation

Program Number: ET001

SAGES 2013