Steen M Hansen, MD¹; Michael S Hansen²; Peter Funch-Jensen, MD, DMSc, Professor³. ¹Aalborg University Hospital; ²Vrinno; ³Department of Clinical Medicine, Aarhus University Hospital
Objective: Tracking of laparoscopic instruments offers an opportunity to supervise junior surgeons during their training and to provide performance feedback during surgery.
Currently available technologies for tracking instruments include optical tracking, magnetic tracking, gyroscopes, and accelerometers. However, none of these methods relates the instrument position to the surface of the operating field, which is essential for understanding the correlation between instrument navigation and surgical outcomes.
We present a novel method in which a structured light pattern is used to track a laparoscopic instrument, relating its position and movements to the surgical surface during a procedure. Our technology provides a dataset that enables a range of new, essential performance metrics on instrument-tissue interaction, such as the number of times the instrument contacts the surgical surface, the duration of tissue-instrument contact, the distance between instrument and tissue, the acceleration of the instrument toward the tissue, and the instrument orientation within the surgical field. The data can be stored and used for machine learning to evaluate surgical performance and to provide guidance during operations for operating surgeons and surgical robots.
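To illustrate how such metrics could be derived, the sketch below computes two of them (number of contact events and total contact duration) from a sampled time series of instrument-to-tissue distances. The threshold, sampling rate, and data are illustrative assumptions, not values from the study.

```python
# Hypothetical sketch: deriving instrument-tissue interaction metrics
# from a time series of instrument-to-surface distances (mm).
# CONTACT_THRESHOLD_MM is an assumed value for illustration only.
from typing import List, Tuple

CONTACT_THRESHOLD_MM = 0.5  # distances below this count as "contact"

def contact_metrics(times_s: List[float],
                    dists_mm: List[float]) -> Tuple[int, float]:
    """Return (number of contact events, total contact duration in s)."""
    events = 0
    duration = 0.0
    in_contact = False
    for i, d in enumerate(dists_mm):
        touching = d < CONTACT_THRESHOLD_MM
        if touching and not in_contact:
            events += 1  # a new contact event begins
        if touching and i + 1 < len(times_s):
            duration += times_s[i + 1] - times_s[i]
        in_contact = touching
    return events, duration

# Example: two brief contacts in a 30 Hz trace
t = [i / 30.0 for i in range(6)]
d = [2.0, 0.3, 0.2, 1.5, 0.4, 1.8]
print(contact_metrics(t, d))
```

Further metrics such as approach acceleration would follow analogously, via finite differences on the same distance trace.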
Method: Using a low-cost miniature projector probe (Ø 1.6 mm) in a controlled setup, a structured light pattern was projected onto a surface. The prototype structured light pattern was formed as a grid. A commercially available laparoscopic camera provided real-time vision of the projected structured light. Data from the laparoscopic camera were extracted in real time and digitally processed by the 3DIntegrated algorithm. Using triangulation methods, the real-time position of the laparoscopic instrument was calculated in relation to the surgical cavity surface.
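The core triangulation step can be sketched as follows. This is a generic structured-light/stereo depth calculation under simplifying assumptions (rectified projector-camera pair with known baseline and focal length), not the authors' actual 3DIntegrated algorithm, whose internals are not described in the abstract.

```python
# Minimal triangulation sketch (assumption: rectified projector-camera
# geometry with known baseline and focal length; illustrative only).
def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Depth (mm) of a projected grid point via triangulation:
    z = f * b / d, where d is the pixel shift of a pattern feature
    between its expected and observed image position."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# Example: f = 800 px, baseline = 5 mm, observed disparity = 40 px
print(depth_from_disparity(800.0, 5.0, 40.0))  # → 100.0 (mm)
```

With depths recovered for the grid points, the instrument tip's position relative to the reconstructed surface follows from the same camera geometry.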
Preliminary results: In the controlled setup, the real-time distance from the laparoscopic instrument to the center of attention was measured using the 3DIntegrated projector with an accuracy of 0.10 mm. The instrument orientation was measured with an accuracy of 0.9° in real-time tests.
Conclusions and future directions: Using a structured light pattern, the real-time position of the laparoscopic instrument in relation to the surgical cavity surface was calculated with high accuracy. This approach allows for the generation of data that enables new, essential performance metrics on instrument-tissue interaction. Such metrics can be used for meaningful real-time performance measurement that can be correlated with surgical outcomes. Further, such data are a cornerstone in the development of intelligent instruments for semi-autonomous and, eventually, autonomous surgical robots. In-vivo tests in animal models will be conducted in early 2017.
Presented at the SAGES 2017 Annual Meeting in Houston, TX.
Abstract ID: 84322
Program Number: ET011
Presentation Session: Emerging Technology Session
Presentation Type: Podium