Development of a Smart Trocar for automated instrument recognition during laparoscopic surgery

Giulia Toti, MS, Marc Garbey, PhD, Brian J Dunkin, MD, FACS, Vadim Sherman, MD, FRCSC, FACS, Barbara Bass, MD

Department of Computer Science, University of Houston, Houston, TX; Methodist Institute for Technology, Innovation and Education, Houston, TX; Department of Surgery, The Methodist Hospital, Houston, TX

Objective: Complex minimally invasive surgical procedures require coordination among team members. Ideally, intelligent operating rooms would aid in this coordination by tracking surgical progress and alerting the surgeon and his/her team about key decision points during the procedure. Data concerning technical options, variations in anatomy, and required instrumentation would also be provided. Such a system must be able to automatically track the progress of a surgical procedure. Well-scripted operations such as gastric bypass and colon resection follow a recognizable “signature” of instrument use that could be tracked to monitor procedure progress. This study developed a Smart Trocar to automatically recognize laparoscopic instrument type during minimally invasive surgery.
Methods: A standard laparoscopic trocar (Step™ Bladeless Trocar, Covidien, Mansfield, MA) was adapted by attaching a small, battery-powered, disposable wireless camera to the side of the valve head, focused outward toward the inserted laparoscopic instrument (Figure). A separate perforated color wheel was attached to a standard laparoscopic hand instrument near the handle. Each wheel is color coded for a specific instrument. The camera “sees” the color wheel and transmits the image wirelessly to an in-room computer. Using a computer vision algorithm developed in our lab, the computer recognizes the pattern of color markers and correlates it to a library of registered instruments. To test the accuracy of instrument identification, two sets of 25 different markers were analyzed. Color wheel position, distance from the trocar, and angle of entry into the trocar were varied. Different paint qualities (lucid or opaque) were also tested. Each wheel contained 1 to 3 color markers chosen from a palette of 5 colors: pink, yellow, blue, green, and red-orange. A total of 450 static images were acquired from the system and analyzed.
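The 25 distinct markers follow from the combinatorics of the palette: choosing 1, 2, or 3 unordered colors from 5 gives 5 + 10 + 10 = 25 codes. The sketch below illustrates this counting and a simple lookup of a detected color set against a registered-instrument library; it is a minimal illustration, assuming unordered color combinations, and the instrument names and the `identify` helper are hypothetical, not part of the authors' system.

```python
from itertools import combinations

# The 5-color palette described in the Methods section.
PALETTE = ["pink", "yellow", "blue", "green", "red-orange"]

# Every unordered code of 1 to 3 colors drawn from the palette:
# C(5,1) + C(5,2) + C(5,3) = 5 + 10 + 10 = 25 distinct codes.
codes = [frozenset(c) for r in (1, 2, 3) for c in combinations(PALETTE, r)]

# Hypothetical instrument library: each code maps to one registered
# instrument (placeholder names for illustration only).
library = {code: f"instrument-{i:02d}" for i, code in enumerate(codes)}

def identify(detected_colors):
    """Match a set of colors detected on the wheel to a registered
    instrument; return None if the code is not in the library."""
    return library.get(frozenset(detected_colors))

print(len(codes))                          # 25 distinct color-wheel codes
print(identify(["blue", "green"]))         # a registered instrument name
print(identify(["blue", "blue", "green"])) # duplicates collapse to one code
```

Using `frozenset` keys makes the lookup insensitive to the order in which colors are reported by the vision algorithm, which matches the unordered-combination count of 25.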

Results: Opaque paint resulted in the most accurate tool recognition, with correct identification of the marker in 98% of captured images. The 3 false identifications resulted from incorrect color recognition (false positives). Lucid paint resulted in lower accuracy, with correct identification in 96.6% of the images; there were 10 false positives and 1 false negative (failure to recognize the green color at one of the wheel markers). Neither the distance between the color wheel and the camera nor the angle of tool insertion affected accuracy.
Conclusions: This study demonstrates that an inexpensive, disposable wireless camera and color-wheel system can accurately identify standard laparoscopic hand instruments as they pass through the Smart Trocar. The system can be adapted to any standard laparoscopic trocar and instrument, with at least 25 color wheels to choose from based on a 5-color palette. Automated tool recognition is the first step toward developing an intelligent operating room that can augment the performance of a surgeon and his or her team. Future work will include testing the system in low ambient light conditions, such as those commonly encountered in the OR during laparoscopic surgery.


Session: Poster Presentation

Program Number: ETP046

SAGES 2013 abstract archive