A Motion Analysis Platform for Real-Time Laparoscopic Performance Using Mobile and Wearable Technology

Vivian E de Ruijter, MD1, Catherine Wong2, Adrian Rodriguez2, Kiruthiga Nandagopal, BS, MS, PhD1, Lee L Swanstrom, MD, FACS3, James Wall, MD, FACS1. 1Stanford University, Department of Surgery, 2Stanford University, Department of Computer Science, 3IHU-IRCAD, University Hospital of Strasbourg, Strasbourg, France

OBJECTIVE The progression towards milestones-based medical education has highlighted the need for validated assessments of real-time performance of technical skills. Recent studies have provided evidence in support of the use of motion analysis in laparoscopic surgical skills assessment. Unfortunately, high-fidelity motion analysis is typically cost prohibitive, limiting broad accessibility and its use in training and research. In this abstract, we present a portable, wearable motion-metric competency platform: a low-cost interface for real-time assessment of surgical performance for education and self-reflective training. In addition, we compare available mobile and wearable technologies to determine the best interface for this application.

METHODS This study is conducted in two phases: a proof-of-concept and a pre-clinical feasibility study in box trainers to establish construct validity. For both phases, dedicated optical tracking software was developed with the OpenCV4Android SDK. In each phase, biocompatible colored tape was attached to the instrument tip as a marker; the platform combines accelerometer data with optical color tracking of the surgical instruments. The proof-of-concept was performed with the tracking platform using a smartphone and a laparoscopic box trainer. The researchers preliminarily tested the tracking software during a Fundamentals of Laparoscopic Surgery (FLS) peg transfer task and an intra-corporeal suturing task to determine: 1) the percentage of time during which the software identified the contour and color of the tracked instrument, 2) the time of task completion in seconds (s), and 3) the instrument path length in pixel (px) coordinates. The second phase utilized 1) a smartphone, 2) Google Glass, and 3) an FLS box-trainer webcam. A group of novices and experts performed the FLS peg transfer task. Parameters of efficiency and economy of motion (the instrument's path length (cm), time of procedure (s), and smoothness of movements (cm/s³)) were measured and compared between the two groups for each system.
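The optical color-tracking step can be illustrated with a minimal sketch. This is not the platform's OpenCV4Android implementation; it is a pure-Python toy (function names and the synthetic frame are invented for illustration) showing the underlying idea: threshold pixels near the marker-tape color and take the centroid of the matching region as the instrument-tip position.

```python
def track_marker(frame, target, tol=30):
    """Locate a colored tape marker in an RGB frame (nested lists of
    (r, g, b) tuples) by thresholding pixels near the target color and
    returning the centroid (x, y) of matches, or None if none found."""
    tr, tg, tb = target
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if abs(r - tr) <= tol and abs(g - tg) <= tol and abs(b - tb) <= tol:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Synthetic 6x6 frame: black background with a 2x2 "green tape" patch.
frame = [[(0, 0, 0)] * 6 for _ in range(6)]
for y in (1, 2):
    for x in (2, 3):
        frame[y][x] = (40, 200, 60)

print(track_marker(frame, target=(50, 210, 50)))  # → (2.5, 1.5)
```

In the real system this runs per video frame, and the sequence of centroids forms the instrument trajectory from which the motion metrics are derived.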

PRELIMINARY RESULTS Across the 20 tests performed, the optical tracking software tracked the laparoscopic instruments an average of 98.21% of the time across both FLS tasks (98.47% for the peg transfer and 98.21% for the intra-corporeal suturing task). The peg transfer task was completed in an average of 69.5 s (SD 3.2), with an average path length of 10.22 px (SD 4.97) for the right instrument and 42.57 px (SD 11.70) for the left instrument. The intra-corporeal suturing task was completed in 102.6 s (SD 25.96), with an average path length of 42.34 px (SD 13.27) for the right instrument and 64.08 px (SD 18.32) for the left instrument. Data collection for the second phase is ongoing and will be described during our presentation.
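The economy-of-motion metrics reported above can be computed directly from the tracked tip positions. A minimal sketch, assuming uniformly sampled 2-D coordinates (the function names are illustrative, not the platform's actual API); smoothness is taken here as mean jerk magnitude, i.e. the third derivative of position, matching the cm/s³ units:

```python
import math

def path_length(points):
    """Total distance traveled along a sequence of (x, y) tip positions."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def smoothness(points, dt):
    """Mean jerk magnitude (third derivative of position, units/s^3),
    estimated by repeated finite differences at sampling interval dt."""
    def diff(seq):
        return [(b[0] - a[0], b[1] - a[1]) for a, b in zip(seq, seq[1:])]
    vel = [(dx / dt, dy / dt) for dx, dy in diff(points)]
    acc = [(dx / dt, dy / dt) for dx, dy in diff(vel)]
    jerk = [(dx / dt, dy / dt) for dx, dy in diff(acc)]
    if not jerk:
        return 0.0
    return sum(math.hypot(jx, jy) for jx, jy in jerk) / len(jerk)

pts = [(0, 0), (1, 0), (2, 0), (4, 0), (7, 0)]  # tip positions at 1 s intervals
print(path_length(pts))       # → 7.0
print(smoothness(pts, dt=1))  # → 0.5
```

With pixel coordinates (as in phase one) the path length is in px; a camera calibration scale converts it to cm, as used in phase two.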

FUTURE DIRECTIONS This study provides preliminary data and demonstrates the feasibility of a smartphone- and wearable-based portable competency platform for real-time laparoscopic performance assessment based on motion tracking. The software runs on both Android-based phones and head-mounted displays, making it possible to use during real-time surgical procedures and potentially to extend to other surgical and endoscopic disciplines. Further studies are ongoing to demonstrate the feasibility and reliability of this platform in differentiating experts from novices, and in clinical practice.

« Return to SAGES 2015 abstract archive