Francisco M. Sánchez Margallo, PhD, Juan A. Sánchez-Margallo, PhD, José L. Moyano-Cuevas. Minimally Invasive Surgery Centre, Cáceres, Spain
INTRODUCTION: Surgical environments require special aseptic conditions for direct interaction with preoperative images and surgical equipment, which hampers the use of traditional input devices. We previously presented the feasibility of using a natural user interface (NUI) for gesture control, combined with voice control, to interact directly, intuitively, and sterilely with preoperative images and the functionalities of an integrated operating room (OR) during laparoscopic surgery. In this study, we assess the efficiency and face validity of this NUI for medical image navigation and remote control during the performance of a set of basic tasks in the OR.
METHODS AND PROCEDURES: Twenty experienced laparoscopic surgeons participated in this study. They performed 25 basic tasks in the OR focused on interaction with a medical image viewer (OsiriX; Pixmeo) and with the functionalities of the integrated OR (OR1; Karl Storz). These tasks were carried out both by traditional manual interaction, using a computer keyboard, mouse, and touchscreen, and by a gesture control sensor (Myo armband) in combination with voice commands. This NUI is controlled by the TEDCUBE system (TEDCAS Medical Systems). The time required to complete the tasks with each interaction method was recorded. After completing the tasks, participants filled out a questionnaire for face validation and usability assessment.
RESULTS: The NUI required significantly less time than conventional manual control to display preoperative studies and information for surgical support. However, interaction with the medical image viewer was significantly faster using the traditional input devices. Participants rated the NUI as an intuitive, simple, and versatile tool that improves sterility during surgical activity. Seventy-five percent of the participants would choose the gesture control system as their method of interaction with the patient’s preoperative information during surgery.
CONCLUSIONS: The presented gesture control system allows surgeons to interact directly with preoperative imaging studies and the functionalities of an integrated OR during surgery while maintaining aseptic conditions. When comparing against traditional manual interaction, the reaction and displacement times of the technician executing the surgeon’s requests must be taken into account. A more personalized medical image viewer, with tighter integration with the capabilities of the presented gesture control system, is required.
Presented at the SAGES 2017 Annual Meeting in Houston, TX.
Abstract ID: 87893
Program Number: P488
Presentation Session: iPoster Session (Non CME)
Presentation Type: Poster