The study of using eye movements to control the laparoscope under a haptically-enabled laparoscopic surgery simulation environment
conference contribution posted on 2019-01-16, 00:00, authored by Hailing Zhou, Lei Wei, R Cao, Samer Hanoun, Asim Bhatti, Yonghang Tai, Saeid Nahavandi
© 2018 IEEE. The purpose of this study is to investigate the possibility of using eye movements to control the laparoscope during laparoscopic surgery. Laparoscopic surgery usually requires at least two doctors: a surgeon and a laparoscope assistant, with the assistant providing the operating surgeon's view. Since misunderstandings or conflicts in this cooperation can arise, it would be ideal for the surgeon to have full control of all instruments, including both the surgical tools and the laparoscope. To achieve this, an eye-based interaction method is introduced in this paper that allows surgeons to control the view by themselves. With recent developments in eye tracker platforms and associated eye tracking technologies, many non-contact eye tracking systems are available. Such a system can record where a person is looking at any time, along with the sequence of their eye movements, and this information reveals where the person's attention and interest lie on a display. As such, the surgeon's attention can be captured and then followed by moving the laparoscope to the region of interest. To evaluate usability safely and efficiently, a virtual reality based laparoscopic surgery simulation is built. It is based on Unity, with two haptic devices simulating the surgical tools, a 3D mouse providing 6 degrees-of-freedom control of the camera, and an eye tracker capturing gaze positions on a display. Experiments are conducted on moving the camera left, right, up, down, in, out, and to specified locations using the eyes; moreover, the performance of the proposed eye-based self-control is compared with that of 3D mouse based other-control. The results are promising: the proposed pointing method leads to 43.6% faster task completion than the traditional other-control method using the 3D mouse.
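The core mechanism the abstract describes, turning a gaze point on the display into laparoscope camera motion toward the region of interest, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, the central dead zone, and all thresholds are assumptions added for clarity.

```python
def gaze_to_camera_command(gaze_x, gaze_y, screen_w=1920, screen_h=1080,
                           dead_zone=0.15):
    """Map a gaze point (in pixels) to a (pan, tilt) command in [-1, 1].

    A central dead zone keeps the camera still while the user looks near
    the middle of the view; outside it, the offset from the centre drives
    proportional left/right (pan) and up/down (tilt) motion.
    """
    # Normalise the gaze point to [-1, 1] with (0, 0) at the screen centre.
    nx = (gaze_x / screen_w) * 2.0 - 1.0
    ny = (gaze_y / screen_h) * 2.0 - 1.0

    def apply_dead_zone(v):
        if abs(v) <= dead_zone:
            return 0.0
        # Rescale so motion ramps up smoothly from the dead-zone edge
        # instead of jumping when the gaze leaves the central region.
        sign = 1.0 if v > 0 else -1.0
        return sign * (abs(v) - dead_zone) / (1.0 - dead_zone)

    return apply_dead_zone(nx), apply_dead_zone(ny)
```

In a Unity-style setup such as the one described, the resulting (pan, tilt) pair would be applied each frame to rotate or translate the simulated laparoscope camera; the paper's in/out (zoom) motions would need an additional input dimension, e.g. dwell time or a separate trigger.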