Deakin University

File(s) under permanent embargo

The study of using eye movements to control the laparoscope under a haptically-enabled laparoscopic surgery simulation environment

Version 2 2024-06-03, 23:57
Version 1 2019-03-29, 14:00
conference contribution
posted on 2024-06-03, 23:57 authored by H Zhou, Lei Wei, R Cao, Samer Hanoun, Asim Bhatti, Y Tai, S Nahavandi
© 2018 IEEE. The purpose of this study is to investigate the possibility of using eye movements to control the laparoscope during laparoscopic surgery. Laparoscopic surgery usually requires at least two doctors: a surgeon and a laparoscope assistant, with the assistant providing the operating surgeon's view. As misunderstandings or conflicts in cooperation may arise, ideally the surgeon would have full control of all the instruments, including both the surgical tools and the laparoscope. To achieve this, an eye-based interaction method is introduced in this paper that allows surgeons to control the view themselves. With recent developments in eye tracker platforms and associated eye tracking technologies, many non-contact eye tracking systems are available. Such a system records where a person is looking at any time, along with the sequence of eye movements, and this information indicates where the person's attention and interest lie on a display. The surgeon's attention can thus be captured, and the laparoscope then moved to the corresponding region of interest. To evaluate usability safely and efficiently, a virtual-reality-based laparoscopic surgery simulation is built. It is based on Unity, with two haptic devices simulating the surgical tools, a 3D mouse providing six degrees-of-freedom control of the camera, and an eye tracker capturing eye positions on a display. Experiments on moving the camera left, right, up, down, in, out and to specified locations using the eyes are conducted, and the performance of the proposed eye-based self-control is compared with that of the 3D-mouse-based other-control. The results are promising: the proposed pointing method leads to 43.6% faster completion of the tasks than the traditional other-control method using the 3D mouse.
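The gaze-to-camera mapping the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation (which is built in Unity); it only shows one plausible way to turn a gaze point on the display into incremental laparoscope pan/tilt commands, with an assumed central dead zone so the camera stays still while the surgeon looks near the middle of the view. The function name, the dead-zone radius and the gain are illustrative assumptions.

```python
# Hypothetical sketch: convert a normalized gaze position on the display
# into (yaw, pitch) increments for the laparoscope camera.
# gaze_x, gaze_y are in [0, 1], with the origin at the top-left corner.

DEAD_ZONE = 0.1  # assumed radius around screen centre with no camera motion
GAIN = 2.0       # assumed degrees of rotation per unit gaze offset per update

def gaze_to_camera_delta(gaze_x, gaze_y, dead_zone=DEAD_ZONE, gain=GAIN):
    """Return (yaw_deg, pitch_deg) increments for one control update."""
    # Offset of the gaze point from the screen centre, each in [-0.5, 0.5]
    dx = gaze_x - 0.5
    dy = gaze_y - 0.5
    # Ignore gaze inside the central dead zone: no camera movement
    if (dx * dx + dy * dy) ** 0.5 < dead_zone:
        return 0.0, 0.0
    # Looking right pans right (positive yaw); looking up (smaller gaze_y,
    # since the origin is top-left) tilts the camera up (positive pitch)
    return gain * dx, -gain * dy
```

In a real controller this delta would be applied each frame until the region of interest is centred, which matches the abstract's description of following the surgeon's attention to the region of interest.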

Miyazaki, Japan

Start date


End date

Publication classification

E1 Full written paper - refereed

Copyright notice

2018, IEEE

Title of proceedings

SMC 2018: Proceedings - IEEE International Conference on Systems, Man, and Cybernetics


Systems, Man, and Cybernetics. International Conference (2018 : Miyazaki, Japan)

Place of publication

Piscataway, N.J.