File(s) under permanent embargo
The study of using eye movements to control the laparoscope under a haptically-enabled laparoscopic surgery simulation environment
Version 2 2024-06-03, 23:57
Version 1 2019-03-29, 14:00
conference contribution
posted on 2024-06-03, 23:57, authored by H Zhou, Lei Wei, R Cao, Samer Hanoun, Asim Bhatti, Y Tai, S Nahavandi

© 2018 IEEE. The purpose of this study is to investigate the possibility of using eye movements to control the laparoscope during laparoscopic surgery. Laparoscopic surgery usually requires at least two doctors: a surgeon and a laparoscope assistant, who provides the operating surgeon's view. Because misunderstandings or conflicts in this cooperation may occur, an ideal arrangement is for the surgeon to have full control of all the instruments, including both the surgical tools and the laparoscope. To achieve this, an eye-based interaction method is introduced in this paper that allows surgeons to control the view themselves. With recent developments in eye-tracker platforms and associated eye-tracking technologies, many non-contact eye-tracking systems are available. Such a system can record where a person is looking at any time, along with the sequence of eye movements, and this information reveals where the person's attention and interest lie on a display. As such, the surgeon's attention can be captured and the laparoscope then moved to the region of interest. To evaluate the method's usability safely and efficiently, a virtual-reality-based laparoscopic surgery simulation is built. It is based on Unity, with two haptic devices simulating the surgical tools, a 3D mouse providing 6-degrees-of-freedom control of the camera, and an eye tracker capturing gaze positions on a display. Experiments on moving the camera left, right, up, down, in, out, and to specified locations using the eyes are conducted, and the performance of the proposed eye-based self-control is compared with that of the 3D-mouse-based other-control. The results are promising: the proposed pointing method leads to 43.6% faster task completion than the traditional other-control method using the 3D mouse.
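The gaze-driven camera control described in the abstract can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the function name, the dead-zone size, and the gain are hypothetical assumptions. It maps a normalized gaze point on the display to a pan/tilt velocity for a simulated laparoscope camera, with a central dead zone so the view stays still while the surgeon looks near the centre of the screen.

```python
# Illustrative sketch, NOT the paper's implementation.
# Assumptions: gaze coordinates are normalized to [0, 1] with (0.5, 0.5)
# at the screen centre; dead_zone and gain are hypothetical parameters.

def gaze_to_camera_command(gaze_x, gaze_y, dead_zone=0.2, gain=1.0):
    """Return (pan, tilt) camera velocities from a normalized gaze point."""
    # Offset of the gaze point from the screen centre, each in [-0.5, 0.5].
    dx = gaze_x - 0.5
    dy = gaze_y - 0.5

    # Inside the central dead zone the camera does not move,
    # so small involuntary eye movements do not shake the view.
    pan = 0.0 if abs(dx) < dead_zone else gain * dx
    tilt = 0.0 if abs(dy) < dead_zone else gain * dy
    return pan, tilt

# Example: looking at the screen centre keeps the camera still,
# while looking toward the right edge pans the camera right.
centre = gaze_to_camera_command(0.5, 0.5)
right = gaze_to_camera_command(0.9, 0.5)
```

In a Unity-based simulation such as the one the paper describes, velocities like these would typically be applied to the camera transform each frame; the in/out (zoom) axis could be handled the same way from a separate trigger, such as dwell time on a region of interest.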
History
Pagination: 3022-3026
Location: Miyazaki, Japan
Publisher DOI:
Start date: 2018-10-07
End date: 2018-10-10
ISBN-13: 9781538666500
Language: eng
Publication classification: E1 Full written paper - refereed
Copyright notice: 2018, IEEE
Title of proceedings: SMC 2018: Proceedings - IEEE International Conference on Systems, Man, and Cybernetics
Event: Systems, Man, and Cybernetics. International Conference (2018 : Miyazaki, Japan)
Publisher: IEEE
Place of publication: Piscataway, N.J.
Keywords: 170205 Neurocognitive Patterns and Neural Networks; 080109 Pattern Recognition and Data Mining; 090303 Biomedical Instrumentation; 090609 Signal Processing; 080602 Computer-Human Interaction; 970117 Expanding Knowledge in Psychology and Cognitive Sciences; 970108 Expanding Knowledge in the Information and Computing Sciences; 970110 Expanding Knowledge in Technology; 970109 Expanding Knowledge in Engineering; 4608 Human-centred computing; 4003 Biomedical engineering; 4006 Communications engineering