Deakin University

GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing

journal contribution
posted on 2024-08-19, 04:41 authored by BJ Hou, J Newn, L Sidenmark, AA Khan, H Gellersen
This paper contributes GazeSwitch, an ML-based technique that optimises real-time switching between eye and head modes for fast and precise hands-free pointing. GazeSwitch reduces false positives from natural head movements and efficiently detects head gestures for input, resulting in an effective hands-free and adaptive technique for interaction. We conducted two user studies to evaluate its performance and user experience. Comparative analyses with baseline switching techniques, Eye+Head Pinpointing (manual) and BimodalGaze (threshold-based), revealed several trade-offs. We found that GazeSwitch provides a natural and effortless experience but trades off control and stability compared to manual mode switching, and requires less head movement than BimodalGaze. This work demonstrates the effectiveness of a machine-learning approach that learns and adapts to patterns in head movement, allowing us to better leverage the synergistic relationship between eye and head input modalities for interaction in mixed and extended reality.
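The core idea — a learned classifier that distinguishes deliberate head gestures (switch to head mode for refinement) from natural head movement (stay in eye mode) — can be illustrated with a minimal sketch. This is not the authors' model: the features (mean and variance of head angular speed over a short window), the tiny perceptron, and the synthetic training data are all illustrative assumptions.

```python
# Illustrative sketch only: GazeSwitch's actual features, model, and data
# are not described here; everything below is an assumed stand-in.
import random

def features(head_speeds):
    """Summarise a short window of head angular speeds (deg/s)."""
    mean = sum(head_speeds) / len(head_speeds)
    var = sum((v - mean) ** 2 for v in head_speeds) / len(head_speeds)
    return [mean, var]

def train_perceptron(samples, labels, epochs=50, lr=0.01):
    """Tiny perceptron: label 1 = deliberate head gesture (switch to
    head mode), label 0 = natural head movement (stay in eye mode)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def should_switch(window, w, b):
    """True if the classifier flags the window as a deliberate gesture."""
    x = features(window)
    return w[0] * x[0] + w[1] * x[1] + b > 0

# Synthetic training data: deliberate gestures are assumed to be faster
# and more variable than natural head movement.
random.seed(0)
natural = [[random.gauss(2, 0.5) for _ in range(10)] for _ in range(50)]
gesture = [[random.gauss(15, 3.0) for _ in range(10)] for _ in range(50)]
X = [features(s) for s in natural + gesture]
y = [0] * 50 + [1] * 50
w, b = train_perceptron(X, y)
```

At runtime, `should_switch` would be evaluated over a sliding window of head-tracking samples; the classifier replaces a fixed head-velocity threshold (as in a BimodalGaze-style baseline) with a boundary learned from movement patterns.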


Location

New York, N.Y.

Open access

  • Yes

Language

eng

Publication classification

C1.1 Refereed article in a scholarly journal

Journal

Proceedings of the ACM on Human-Computer Interaction

Volume

8

Pagination

1-20

ISSN

2573-0142

eISSN

2573-0142

Issue

ETRA

Publisher

Association for Computing Machinery
