UGI: a multi-dimensional ultrasonic-based interaction approach
Version 2 2024-06-06, 12:03
Version 1 2016-10-13, 15:17
conference contribution
posted on 2024-06-06, 12:03, authored by M Babaei, Marwan Al-Jemeli, I Avazpour
We are currently witnessing an era where interaction with computers is no longer limited to conventional methods (i.e. keyboard and mouse). Human-Computer Interaction (HCI), as a progressive field of research, has opened up alternatives to traditional interaction techniques. Embedded infrared (IR) sensors, accelerometers and RGBD cameras have become common inputs for devices to recognize gestures and body movements. These sensors are vision-based and, as a result, the devices that incorporate them rely on the presence of light. Ultrasonic sensors, on the other hand, do not suffer this limitation, as they utilize the properties of sound waves. These sensors, however, have mainly been used for distance detection rather than in HCI devices. This paper presents our approach to developing a multi-dimensional interaction input method and tool, Ultrasonic Gesture-based Interaction (UGI), that utilizes ultrasonic sensors. We demonstrate how these sensors can detect object movements and recognize gestures. We present our approach to building the device and demonstrate sample interactions with it. We have also conducted a user study to evaluate our tool and its distance and micro-gesture detection accuracy. This paper reports these results and outlines our future work in the area.
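For context on the distance-detection principle the abstract refers to, the minimal sketch below shows how a generic time-of-flight ultrasonic ranger converts an echo round trip into a distance estimate. It is an illustrative assumption about ultrasonic ranging in general, not the authors' UGI device or code; the constant and function names are hypothetical.

    # Illustrative sketch only (not the UGI implementation): converting an
    # ultrasonic echo round-trip time into a distance estimate, the basic
    # principle behind using these sensors for ranging and gesture input.

    SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 C

    def echo_to_distance_m(round_trip_s: float) -> float:
        """Estimate distance to the reflecting object from the echo round trip."""
        # The pulse travels to the object and back, so halve the path length.
        return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

    if __name__ == "__main__":
        # Example: a 2.9 ms round trip corresponds to roughly 0.5 m.
        print(f"{echo_to_distance_m(0.0029):.3f} m")

Because the measurement depends on sound rather than reflected light, it works in darkness, which is the advantage over vision-based sensing highlighted in the abstract.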
History
Pagination
1-8
Location
Launceston, Tas.
Start date
2016-11-30
End date
2016-12-02
ISBN-13
9781450336734
Language
eng
Publication classification
E Conference publication, E1 Full written paper - refereed
Copyright notice
2016, ACM
Title of proceedings
OzCHI 2016: Proceedings of the 28th Australian Conference on Human-Computer Interaction
Event
Human-Computer Interaction. Australian Conference (28th : 2016 : Launceston, Tas.)