Wei, Lei, Le, Vu, Abdelrahman, Wael, Hossny, Mohammed, Creighton, Douglas and Nahavandi, Saeid 2012, Kinect crowd interaction, in SimTecT 2012 : Simulation-integrated solutions : Proceedings of the Annual Asia Pacific Simulation Technology and Training Conference, [SimTecT], [Adelaide, S.Aust.], pp. 1-6.
Most state-of-the-art commercial simulation software focuses on giving avatars in a scenario realistic animation and convincing artificial intelligence. Far less attention has been paid to triggering scenario events and avatar reactions in a natural and intuitive way. Events are typically triggered at predefined timestamps: once set, there is no easy way to generate new events interactively while the scene is running, which makes it difficult to affect avatar reactions dynamically. We therefore propose a framework that uses human gesture as input to trigger events within a DI-Guy simulation scenario in real time, giving users direct control over events and avatar reactions. With this framework, the user's intentions can be identified interactively, and the avatars can be made to react accordingly.
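The core idea of the abstract — recognized gestures acting as real-time triggers for scenario events and avatar reactions — can be sketched as a simple dispatcher. This is a minimal illustrative sketch only; the class and gesture names below are hypothetical, and the paper's actual Kinect skeleton-tracking and DI-Guy scenario APIs are not reproduced here.

```python
class GestureEventMapper:
    """Illustrative dispatcher: fire scenario events when a gesture arrives.

    Stands in for the paper's framework, which routes recognized Kinect
    gestures to DI-Guy scenario events at runtime rather than relying on
    predefined timestamps.
    """

    def __init__(self):
        # gesture label -> list of event callbacks to fire
        self._bindings = {}

    def bind(self, gesture, callback):
        """Associate a scenario event callback with a gesture label."""
        self._bindings.setdefault(gesture, []).append(callback)

    def on_gesture(self, gesture):
        """Called by the gesture recognizer; fires any bound events."""
        return [callback() for callback in self._bindings.get(gesture, [])]


# Usage: bind a hypothetical "wave" gesture to an avatar reaction.
mapper = GestureEventMapper()
mapper.bind("wave", lambda: "avatar_waves_back")
print(mapper.on_gesture("wave"))  # ['avatar_waves_back']
print(mapper.on_gesture("jump"))  # [] -- no binding, no reaction
```

Decoupling gesture recognition from event dispatch in this way is what lets new events be generated while the scene is running, instead of being fixed in advance.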
Field of Research
080602 Computer-Human Interaction
Socio Economic Objective
970108 Expanding Knowledge in the Information and Computing Sciences
Unless expressly stated otherwise, the copyright for items in Deakin Research Online is owned by the author, with all rights reserved.
Every reasonable effort has been made to ensure that permission has been obtained for items included in DRO.
If you believe that your rights have been infringed by this repository, please contact email@example.com.