Openly accessible

Spatial activity recognition in a smart home environment using a chemotactic model

Riedel, Daniel E., Venkatesh, Svetha and Liu, Wanquan 2005, Spatial activity recognition in a smart home environment using a chemotactic model, in Proceedings of the 2005 intelligent sensors, sensor networks and information processing conference, IEEE, Piscataway, N.J., pp. 301-306, doi: 10.1109/ISSNIP.2005.1595596.

Attached Files
Name: venkatesh-spatialactivity-2005.pdf
Description: Published version
MIME Type: application/pdf
Size: 1.30 MB
Downloads: 70

Title Spatial activity recognition in a smart home environment using a chemotactic model
Author(s) Riedel, Daniel E.
Venkatesh, Svetha (ORCID: orcid.org/0000-0001-8675-6631)
Liu, Wanquan
Conference name Intelligent sensors, sensor networks and information processing conference (2nd : 2005 : Melbourne, Vic.)
Conference location Melbourne, Vic.
Conference dates 5-8 Dec. 2005
Title of proceedings Proceedings of the 2005 intelligent sensors, sensor networks and information processing conference
Editor(s) Palaniswami, M.
Publication date 2005
Conference series Intelligent Sensors, Sensor Networks and Information Processing Conference
Start page 301
End page 306
Total pages 6
Publisher IEEE
Place of publication Piscataway, N.J.
Keyword(s) biological system modeling
biology computing
chemical processes
chemical technology
hidden Markov models
microorganisms
organisms
signal processing
smart homes
working environment noise
Summary Spatial activity recognition is challenging due to the amount of noise incorporated during video tracking in everyday environments. We address the spatial recognition problem with a biologically-inspired chemotactic model that is capable of handling noisy data. The model is based on bacterial chemotaxis, a process that allows bacteria to change motile behaviour in relation to environmental gradients. Through adoption of chemotactic principles, we propose the chemotactic model and evaluate its performance in a smart house environment. The model exhibits greater than 99% recognition performance with a diverse six class dataset and outperforms the Hidden Markov Model (HMM). The approach also maintains high accuracy (90-99%) with small training sets of one to five sequences. Importantly, unlike other low-level spatial activity recognition models, we show that the chemotactic model is capable of recognising simple interwoven activities.
Notes This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.
ISBN 9780780393998
0780393996
Language eng
DOI 10.1109/ISSNIP.2005.1595596
Field of Research 089999 Information and Computing Sciences not elsewhere classified
Socio Economic Objective 970108 Expanding Knowledge in the Information and Computing Sciences
HERDC Research category E1.1 Full written paper - refereed
Copyright notice ©2005, IEEE
Persistent URL http://hdl.handle.net/10536/DRO/DU:30044626

Document type: Conference Paper
Collections: School of Information Technology
Open Access Collection
Unless expressly stated otherwise, the copyright for items in DRO is owned by the author, with all rights reserved.

Every reasonable effort has been made to ensure that permission has been obtained for items included in DRO. If you believe that your rights have been infringed by this repository, please contact drosupport@deakin.edu.au.

Citation counts: Cited 0 times in Web of Science
Cited 7 times in Scopus
Access Statistics: 233 Abstract Views, 70 File Downloads
Created: Fri, 20 Apr 2012, 11:36:10 EST