
A Smith-Waterman local alignment approach for spatial activity recognition

Riedel, Daniel E., Venkatesh, Svetha and Liu, Wanquan 2006, A Smith-Waterman local alignment approach for spatial activity recognition, in AVSS 2006 : Proceedings of the IEEE International Conference on Video and Signal Based Surveillance, IEEE, [Washington, D. C.], pp. 54-59.

Attached Files
venkatesh-asmithwaterman-2006.pdf (Published version, application/pdf, 174.60 KB)

Title A Smith-Waterman local alignment approach for spatial activity recognition
Author(s) Riedel, Daniel E.
Venkatesh, Svetha (ORCID: orcid.org/0000-0001-8675-6631)
Liu, Wanquan
Conference name IEEE International Conference on Video and Signal Based Surveillance (2006 : Sydney, N. S. W.)
Conference location Sydney, N. S. W.
Conference dates 22-24 Nov. 2006
Title of proceedings AVSS 2006 : Proceedings of the IEEE International Conference on Video and Signal Based Surveillance
Editor(s) [Unknown]
Publication date 2006
Conference series IEEE International Conference on Video and Signal Based Surveillance
Start page 54
End page 59
Total pages 6
Publisher IEEE
Place of publication [Washington, D. C.]
Keyword(s) sequence similarity
Smith-Waterman local alignment
spatial activity recognition
spatial data
Summary In this paper we address the spatial activity recognition problem with an algorithm based on Smith-Waterman (SW) local alignment. The proposed SW approach utilises dynamic programming over two-dimensional spatial data to quantify sequence similarity. SW is well suited to spatial activity recognition because it is robust to noise and can accommodate gaps arising from tracking-system errors. Unlike other approaches, SW is able to locate and quantify activities embedded within extraneous spatial data. Through experimentation with a three-class data set, we show that the proposed SW algorithm is capable of recognising both accurately and inaccurately segmented spatial sequences. To benchmark the technique's classification performance we compare it with the discrete hidden Markov model (HMM). Results show that SW achieves higher accuracy than the HMM, and also maintains higher classification accuracy with smaller training-set sizes. We also confirm the robustness of the SW approach through evaluation with sequences containing artificially introduced noise.
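Illustrative sketch (not from the paper): the summary above describes adapting Smith-Waterman local alignment from symbolic sequences to two-dimensional spatial data via dynamic programming. The Python sketch below shows one way such an adaptation can look, treating a pair of (x, y) points as a match when they fall within a distance threshold and as a mismatch otherwise, with a linear gap penalty to absorb tracker dropouts; the function name, threshold and scoring values are illustrative assumptions, not the authors' parameters.

    import numpy as np

    def sw_spatial_similarity(seq_a, seq_b, dist_thresh=1.0,
                              match=2.0, mismatch=-1.0, gap=-1.0):
        """Smith-Waterman local alignment score for two 2-D point sequences.

        seq_a, seq_b : sequences of (x, y) coordinates.
        Points closer than dist_thresh count as a match; the gap penalty
        absorbs missing or spurious points (e.g. tracking errors).
        All default parameter values are illustrative assumptions.
        Returns the maximum local alignment score (higher = more similar).
        """
        a, b = np.asarray(seq_a, float), np.asarray(seq_b, float)
        n, m = len(a), len(b)
        H = np.zeros((n + 1, m + 1))       # DP matrix; first row/column stay 0
        best = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(a[i - 1] - b[j - 1])
                s = match if d < dist_thresh else mismatch
                H[i, j] = max(0.0,                   # local alignment floor
                              H[i - 1, j - 1] + s,   # align point i with point j
                              H[i - 1, j] + gap,     # gap in sequence b
                              H[i, j - 1] + gap)     # gap in sequence a
                best = max(best, H[i, j])
        return best

    # Toy usage: a noisy copy of a template trajectory embedded in extraneous points
    template = [(0, 0), (1, 0), (2, 1), (3, 2)]
    observed = [(9, 9), (0.1, 0.0), (1.0, 0.2), (2.1, 1.1), (3.0, 2.2), (8, 8)]
    print(sw_spatial_similarity(template, observed))

Under these assumptions, a query trajectory could be classified by assigning it the label of the training sequence with the highest local alignment score, in a nearest-neighbour fashion; the authors' actual scoring scheme and decision rule are given in the full text of the paper.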
Notes This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.
ISBN 0769526888
9780769526881
Language eng
Field of Research 089999 Information and Computing Sciences not elsewhere classified
Socio Economic Objective 970108 Expanding Knowledge in the Information and Computing Sciences
HERDC Research category E1.1 Full written paper - refereed
Copyright notice ©2006, IEEE
Persistent URL http://hdl.handle.net/10536/DRO/DU:30044606

Document type: Conference Paper
Collections: School of Information Technology
Open Access Collection
Unless expressly stated otherwise, the copyright for items in DRO is owned by the author, with all rights reserved.

Every reasonable effort has been made to ensure that permission has been obtained for items included in DRO. If you believe that your rights have been infringed by this repository, please contact drosupport@deakin.edu.au.

