A review of vision-based gait recognition methods for human identification
Wang, Jin; She, Mary; Nahavandi, Saeid and Kouzani, Abbas 2010, A review of vision-based gait recognition methods for human identification, in DICTA 2010 : Proceedings of the Digital Image Computing : Techniques and Applications, IEEE, Piscataway, N.J., pp. 320-327.
DICTA 2010 : Proceedings of the Digital Image Computing : Techniques and Applications
Zhang, Jian; Shen, Chunhua; Geers, Glenn; Wu, Qiang
Australian Pattern Recognition Society Conference
Human identification by gait has attracted a great deal of interest in the computer vision community because it permits inconspicuous recognition at a relatively far distance. This paper provides a comprehensive survey of recent developments in gait recognition approaches. The survey emphasizes three major issues involved in a general gait recognition system, namely gait image representation, feature dimensionality reduction and gait classification. A review of the available public gait datasets is also presented. The concluding discussion outlines a number of research challenges and suggests promising future directions for the field.
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by the authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.
Unless expressly stated otherwise, the copyright for items in Deakin Research Online is owned by the author, with all rights reserved.
Every reasonable effort has been made to ensure that permission has been obtained for items included in DRO.
If you believe that your rights have been infringed by this repository, please contact firstname.lastname@example.org.