File(s) under permanent embargo
Semantic body parts segmentation for quadrupedal animals
Version 2 2024-06-04, 02:19
Version 1 2017-04-30, 18:20
conference contribution
posted on 2024-06-04, 02:19 authored by H Haggag, A Abobakr, M Hossny, S Nahavandi

© 2016 IEEE. Although marker-less human pose estimation and tracking is important in various systems, many modern applications instead need to detect animals while they perform a given task. These applications span multiple disciplines, including robotics, computer vision, safety, and animal healthcare. The advent of affordable RGB-D sensors such as the Microsoft Kinect, and their successful application to tracking and recognition, has made this area of research more active. In this paper, a data synthesis approach for generating a realistic and highly varied animal corpus is presented. The generated dataset is used to train a machine learning model to semantically segment animal body parts. In the proposed framework, foreground extraction is applied to segment the animal, dense representations are obtained with a depth-comparison feature extractor, and these are used to train a supervised random decision forest. Accurate pixel-wise classification of the parts enables accurate joint localization and hence pose estimation. Our approach achieves a classification accuracy of 93% in identifying the different body parts of an animal from RGB-D images.
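The classification stage the abstract describes — depth-comparison features fed to a supervised random decision forest — can be illustrated with a minimal, hypothetical Python sketch. This is not the authors' code: scikit-learn's `RandomForestClassifier` stands in for their random decision forest, a toy two-region depth image replaces the synthesized animal corpus, and the feature definition follows the well-known Kinect-style form f(x) = d(x + u/d(x)) − d(x + v/d(x)), which the abstract's "depth comparison feature extractor" appears to refer to.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def depth_comparison_features(depth, pixels, offsets):
    """Depth-comparison features: f(x) = d(x + u/d(x)) - d(x + v/d(x)).
    Each offset pair (u, v) is normalised by the depth at x, so the
    response is roughly invariant to the subject's distance from the camera."""
    h, w = depth.shape
    feats = np.empty((len(pixels), len(offsets)), dtype=np.float32)
    for i, (y, x) in enumerate(pixels):
        d0 = depth[y, x]
        for j, (u, v) in enumerate(offsets):
            # Probe two displaced pixels; clamp probes that fall off the image.
            py = int(np.clip(y + u[0] / d0, 0, h - 1))
            px = int(np.clip(x + u[1] / d0, 0, w - 1))
            qy = int(np.clip(y + v[0] / d0, 0, h - 1))
            qx = int(np.clip(x + v[1] / d0, 0, w - 1))
            feats[i, j] = depth[py, px] - depth[qy, qx]
    return feats

# Toy scene: depth increases smoothly down the image; the "upper body part"
# is class 0 and the "lower body part" class 1 (purely illustrative labels).
depth = np.tile(1.0 + np.arange(64, dtype=np.float32)[:, None] / 32.0, (1, 64))
pixels = [(y, x) for y in range(4, 60, 4) for x in range(4, 60, 4)]
labels = np.array([0 if y < 32 else 1 for y, _ in pixels])

# Random offset pairs, fixed seed for reproducibility.
rng = np.random.default_rng(0)
offsets = [(rng.uniform(-40, 40, 2), rng.uniform(-40, 40, 2)) for _ in range(16)]

X = depth_comparison_features(depth, pixels, offsets)
forest = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, labels)
print(f"training accuracy: {forest.score(X, labels):.2f}")
```

Per-pixel predictions from such a forest give the dense, pixel-wise part labels the paper builds on; joint localization would then follow from the labeled regions.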
History
Pagination: 855-860
Location: Budapest, Hungary
Publisher DOI:
Start date: 2016-10-09
End date: 2016-10-12
ISBN-13: 9781509018970
Language: eng
Publication classification: E1 Full written paper - refereed
Copyright notice: 2016, IEEE
Title of proceedings: SMC 2016 : IEEE International Conference on Systems, Man and Cybernetics
Event: Systems, Man and Cybernetics. International Conference (2016 : Budapest, Hungary)
Publisher: IEEE
Place of publication: Piscataway, N.J.