Local motion planning for ground mobile robots via deep imitation learning
Version 2 2024-06-04, 02:21
Version 1 2018-01-01, 00:00
conference contribution
posted on 2024-06-04, 02:21 authored by K Saleh, M Attia, M Hossny, Samer Hanoun, S Salaken, S Nahavandi

© 2018 IEEE. Imitation learning has recently been applied to a number of robotics-related tasks. In this work, a novel approach based on imitation learning is applied to the local motion planning problem for mobile robots. Based solely on a monocular RGB camera and a set of expert demonstrations, our learned model can predict accurate steering-angle actions to be performed by the local motion planner of a mobile robot. To train and validate the model, we collected a large set of labelled RGB images from a monocular camera mounted on a ground mobile robot platform while the robot was driven manually by a human demonstrator. The proposed approach captures the behaviour of the human demonstrator and produces smooth, safe steering trajectories. Quantitatively, the model also achieved a low step-by-step mean absolute error in steering angular velocity of only 0.1 radians/s over the whole testing dataset.
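As a rough illustration of the evaluation metric described in the abstract, the step-by-step mean absolute error (MAE) between predicted and expert-demonstrated steering angular velocities could be computed as below. This is a minimal sketch; the function name and the sample values are hypothetical and are not taken from the paper's dataset.

```python
def steering_mae(predicted, expert):
    """Mean absolute error between predicted and expert steering
    angular velocities (radians/s), averaged step by step.

    Hypothetical helper for illustration; not the paper's code.
    """
    if len(predicted) != len(expert):
        raise ValueError("sequences must have the same length")
    return sum(abs(p - e) for p, e in zip(predicted, expert)) / len(predicted)

# Hypothetical per-step angular velocities (rad/s), not real data.
pred = [0.10, -0.05, 0.22, 0.00]
gt = [0.12, -0.01, 0.20, 0.05]
print(steering_mae(pred, gt))
```

A model matching the paper's reported performance would keep this value at or below roughly 0.1 rad/s across the full test set.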
Location: Miyazaki, Japan
Language: eng
Publication classification: E1 Full written paper - refereed
Copyright notice: 2018, IEEE
Pagination: 4077-4082
Start date: 2018-10-07
End date: 2018-10-10
ISBN-13: 9781538666500
Title of proceedings: SMC 2018 : Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics
Event: Systems, Man, and Cybernetics. International Conference (2018 : Miyazaki, Japan)
Publisher: IEEE
Place of publication: Piscataway, N.J.

