A fast orientation estimation approach of natural images
Cao, Zhiqiang, Liu, Xilong, Gu, Nong, Nahavandi, Saeid, Xu, De, Zhou, Chao and Tan, Min 2016, A fast orientation estimation approach of natural images, IEEE transactions on systems, man and cybernetics: systems, vol. 46, no. 11, pp. 1589-1597, doi: 10.1109/TSMC.2015.2497253.
This correspondence paper proposes a fast orientation estimation approach for natural images that does not rely on semantic information. Unlike traditional low-level features, our low-level features are extracted in a manner inspired by the biological simple cells of the visual cortex. Two approximated receptive fields that mimic these cells are presented, and a local rotation operator is introduced to determine the optimal output and local orientation at each image position; these serve as the low-level features employed in this paper. To generate the low-level features, a bisection method is applied to the first derivative of the receptive-field model. Moreover, a feature screener is introduced to eliminate useless low-level features, which speeds up processing. After all the valuable low-level features are combined, the overall image orientation is estimated. The proposed approach possesses several properties suitable for real-time applications. First, it avoids the tedious training procedure of some conventional methods. Second, no specific reference such as the horizon is assumed, and no a priori knowledge of the image is required. The approach achieves real-time orientation estimation of natural images using only low-level features, with satisfactory resolution. Its effectiveness is verified on real images with complex scenes and strong noise.
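The abstract's key numerical step, applying bisection to the first derivative of the receptive-field response to locate the optimal local orientation, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the response model `response(theta)` and the seed-interval scan are assumptions introduced here for clarity.

```python
import numpy as np

def bisect_derivative(df, lo, hi, tol=1e-6, max_iter=100):
    """Find a root of df on [lo, hi] by bisection.

    df(lo) and df(hi) are expected to differ in sign; a root of the
    derivative marks an extremum of the underlying response.
    """
    f_lo = df(lo)
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        f_mid = df(mid)
        if abs(f_mid) < tol or (hi - lo) < tol:
            return mid
        if f_lo * f_mid < 0:
            hi = mid
        else:
            lo, f_lo = mid, f_mid
    return 0.5 * (lo + hi)

def optimal_orientation(response, d_response, n_seed=8):
    """Estimate the orientation in [0, pi) maximizing a receptive-field
    response, given the response and its first derivative in theta.

    Scans n_seed coarse intervals, bisects wherever the derivative
    changes sign, and returns the candidate with the largest response.
    """
    thetas = np.linspace(0.0, np.pi, n_seed + 1)
    candidates = list(thetas)  # include interval endpoints as fallbacks
    for a, b in zip(thetas[:-1], thetas[1:]):
        if d_response(a) * d_response(b) < 0:
            candidates.append(bisect_derivative(d_response, a, b))
    return max(candidates, key=response)

if __name__ == "__main__":
    # Toy response peaked at theta = 0.7 rad (a stand-in for the
    # receptive-field model, which the paper defines differently).
    R = lambda t: np.cos(2.0 * (t - 0.7))
    dR = lambda t: -2.0 * np.sin(2.0 * (t - 0.7))
    print(optimal_orientation(R, dR))
```

Bisection needs only the sign of the derivative at the interval ends, so it avoids an exhaustive scan over finely quantized orientations while still converging to the extremum at a controllable tolerance.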