Stereo image matching using wavelet scale-space representation
Bhatti, Asim and Nahavandi, Saeid 2006, Stereo image matching using wavelet scale-space representation, in CGIV 2006 : Proceedings of the 2006 International Conference on Computer Graphics, Imaging and Visualisation, IEEE Computer Society, Los Alamitos, Calif., pp. 267-272.
Conference proceedings title: CGIV 2006 : Proceedings of the 2006 International Conference on Computer Graphics, Imaging and Visualisation
Conference: Computer Graphics, Imaging and Visualisation Conference
Publisher: IEEE Computer Society
Place of publication: Los Alamitos, Calif.
A multi-resolution technique for matching a stereo pair of images, based on the translation-invariant discrete multiwavelet transform, is presented. The technique follows the well-known coarse-to-fine strategy: matching points are computed at the coarsest decomposition level and then refined successively up to the finest level. Vector coefficients of the wavelet transform modulus serve as matching features, where the modulus maxima define shift-invariant high-level features (multiscale edges) and the phase points along the normal of the feature surface. The technique estimates optimal corresponding points and the resulting 2D disparity maps. Illumination variation between the perspective views of the same scene is controlled by scale normalization at each decomposition level, dividing the detail-space coefficients by the approximation-space coefficients, followed by normalized correlation matching. Ambiguity is addressed explicitly, and occlusion implicitly, through a geometric-topological refinement procedure and symbolic tagging.
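The paper itself is not reproduced in this record, but the abstract already outlines the main ingredients. As a rough illustration only, the Python sketch below combines an undecimated (a trous) decomposition, with a simple Haar-like filter pair standing in for the multiwavelet of the paper, the scale-normalized modulus/phase features, and normalized-correlation matching along scanlines with a coarse-to-fine prior. The function names (atrous_haar, modulus_phase, disparity_row) are illustrative, not from the paper, and the geometric-topological refinement and symbolic tagging steps are omitted.

import numpy as np

def atrous_haar(img, levels):
    """Undecimated (a trous) Haar-like decomposition.

    Dilating the filters by 2**level instead of downsampling keeps the
    representation translation invariant. Returns one
    (approx, detail_x, detail_y) triple per level, finest first.
    Note: np.roll wraps at the image borders; a padded convolution
    would be used in practice.
    """
    approx = img.astype(float)
    pyramid = []
    for lev in range(levels):
        s = 2 ** lev                                  # filter dilation at this level
        sm = 0.25 * (np.roll(approx, s, 1) + 2.0 * approx + np.roll(approx, -s, 1))
        sm = 0.25 * (np.roll(sm, s, 0) + 2.0 * sm + np.roll(sm, -s, 0))
        dx = np.roll(approx, -s, 1) - approx          # horizontal detail
        dy = np.roll(approx, -s, 0) - approx          # vertical detail
        pyramid.append((sm, dx, dy))
        approx = sm
    return pyramid

def modulus_phase(level):
    """Scale-normalized modulus and phase of one decomposition level.

    Dividing the detail coefficients by the approximation coefficients
    suppresses illumination differences between the two views; the phase
    points along the normal of the local feature (multiscale edge).
    """
    approx, dx, dy = level
    modulus = np.hypot(dx, dy) / (np.abs(approx) + 1e-6)
    phase = np.arctan2(dy, dx)
    return modulus, phase

def ncc(a, b):
    """Normalized correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12))

def disparity_row(fl, fr, y, max_d, win=3, prior=None, radius=2):
    """Best disparity per pixel on scanline y by normalized correlation.

    fl, fr are feature images (e.g. the normalized modulus) of the left
    and right views; y must lie at least `win` pixels from the borders.
    When `prior` holds the disparities found at the coarser level, the
    search is restricted to prior[x] +/- radius (the coarse-to-fine step).
    """
    w = fl.shape[1]
    out = np.zeros(w, dtype=int)
    for x in range(win + max_d, w - win):
        lo = 0 if prior is None else max(0, prior[x] - radius)
        hi = max_d if prior is None else min(max_d, prior[x] + radius)
        ref = fl[y - win:y + win + 1, x - win:x + win + 1]
        best_score, best_d = -2.0, 0
        for d in range(lo, hi + 1):
            cand = fr[y - win:y + win + 1, x - d - win:x - d + win + 1]
            score = ncc(ref, cand)
            if score > best_score:
                best_score, best_d = score, d
        out[x] = best_d
    return out

In a coarse-to-fine run one would start from the coarsest level (the last pyramid entry) with prior=None, then feed each level's disparities as the prior for the next finer level, so the finest-level estimates are only a local refinement of the coarse ones.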