Deakin University

File(s) under permanent embargo

Riemannian optimization with subspace tracking for low-rank recovery

conference contribution
posted on 2016-10-31, 00:00 authored by Q Li, W Niu, Gang Li, J Tan, G Xiong, L Guo
Low-rank matrix recovery (MR) has been widely used in data analysis and dimensionality reduction. As a common approach to MR, convex relaxation is usually slowed by repeated calls to singular value decomposition (SVD), especially in large-scale applications. In this paper, we propose a novel Riemannian optimization method (ROAM) for the MR problem that exploits the Riemannian geometry of the search space. In particular, ROAM uses an efficient subspace tracking scheme that automatically detects the unknown rank in order to identify a preferable geometric space. Moreover, a gradient-based optimization algorithm is proposed to obtain the latent low-rank component, avoiding an expensive full-dimensional SVD. More significantly, the ROAM algorithm is proven to converge under mild assumptions, which further supports its effectiveness. Extensive empirical results demonstrate the improved accuracy and efficiency of ROAM over convex-relaxation approaches.
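The abstract describes a gradient-based route to low-rank recovery that sidesteps repeated full SVDs. The paper's ROAM algorithm is not reproduced here; as a purely illustrative aside, the sketch below shows a generic projected-gradient scheme for low-rank matrix completion (a gradient step on the observed entries followed by a truncated-SVD retraction onto the rank-k set). The function name lowrank_complete, the SVP-style step size, and the synthetic test are assumptions for illustration only, and the retraction uses a plain truncated SVD rather than the subspace-tracking scheme described in the abstract.

```python
import numpy as np

def lowrank_complete(M_obs, mask, rank, iters=500):
    """Illustrative sketch (not the paper's ROAM): recover a low-rank matrix
    from observed entries by a gradient step on the data-fit term followed by
    a rank-`rank` truncated-SVD retraction.

    M_obs : array with observed entries (zeros elsewhere)
    mask  : boolean array marking which entries are observed
    """
    X = np.zeros_like(M_obs, dtype=float)
    step = 1.0 / mask.mean()  # SVP-style step, scaled by the sampling rate
    for _ in range(iters):
        # Euclidean gradient of 0.5 * ||P_Omega(X - M)||_F^2
        G = np.where(mask, X - M_obs, 0.0)
        X = X - step * G
        # Retract back onto the set of rank-`rank` matrices via truncated SVD
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    return X

# Small synthetic test: rank-2 ground truth, roughly 50% of entries observed
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(A.shape) < 0.5
X_hat = lowrank_complete(np.where(mask, A, 0.0), mask, rank=2)
print("relative error:", np.linalg.norm(X_hat - A) / np.linalg.norm(A))
```

In this simplified sketch each iteration still computes a (thin) SVD of the full iterate; the efficiency claims in the abstract come from tracking the low-dimensional subspace instead, which is exactly what this generic version omits.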

History

Event

Neural Networks. IEEE International Joint Conference (2016 : Vancouver, Canada)

Pagination

3280-3287

Publisher

IEEE

Location

Vancouver, Canada

Place of publication

Piscataway, N.J.

Start date

2016-07-24

End date

2016-07-29

ISBN-13

9781509006205

Language

eng

Publication classification

E Conference publication; E1 Full written paper - refereed

Copyright notice

2016, IEEE

Title of proceedings

IJCNN 2016: Proceedings of the IEEE International Joint Conference on Neural Networks