Deakin University
A Fast Non-Smooth Nonnegative Matrix Factorization for Learning Sparse Representation

journal contribution
posted on 2024-06-06, 00:15, authored by Z Yang, Y Zhang, W Yan, Yong Xiang, S Xie
Nonnegative matrix factorization (NMF) is an active topic in machine learning and data processing. Recently, a constrained variant, non-smooth NMF (NsNMF), has shown great potential for learning meaningful sparse representations of observed data. However, it suffers from a slow, linear convergence rate, which discourages its application to large-scale data representation. In this paper, a fast NsNMF (FNsNMF) algorithm is proposed to speed up NsNMF. We first show that the cost function of the derived subproblem is convex and that its gradient is Lipschitz continuous. The optimization of this function is then replaced by minimizing a proximal function, which is designed using the Lipschitz constant and solved via a constructed fast-convergent sequence. Thanks to the proximal function and its efficient optimization, our method achieves a nonlinear convergence rate, much faster than that of NsNMF. Simulations on both synthetic and real-world data show the advantages of our algorithm over the compared methods.
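The acceleration idea the abstract describes — a convex subproblem with a Lipschitz-continuous gradient, solved through a proximal step driven by the Lipschitz constant and a fast-convergent (Nesterov-style) sequence — can be sketched for a plain NMF subproblem as follows. This is an illustrative sketch only, not the authors' FNsNMF code: the non-smooth smoothing matrix of NsNMF is omitted, and the function name and loop count are our own choices.

```python
import numpy as np

def accel_nls_update(V, W, H, n_iter=50):
    """Accelerated projected-gradient solve of the NMF subproblem
        min_{W >= 0} 0.5 * ||V - W @ H||_F^2,
    whose gradient in W is Lipschitz continuous with constant
    L = ||H @ H.T||_2 (the spectral norm).

    NOTE: hypothetical sketch of the proximal/Nesterov scheme the
    paper outlines; the prox of the nonnegativity constraint reduces
    to projection onto W >= 0.
    """
    HHt = H @ H.T
    VHt = V @ H.T
    L = np.linalg.norm(HHt, 2)        # Lipschitz constant of the gradient
    Y, t = W.copy(), 1.0              # extrapolation point and step sequence
    for _ in range(n_iter):
        grad = Y @ HHt - VHt          # gradient of the quadratic cost at Y
        W_new = np.maximum(Y - grad / L, 0.0)          # proximal (projection) step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))  # fast-convergent sequence
        Y = W_new + ((t - 1.0) / t_new) * (W_new - W)  # Nesterov extrapolation
        W, t = W_new, t_new
    return W
```

With this sequence the objective decreases at the accelerated O(1/k^2) rate, versus O(1/k) for plain projected gradient descent — the "nonlinear vs. linear" speed-up the abstract refers to.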

History

Journal

IEEE Access

Volume

4

Pagination

5161-5168

Location

Piscataway, N.J.

Open access

Yes

ISSN

2169-3536

eISSN

2169-3536

Language

English

Publication classification

C Journal article, C1 Refereed article in a scholarly journal

Copyright notice

2016, IEEE

Issue

99

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)