Robust Orthogonal Matrix Factorization for Efficient Subspace Learning

Published in Neurocomputing, 2015

Eunwoo Kim and Songhwai Oh, “Robust Orthogonal Matrix Factorization for Efficient Subspace Learning”, Neurocomputing, vol. 167, pp. 218-229, Nov. 2015.

Abstract: Low-rank matrix factorization plays an important role in pattern recognition, computer vision, and machine learning. Recently, a new family of methods, such as l1-norm minimization and robust PCA, has been proposed for low-rank subspace analysis problems and has been shown to be robust against outliers and missing data. However, these methods suffer from heavy computational loads and can fail to find a solution when highly corrupted data are present. In this paper, a robust orthogonal matrix approximation method using fixed-rank factorization is proposed. The proposed method finds a robust solution efficiently by enforcing orthogonality and smoothness constraints. It is also extended to handle rank uncertainty via a rank estimation strategy for practical real-world problems. The method is applied to a number of low-rank matrix approximation problems, and experimental results show that it is highly accurate, fast, and efficient compared to existing methods.
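
As a rough illustration of the fixed-rank, outlier-robust idea (a minimal sketch, not the paper's exact algorithm), the snippet below alternates between a truncated-SVD low-rank step, which keeps the left factor orthonormal, and an l1 soft-thresholding step that absorbs sparse gross corruptions. The function names and parameters (`lam`, `rank`, `n_iters`) are illustrative assumptions, not from the paper.

```python
import numpy as np

def soft_threshold(A, tau):
    """Entrywise soft-thresholding (proximal operator of the l1-norm)."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def robust_fixed_rank_factorization(X, rank, lam=0.5, n_iters=100, tol=1e-6):
    """Alternate between a rank-`rank` factorization U @ V of the cleaned
    data and an l1-sparse outlier term E, so that X ~ U @ V + E.
    U has orthonormal columns (U.T @ U = I) by construction."""
    E = np.zeros_like(X)
    prev = np.inf
    for _ in range(n_iters):
        # Low-rank step: best rank-r fit to X - E via truncated SVD.
        Uf, s, Vt = np.linalg.svd(X - E, full_matrices=False)
        U = Uf[:, :rank]
        V = s[:rank, None] * Vt[:rank]
        # Outlier step: soft-threshold the residual (prox of lam * ||E||_1).
        E = soft_threshold(X - U @ V, lam)
        obj = 0.5 * np.linalg.norm(X - U @ V - E, "fro") ** 2 \
              + lam * np.abs(E).sum()
        if abs(prev - obj) < tol * max(1.0, abs(obj)):
            break
        prev = obj
    return U, V, E

# Usage: recover a rank-3 subspace from data with 5% gross corruptions.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 50))
mask = rng.random(X.shape) < 0.05
X[mask] += 10 * rng.standard_normal(mask.sum())
U, V, E = robust_fixed_rank_factorization(X, rank=3)
```

The l1 penalty on E is what distinguishes this from plain PCA: gross errors are moved into the sparse term rather than distorting the recovered subspace.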

[Paper]