IEEE Trans Image Process. 2015 Sep;24(9):2760-71. doi: 10.1109/TIP.2015.2425545.

Learning a Nonnegative Sparse Graph for Linear Regression.

IEEE Transactions on Image Processing: a publication of the IEEE Signal Processing Society

Xiaozhao Fang, Yong Xu, Xuelong Li, Zhihui Lai, Wai Keung Wong

PMID: 25910093 DOI: 10.1109/TIP.2015.2425545

Abstract

Previous graph-based semisupervised learning (G-SSL) methods have the following drawbacks: 1) they usually predefine the graph structure and then use it to perform label prediction, which cannot guarantee an overall optimum, and 2) they focus only on label prediction or on graph structure construction and cannot handle new samples. To address these issues, a novel nonnegative sparse graph (NNSG) learning method is first proposed. Then, both label prediction and projection learning are integrated into linear regression. Finally, linear regression and graph structure learning are unified within the same framework to overcome these two drawbacks. The resulting method, learning an NNSG for linear regression, performs linear regression and graph learning simultaneously to guarantee an overall optimum. During learning, label information is accurately propagated via the graph structure, so the linear regression can learn a discriminative projection that better fits the sample labels and accurately classifies new samples. An effective algorithm with fast convergence is designed to solve the corresponding optimization problem. Furthermore, NNSG provides a unified perspective on a number of graph-based learning methods and linear regression methods. Experimental results show that NNSG achieves very high classification accuracy and greatly outperforms conventional G-SSL methods, especially some conventional graph construction methods.
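
The following is a minimal, simplified sketch of the ideas described in the abstract, not the authors' actual NNSG algorithm or its joint objective: it (1) builds a nonnegative sparse graph by reconstructing each sample from the others with nonnegative least squares, (2) propagates the known labels over that graph, and (3) fits a ridge-style linear projection to the propagated labels so new samples can be classified. All parameter names and values (alpha, gamma) are illustrative assumptions; the paper's method optimizes the graph and the regression jointly rather than in separate stages.

import numpy as np
from scipy.optimize import nnls

def build_nonnegative_sparse_graph(X):
    """X: (d, n) data matrix, columns are samples. Returns (n, n) edge weights."""
    d, n = X.shape
    W = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        # Nonnegative least squares tends to yield sparse reconstruction weights.
        w, _ = nnls(X[:, others], X[:, i])
        W[others, i] = w
    return W

def propagate_labels(W, Y, alpha=0.9):
    """Standard graph label propagation: F = (I - alpha * S_norm)^(-1) Y.
    Y: (n, c) one-hot label matrix with zero rows for unlabeled samples."""
    S = 0.5 * (W + W.T)                       # symmetrize the learned graph
    deg = np.maximum(S.sum(axis=1), 1e-12)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # symmetric normalization
    S_norm = D_inv_sqrt @ S @ D_inv_sqrt
    n = S.shape[0]
    return np.linalg.solve(np.eye(n) - alpha * S_norm, Y)

def fit_projection(X, F, gamma=1e-2):
    """Ridge-regression projection P so that X.T @ P approximates the soft labels F."""
    d = X.shape[0]
    return np.linalg.solve(X @ X.T + gamma * np.eye(d), X @ F)

# Usage sketch (hypothetical data): columns 0..l-1 of X are labeled, the rest unlabeled.
# W = build_nonnegative_sparse_graph(X)
# F = propagate_labels(W, Y)                  # soft labels for all samples
# P = fit_projection(X, F)                    # projection used for new samples
# y_new = np.argmax(X_new.T @ P, axis=1)      # classify previously unseen samples

In this staged sketch the graph is fixed before propagation and regression; the contribution described in the abstract is precisely to couple these steps in one optimization so that the graph, the propagated labels, and the projection inform each other.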

Publication Types