Please use this identifier to cite or link to this item: http://localhost/handle/Hannan/235883
Title: Nonnegative Discriminant Matrix Factorization
Authors: Yuwu Lu;Zhihui Lai;Yong Xu;Xuelong Li;David Zhang;Chun Yuan
Year: 2017
Publisher: IEEE
Abstract: Nonnegative matrix factorization (NMF), which aims at obtaining a nonnegative low-dimensional representation of data, has received wide attention. To obtain more effective nonnegative discriminant bases than the original NMF, this paper proposes a novel method called nonnegative discriminant matrix factorization (NDMF) for image classification. NDMF integrates the nonnegative constraint, orthogonality, and discriminant information into the objective function. NDMF incorporates the incoherent information of both factors in standard NMF to enhance the discriminant ability of the learned base matrix, and it projects the low-dimensional representation onto the subspace of the base matrix to regularize NMF for discriminant subspace learning. Based on the Euclidean distance metric and the generalized Kullback-Leibler (KL) divergence, two kinds of iterative algorithms are presented to solve the optimization problem. The between- and within-class scatter matrices are divided into positive and negative parts for the update rules, and proofs of convergence are also presented. Extensive experimental results demonstrate the effectiveness of the proposed method in comparison with state-of-the-art discriminant NMF algorithms.
URI: http://localhost/handle/Hannan/235883
Volume: 27
Issue: 7
Pages: 1392-1405
Appears in Collections: 2017

Files in This Item:
File: 7428887.pdf    Size: 1.74 MB    Format: Adobe PDF
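For context on the Euclidean-metric iterative algorithm described in the abstract, the baseline that NDMF extends is standard NMF with Lee-Seung multiplicative updates. The sketch below, in NumPy, shows only that baseline; the paper's discriminant scatter terms, orthogonality constraint, and incoherence regularization are not included, and the function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-9, seed=0):
    """Baseline NMF via multiplicative updates (Euclidean loss).

    Factorizes a nonnegative matrix V (m x n) as W @ H, with base
    matrix W (m x r) and coefficient matrix H (r x n) kept nonnegative.
    This is the standard scheme that discriminant variants such as
    NDMF build on; the discriminant terms themselves are omitted here.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Multiplicative updates preserve nonnegativity because every
        # factor in the ratio is elementwise nonnegative.
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update coefficients
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update bases
    return W, H
```

The abstract's note about splitting the scatter matrices into positive and negative parts serves exactly this structure: it keeps every term in the multiplicative ratio nonnegative so the updates remain well defined.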