Fast Approximation for Sparse Coding with Applications to Object Recognition.

fast approximation; homotopy; iterative hard thresholding; object recognition; sparse coding

Journal

Sensors (Basel, Switzerland)
ISSN: 1424-8220
Abbreviated title: Sensors (Basel)
Country: Switzerland
NLM ID: 101204366

Publication information

Publication date:
19 Feb 2021
History:
received: 19 Dec 2020
revised: 10 Feb 2021
accepted: 17 Feb 2021
entrez: 6 Mar 2021
pubmed: 7 Mar 2021
medline: 7 Mar 2021
Status: epublish

Abstract

Sparse Coding (SC) has been widely studied and has demonstrated its superiority in the fields of signal processing, statistics, and machine learning. However, due to the high computational cost of the optimization algorithms required to compute the sparse feature, the applicability of SC to real-time object recognition tasks is limited. Many deep neural networks have been constructed to quickly estimate the sparse feature with the help of a large number of training samples, which makes them unsuitable for small-scale datasets. Therefore, this work presents a simple and efficient fast approximation method for SC, in which a special single-hidden-layer neural network (SLNN) is constructed to perform the approximation task, and the optimal sparse features of training samples, exactly computed by a sparse coding algorithm, are used as ground truth to train the SLNN. After training, the proposed SLNN can quickly estimate sparse features for testing samples. Ten benchmark datasets taken from the UCI databases and two face image datasets are used for experiments, and the low root mean square error (RMSE) between the approximated sparse features and the optimal ones verifies the approximation performance of the proposed method. Furthermore, the recognition results demonstrate that the proposed method effectively reduces the computational time of the testing process while maintaining recognition performance, and outperforms several state-of-the-art fast approximation sparse coding methods as well as the exact sparse coding algorithms.
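The pipeline described in the abstract (exact sparse coding on training data, then a single-hidden-layer regressor that imitates it) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy dictionary, random data, ISTA as the "exact" sparse coder, and all network sizes and learning rates are assumptions chosen for a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: random dictionary D (unit-norm atoms) and signals X.
# The paper uses UCI and face-image datasets; these stand-ins are for illustration.
n_features, n_atoms, n_samples = 20, 40, 200
D = rng.standard_normal((n_features, n_atoms))
D /= np.linalg.norm(D, axis=0)
X = rng.standard_normal((n_samples, n_features))

def ista(x, D, lam=0.1, n_iter=200):
    """Sparse-code one signal by ISTA: gradient step on ||Dz - x||^2, then
    soft-thresholding (the l1 proximal operator)."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = z - (D.T @ (D @ z - x)) / L
        z = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return z

# Step 1: "ground-truth" sparse features computed by the exact algorithm.
Z = np.stack([ista(x, D) for x in X])

# Step 2: train a single-hidden-layer network to regress Z from X (plain
# full-batch gradient descent on the squared error).
h, lr = 64, 0.05
W1 = rng.standard_normal((n_features, h)) * 0.1; b1 = np.zeros(h)
W2 = rng.standard_normal((h, n_atoms)) * 0.1;   b2 = np.zeros(n_atoms)
for _ in range(500):
    A = np.tanh(X @ W1 + b1)               # hidden activations
    E = (A @ W2 + b2) - Z                  # prediction error
    dA = (E @ W2.T) * (1 - A ** 2)         # backprop through tanh
    W2 -= lr * (A.T @ E) / n_samples; b2 -= lr * E.mean(axis=0)
    W1 -= lr * (X.T @ dA) / n_samples; b1 -= lr * dA.mean(axis=0)

# At test time, one forward pass replaces the iterative optimization.
rmse = np.sqrt(np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2) - Z) ** 2))
```

The speed-up at test time comes from replacing hundreds of ISTA iterations per sample with a single matrix-vector forward pass; the RMSE against the exact codes is the approximation-quality measure the abstract refers to.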

Identifiers

pubmed: 33669576
pii: s21041442
doi: 10.3390/s21041442
pmc: PMC7923134

Publication types

Journal Article

Languages

eng

Citation subsets

IM

Grants

Agency: National Natural Science Foundation of China (NSFC)
ID: 61873067

Authors

Zhenzhen Sun (Z)

The College of Mathematics and Computer Science, Fuzhou University, Fuzhou 350116, China.

Yuanlong Yu (Y)

The College of Mathematics and Computer Science, Fuzhou University, Fuzhou 350116, China.

MeSH classifications