Self-organizing maps on "what-where" codes towards fully unsupervised classification.
Keywords: Biologically-inspired models; Self-organizing maps; Unsupervised classification; Visual pattern recognition; What-Where codes
Journal
Biological Cybernetics
ISSN: 1432-0770
Abbreviated title: Biol Cybern
Country: Germany
NLM ID: 7502533
Publication information
Publication date: 2023-06
History:
Received: 2021-09-15
Accepted: 2023-04-14
MEDLINE: 2023-06-13
PubMed: 2023-05-16
Entrez: 2023-05-15
Status: ppublish
Abstract
Interest in unsupervised learning architectures has been rising. Depending on large labeled data sets to obtain a well-performing classification system is costly, besides being biologically unnatural. Therefore, both the deep learning community and the community working on more biologically-inspired models have focused on proposing unsupervised techniques that produce adequate hidden representations, which can then be fed to a simpler supervised classifier. Despite the great success of this approach, an ultimate dependence on a supervised model remains, which forces the number of classes to be known beforehand and makes the system depend on labels to extract concepts. To overcome this limitation, recent work has shown how a self-organizing map (SOM) can be used as a completely unsupervised classifier. However, to succeed, that approach required deep learning techniques to generate high-quality embeddings. The purpose of this work is to show that our previously proposed What-Where encoder can be used in tandem with the SOM to obtain an end-to-end unsupervised system that is Hebbian. Such a system requires no labels to train, nor does it require knowledge of which classes exist beforehand; it can be trained online and adapt to new classes that may emerge. As in the original work, we use the MNIST data set to run an experimental analysis and verify that the system achieves accuracies similar to the best reported thus far. Furthermore, we extend the analysis to the more difficult Fashion-MNIST problem and conclude that the system still performs well.
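This record does not include the authors' implementation; the sketch below is a minimal, generic Kohonen SOM in Python (NumPy), intended only to illustrate the kind of online, label-free training the abstract describes. The grid size, learning-rate and neighborhood schedules, and the random placeholder inputs are all assumptions; the paper's What-Where encoder, which would supply the actual input codes, is not reproduced here.

```python
import numpy as np

class SOM:
    """Minimal 2-D Kohonen self-organizing map (illustrative sketch)."""

    def __init__(self, grid=(10, 10), dim=784, lr=0.5, sigma=3.0, seed=0):
        rng = np.random.default_rng(seed)
        # One weight vector per map unit, initialized at random.
        self.w = rng.random((grid[0] * grid[1], dim))
        # Grid coordinates of each unit, used by the neighborhood function.
        self.coords = np.array(
            [(i, j) for i in range(grid[0]) for j in range(grid[1])], dtype=float
        )
        self.lr0, self.sigma0 = lr, sigma

    def bmu(self, x):
        """Index of the best-matching unit (closest weight vector) for x."""
        return int(np.argmin(np.linalg.norm(self.w - x, axis=1)))

    def update(self, x, t, t_max):
        """One online update: pull units near the BMU toward input x."""
        decay = np.exp(-t / t_max)            # anneal rate and radius over time
        lr, sigma = self.lr0 * decay, self.sigma0 * decay
        b = self.bmu(x)
        d2 = np.sum((self.coords - self.coords[b]) ** 2, axis=1)
        h = np.exp(-d2 / (2.0 * sigma**2))    # Gaussian neighborhood around BMU
        self.w += lr * h[:, None] * (x - self.w)


# Placeholder inputs standing in for What-Where codes (assumed 784-dimensional).
X = np.random.default_rng(1).random((2000, 784))
som = SOM()
for t, x in enumerate(X):
    som.update(x, t, t_max=len(X))

# With no labels, each input is simply assigned to its best-matching unit;
# the units play the role of emergent, unnamed classes.
clusters = np.array([som.bmu(x) for x in X])
```

A common way to score such a system, as in the MNIST experiments the abstract mentions, is to attach to each unit the majority label of the inputs mapped to it, with labels used for evaluation only and never for training.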
Identifiers
pubmed: 37188974
doi: 10.1007/s00422-023-00963-y
pii: 10.1007/s00422-023-00963-y
pmc: PMC10258173
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination
211-220
Copyright information
© 2023. The Author(s).