Pseudo-Supervised Deep Subspace Clustering.


Journal

IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
ISSN: 1941-0042
Abbreviated title: IEEE Trans Image Process
Country: United States
NLM ID: 9886191

Publication information

Publication date:
2021
History:
pubmed: 2021-05-26
medline: 2021-05-26
entrez: 2021-05-25
Status: ppublish

Abstract

Auto-Encoder (AE)-based deep subspace clustering (DSC) methods have achieved impressive performance due to the powerful representations extracted by deep neural networks while prioritizing categorical separability. However, the self-reconstruction loss of an AE ignores rich, useful relational information and can lead to indiscriminative representations, which inevitably degrades clustering performance. It is also challenging to learn high-level similarity without feeding semantic labels. Another unsolved problem facing DSC is the huge memory cost of the n×n similarity matrix, which is incurred by the self-expression layer between the encoder and decoder. To tackle these problems, we use pairwise similarity to weight the reconstruction loss to capture local structure information, while the similarity itself is learned by the self-expression layer. Pseudo-graphs and pseudo-labels, which allow us to benefit from the uncertain knowledge acquired during network training, are further employed to supervise similarity learning. Joint learning and iterative training facilitate obtaining an overall optimal solution. Extensive experiments on benchmark datasets demonstrate the superiority of our approach. By combining it with the k-nearest neighbors algorithm, we further show that our method can address the large-scale and out-of-sample problems. The source code of our method is available at: https://github.com/sckangz/SelfsupervisedSC.
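The self-expression step the abstract relies on can be sketched in plain NumPy. This is a minimal illustration, not the authors' implementation: the closed-form ridge solution for the coefficient matrix, the zeroed diagonal, and the `weighted_recon_loss` form are assumptions made for exposition, standing in for the learned self-expression layer and the similarity-weighted reconstruction loss described above.

```python
import numpy as np

def self_expression(Z, lam=0.1):
    """Approximate a self-expression layer on encoded features Z (n x d).

    Solves min_C ||Z - C Z||_F^2 + lam ||C||_F^2 in closed form:
        C = Z Z^T (Z Z^T + lam I)^{-1}
    (an illustrative linear surrogate for the trainable layer).
    """
    n = Z.shape[0]
    G = Z @ Z.T
    C = G @ np.linalg.inv(G + lam * np.eye(n))
    np.fill_diagonal(C, 0.0)  # heuristic: discourage trivial self-representation
    return C

def similarity(C):
    # Symmetrize the coefficient matrix into a pairwise similarity matrix
    A = np.abs(C)
    return 0.5 * (A + A.T)

def weighted_recon_loss(X, X_hat, S):
    # One plausible reading of "pairwise similarity weighs the reconstruction
    # loss": reconstructions of similar samples are pulled toward each other.
    D = ((X[:, None, :] - X_hat[None, :, :]) ** 2).sum(-1)  # n x n squared distances
    return (S * D).sum() / S.sum()
```

For out-of-sample points, the abstract's kNN extension would amount to assigning a new sample the majority cluster label of its nearest neighbors among the already-clustered training data, avoiding any growth of the n×n matrix.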

Identifiers

pubmed: 34033539
doi: 10.1109/TIP.2021.3079800

Publication types

Journal Article

Languages

eng

Citation subsets

IM

Pagination

5252-5263

Authors

MeSH classifications