Continual Multiview Task Learning via Deep Matrix Factorization.
Journal
IEEE Transactions on Neural Networks and Learning Systems
ISSN: 2162-2388
Abbreviated title: IEEE Trans Neural Netw Learn Syst
Country: United States
NLM ID: 101616214
Publication information
Publication date: January 2021
History:
pubmed: 17 March 2020
medline: 29 January 2022
entrez: 17 March 2020
Status: ppublish
Abstract
State-of-the-art multitask multiview (MTMV) learning tackles scenarios in which multiple tasks are related to each other via multiple shared feature views. However, in many real-world settings multiview tasks arrive sequentially, and the storage requirements and computational cost of retraining MTMV models on previous tasks pose a formidable challenge for this lifelong learning scenario. To address this challenge, in this article, we propose a new continual multiview task learning model that integrates deep matrix factorization and sparse subspace learning in a unified framework, termed deep continual multiview task learning (DCMvTL). More specifically, when a new multiview task arrives, DCMvTL first adopts a deep matrix factorization technique to capture hidden, hierarchical representations of the new task while accumulating the fresh multiview knowledge in a layerwise manner. A sparse subspace learning model is then applied to the extracted factors at each layer, revealing cross-view correlations via a self-expressive constraint. For model optimization, we derive a general multiview learning formulation for each newly arriving multiview task and apply an alternating minimization strategy to achieve lifelong learning. Extensive experiments on benchmark data sets demonstrate the effectiveness of the proposed DCMvTL model compared with existing state-of-the-art MTMV and lifelong multiview task learning models.
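To make the two building blocks named in the abstract concrete, the sketch below shows a single-task, single-view simplification: alternating-minimization deep matrix factorization X ≈ Z1 Z2 ... Zm H, followed by a self-expressive fit H ≈ HC on the deepest factor. Everything here is an assumption for illustration, not the authors' DCMvTL formulation: the function name is hypothetical, a ridge (Frobenius) penalty stands in for the sparse l1 self-expressive term, and the layerwise knowledge accumulation across tasks and the cross-view coupling of the actual model are omitted.

```python
import numpy as np

def deep_mf_self_expressive(X, layer_sizes, lam=0.1, iters=50, seed=0):
    """Minimal sketch (assumed, simplified): deep matrix factorization
    X ~= Z1 @ ... @ Zm @ H via alternating least squares, then a
    ridge-regularized self-expressive fit H ~= H @ C with diag(C) = 0."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    dims = [d] + list(layer_sizes)
    # Random initialization of the layer factors Z_i and the deepest code H.
    Zs = [rng.standard_normal((dims[i], dims[i + 1])) * 0.1
          for i in range(len(layer_sizes))]
    H = rng.standard_normal((dims[-1], n)) * 0.1
    for _ in range(iters):
        # Update each Z_i with all other factors held fixed.
        for i in range(len(Zs)):
            left = np.eye(d)
            for Z in Zs[:i]:            # product Z_1 ... Z_{i-1}
                left = left @ Z
            right = H
            for Z in Zs[i + 1:][::-1]:  # product Z_{i+1} ... Z_m @ H
                right = Z @ right
            # Minimum-norm solution of min_Zi ||X - left @ Zi @ right||_F^2.
            Zs[i] = np.linalg.pinv(left) @ X @ np.linalg.pinv(right)
        # Update H with all Z_i fixed.
        P = np.eye(d)
        for Z in Zs:
            P = P @ Z
        H = np.linalg.pinv(P) @ X
    # Self-expressive coefficients on the deepest factor:
    # min_C ||H - H @ C||_F^2 + lam * ||C||_F^2, a ridge stand-in for
    # the sparse (l1) self-expressive objective; closed form below.
    G = H.T @ H
    C = np.linalg.solve(G + lam * np.eye(n), G)
    np.fill_diagonal(C, 0.0)  # crude enforcement of the no-self-loop rule
    return Zs, H, C
```

As a usage example, `deep_mf_self_expressive(X, [30, 10])` on a d-by-n data matrix X yields two layer factors, a 10-dimensional deepest code H, and an n-by-n coefficient matrix C whose columns express each sample as a combination of the others. The block-coordinate structure mirrors the alternating minimization strategy the abstract describes; the continual aspect of DCMvTL (reusing accumulated layerwise knowledge when a new task arrives) would replace the random initialization of the Z_i.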
Identifiers
pubmed: 32175877
doi: 10.1109/TNNLS.2020.2977497
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM