Multi-Task Learning for Dense Prediction Tasks: A Survey.
Journal
IEEE Transactions on Pattern Analysis and Machine Intelligence
ISSN: 1939-3539
Abbreviated title: IEEE Trans Pattern Anal Mach Intell
Country: United States
NLM ID: 9885960
Publication information
Publication date: Jul 2022
History:
pubmed: 2021-01-27
medline: 2021-01-27
entrez: 2021-01-26
Status: ppublish
Abstract
With the advent of deep learning, many dense prediction tasks, i.e., tasks that produce pixel-level predictions, have seen significant performance improvements. The typical approach is to learn these tasks in isolation, that is, a separate neural network is trained for each individual task. Yet, recent multi-task learning (MTL) techniques have shown promising results with respect to performance, computation, and/or memory footprint, by jointly tackling multiple tasks through a learned shared representation. In this survey, we provide a well-rounded view of state-of-the-art deep learning approaches for MTL in computer vision, with an explicit emphasis on dense prediction tasks. Our contributions are as follows. First, we consider MTL from a network architecture point of view. We include an extensive overview and discuss the advantages/disadvantages of recent popular MTL models. Second, we examine various optimization methods to tackle the joint learning of multiple tasks. We summarize the qualitative elements of these works and explore their commonalities and differences. Finally, we provide an extensive experimental evaluation across a variety of dense prediction benchmarks to examine the pros and cons of the different methods, including both architectural and optimization-based strategies.
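The "learned shared representation" the abstract refers to is most commonly realized as hard parameter sharing: one encoder shared by all tasks, with a lightweight task-specific head per dense prediction task. Below is a minimal PyTorch sketch of that idea; the layer sizes, the choice of tasks (segmentation and depth), and the class/module names are illustrative assumptions, not taken from the survey.

```python
import torch
import torch.nn as nn

class HardSharingMTL(nn.Module):
    """Minimal hard-parameter-sharing model: one shared encoder,
    one task-specific 1x1-conv head per dense prediction task.
    (Illustrative sketch; sizes and tasks are assumptions.)"""

    def __init__(self, num_seg_classes: int = 21):
        super().__init__()
        # Shared representation, learned jointly by all tasks
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Task-specific heads producing pixel-level predictions
        self.seg_head = nn.Conv2d(32, num_seg_classes, kernel_size=1)  # semantic segmentation
        self.depth_head = nn.Conv2d(32, 1, kernel_size=1)              # monocular depth

    def forward(self, x: torch.Tensor) -> dict:
        z = self.encoder(x)  # shared features, same spatial size as input
        return {"seg": self.seg_head(z), "depth": self.depth_head(z)}

model = HardSharingMTL()
out = model(torch.randn(2, 3, 64, 64))
# out["seg"] has shape (2, 21, 64, 64); out["depth"] has shape (2, 1, 64, 64)
```

A single backward pass through a (possibly weighted) sum of the per-task losses then updates the shared encoder with gradients from every task, which is the source of both the efficiency gains and the task-interference issues the survey analyzes.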
Identifiers
pubmed: 33497328
doi: 10.1109/TPAMI.2021.3054719
Publication types
Journal Article
Languages
eng
Citation subsets
IM