Interpretable Temporal Attention Network for COVID-19 forecasting.
Keywords
COVID-19 forecasting
Covariate forecasting
Degraded Teacher Forcing
Multi-task learning
Neural network
Journal
Applied Soft Computing
ISSN: 1568-4946
Abbreviated title: Appl Soft Comput
Country: United States
NLM ID: 101536968
Publication information
Publication date: May 2022
History:
received: 30 Oct 2021
revised: 17 Feb 2022
accepted: 26 Feb 2022
pubmed: 15 Mar 2022
medline: 15 Mar 2022
entrez: 14 Mar 2022
Status: ppublish
Abstract
The worldwide outbreak of coronavirus disease 2019 (COVID-19) has triggered an unprecedented global health and economic crisis. Early and accurate forecasts of COVID-19 and evaluation of government interventions are crucial for governments to take appropriate measures to contain the spread of COVID-19. In this work, we propose the Interpretable Temporal Attention Network (ITANet) for COVID-19 forecasting and for inferring the importance of government interventions. The proposed model has an encoder-decoder architecture and employs long short-term memory (LSTM) for temporal feature extraction and multi-head attention for capturing long-term dependencies. The model simultaneously takes historical information, a priori known future information, and pseudo future information into consideration, where the pseudo future information is learned with a covariate forecasting network (CFN) and multi-task learning (MTL). In addition, we propose the degraded teacher forcing (DTF) method to train the model efficiently. Compared with other models, ITANet is more effective in forecasting new confirmed COVID-19 cases. The importance of government interventions against COVID-19 is further inferred by the Temporal Covariate Interpreter (TCI) of the model.
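The architecture described above (LSTM encoder-decoder with multi-head attention over the encoded history, fed both historical and known/pseudo future covariates) can be illustrated with a minimal PyTorch sketch. The module names, layer sizes, and the way the components are composed below are assumptions for illustration only, not the authors' reference implementation of ITANet.

```python
# Minimal sketch of an LSTM encoder-decoder with multi-head attention,
# in the spirit of the architecture described in the abstract.
# All names and dimensions are illustrative assumptions.
import torch
import torch.nn as nn


class EncoderDecoderForecaster(nn.Module):
    def __init__(self, n_hist_feats, n_future_feats, d_model=64, n_heads=4):
        super().__init__()
        # LSTM encoder extracts temporal features from historical covariates.
        self.encoder = nn.LSTM(n_hist_feats, d_model, batch_first=True)
        # LSTM decoder consumes a priori known (or predicted "pseudo") future covariates.
        self.decoder = nn.LSTM(n_future_feats, d_model, batch_first=True)
        # Multi-head attention lets each decoder step attend over the full
        # encoded history, capturing long-term dependencies.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 1)  # one forecast value per future step

    def forward(self, hist_x, future_x):
        # hist_x:   (batch, T_hist, n_hist_feats)
        # future_x: (batch, T_pred, n_future_feats)
        enc_out, (h, c) = self.encoder(hist_x)
        dec_out, _ = self.decoder(future_x, (h, c))
        ctx, _ = self.attn(query=dec_out, key=enc_out, value=enc_out)
        return self.head(dec_out + ctx).squeeze(-1)  # (batch, T_pred)


# Usage: forecast 7 days ahead from 28 days of history.
model = EncoderDecoderForecaster(n_hist_feats=10, n_future_feats=5)
hist = torch.randn(32, 28, 10)
future = torch.randn(32, 7, 5)
print(model(hist, future).shape)  # torch.Size([32, 7])
```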
Identifiers
pubmed: 35281183
doi: 10.1016/j.asoc.2022.108691
pii: S1568-4946(22)00154-5
pmc: PMC8905883
Publication types
Journal Article
Languages
eng
Pagination
108691
Copyright information
© 2022 Elsevier B.V. All rights reserved.
Conflict of interest statement
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.