A novel hybrid framework based on temporal convolution network and transformer for network traffic prediction.
Journal
PloS one
ISSN: 1932-6203
Abbreviated title: PLoS One
Country: United States
NLM ID: 101285081
Publication information
Publication date: 2023
History:
received: 2022-12-04
accepted: 2023-07-07
medline: 2023-09-11
pubmed: 2023-09-08
entrez: 2023-09-08
Status:
epublish
Abstract
BACKGROUND
Accurately predicting mobile network traffic helps mobile network operators allocate resources more rationally and provide stable, fast network services to users. However, network traffic is difficult to predict accurately because of its burstiness and uncertainty.
METHODOLOGY
Considering the spatio-temporal correlation of network traffic, we propose a deep-learning model for time-series prediction, the Convolutional Block Attention Module (CBAM) Spatio-Temporal Convolutional Network-Transformer, which combines a CBAM attention mechanism, a Temporal Convolutional Network (TCN), and a Transformer with a sparse self-attention mechanism. The model extracts spatio-temporal features of network traffic for prediction. First, we improved the TCN to capture spatial information and added the CBAM attention mechanism; we call this component CSTCN. It emphasizes the important temporal and spatial features in network traffic. Second, a Transformer with a sparse self-attention mechanism was used to extract further spatio-temporal features. Comparison with the baselines showed that these components significantly improved prediction accuracy. We conducted experiments on a real network traffic dataset from the city of Milan.
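The abstract gives no implementation details (layer sizes, the exact sparse-attention variant, or how the Milan grid is encoded), so the following is only a minimal, hypothetical PyTorch sketch of a CSTCN-Transformer-style pipeline: stacked dilated TCN blocks, a CBAM-style attention module, and a standard dense Transformer encoder standing in for the paper's sparse self-attention. All module names and hyperparameters here are illustrative assumptions, not the authors' code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CBAM1d(nn.Module):
    # CBAM-style attention for 1-D feature maps: channel attention followed by
    # attention over the time axis.
    def __init__(self, channels, reduction=4, kernel_size=7):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
        )
        self.conv = nn.Conv1d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):                                    # x: (B, C, T)
        gate = torch.sigmoid(self.mlp(x.mean(dim=2)) + self.mlp(x.amax(dim=2)))
        x = x * gate.unsqueeze(-1)                           # channel attention
        stats = torch.cat([x.mean(dim=1, keepdim=True),
                           x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.conv(stats))           # temporal attention

class TCNBlock(nn.Module):
    # Dilated causal convolution with a residual connection.
    def __init__(self, channels, dilation, kernel_size=3):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):
        return torch.relu(self.conv(F.pad(x, (self.pad, 0)))) + x

class CSTCNTransformer(nn.Module):
    def __init__(self, in_channels=1, hidden=32, n_blocks=3, n_heads=4, n_layers=2):
        super().__init__()
        self.inp = nn.Conv1d(in_channels, hidden, kernel_size=1)
        self.tcn = nn.Sequential(*[TCNBlock(hidden, 2 ** i) for i in range(n_blocks)])
        self.cbam = CBAM1d(hidden)
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=n_heads,
                                           dim_feedforward=2 * hidden, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(hidden, 1)                      # next traffic value

    def forward(self, x):                                     # x: (B, C, T)
        h = self.cbam(self.tcn(self.inp(x)))                  # CSTCN-style features
        h = self.encoder(h.transpose(1, 2))                   # (B, T, hidden)
        return self.head(h[:, -1])                            # one-step forecast

model = CSTCNTransformer()
demo = torch.randn(8, 1, 48)      # 8 series, 48 past time steps
print(model(demo).shape)          # torch.Size([8, 1])

The ordering (TCN and CBAM before the Transformer encoder) mirrors the pipeline described in the abstract; the actual spatial handling of the Milan grid and the sparse-attention formulation would differ in the published model.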
RESULTS
The results showed that, on the test sets, CSTCN-Transformer reduced the mean square error (MSE) of the predictions by 65.16%, 64.97%, and 60.26%, and the mean absolute error (MAE) by 51.36%, 53.10%, and 38.24%, compared with CSTCN, a Long Short-Term Memory (LSTM) network, and Transformer, respectively, which justifies the model design presented in this paper.
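The percentages read as relative reductions with respect to each baseline. Assuming that interpretation, a short sketch of how such figures are computed; the values below are made up for illustration and are not the Milan measurements.

import numpy as np

def mse(y_true, y_pred):
    return float(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

def mae(y_true, y_pred):
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))

y_true     = [10.0, 12.0, 11.0, 15.0]    # hypothetical traffic volumes
y_baseline = [ 9.0, 14.0, 10.0, 13.0]    # e.g. a baseline model's predictions
y_hybrid   = [10.2, 12.5, 10.8, 14.6]    # e.g. the hybrid model's predictions

mse_drop = 100.0 * (mse(y_true, y_baseline) - mse(y_true, y_hybrid)) / mse(y_true, y_baseline)
mae_drop = 100.0 * (mae(y_true, y_baseline) - mae(y_true, y_hybrid)) / mae(y_true, y_baseline)
print(f"MSE reduction: {mse_drop:.2f}%, MAE reduction: {mae_drop:.2f}%")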
Identifiers
pubmed: 37682829
doi: 10.1371/journal.pone.0288935
pii: PONE-D-22-32685
pmc: PMC10490908
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subset
IM
Pagination
e0288935
Copyright information
Copyright: © 2023 Zhang et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Conflict of interest statement
The authors have declared that no competing interests exist.