Deep learning improves image quality and radiomics reproducibility for high-speed four-dimensional computed tomography reconstruction.
Keywords: 4DCT; Deep learning; Imaging quality; Radiomics; Radiotherapy; Reproducibility
Journal
Radiotherapy and oncology : journal of the European Society for Therapeutic Radiology and Oncology
ISSN: 1879-0887
Abbreviated title: Radiother Oncol
Country: Ireland
ID NLM: 8407192
Publication information
Publication date: May 2022
History:
Received: 2022-01-24
Revised: 2022-02-24
Accepted: 2022-02-25
PubMed: 2022-03-09
MEDLINE: 2022-05-20
Entrez: 2022-03-08
Status: ppublish
Abstract
BACKGROUND AND PURPOSE
Hybrid iterative reconstruction (HIR) is the most commonly used algorithm for four-dimensional computed tomography (4DCT) reconstruction due to its high speed. However, its image quality is inferior to that of model-based iterative reconstruction (MIR). Different reconstruction methods also affect the stability of radiomics features. Herein, we developed a deep learning method to improve the image quality and radiomics reproducibility of high-speed reconstruction.
MATERIALS AND METHODS
The 4DCT images of 70 patients were reconstructed using both the HIR and MIR algorithms. A cycle-consistent adversarial network was adopted to learn the mapping from HIR to MIR and to generate synthetic MIR (sMIR) images from HIR images. Performance was evaluated on a held-out testing set of 10 patients.
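The cycle-consistency objective at the core of a cycle-consistent adversarial network can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the generator names `G_hm` (HIR→sMIR) and `G_mh` (MIR→HIR) are hypothetical stand-ins, and in practice the generators are deep CNNs trained jointly with adversarial discriminators.

```python
import numpy as np

def cycle_consistency_loss(G_hm, G_mh, hir_batch, mir_batch):
    """L1 cycle loss: mapping HIR -> sMIR -> HIR should recover the
    original HIR images, and likewise MIR -> sHIR -> MIR.

    G_hm, G_mh : callables mapping image batches between the two domains
                 (hypothetical stand-ins for trained CNN generators).
    """
    forward = np.abs(G_mh(G_hm(hir_batch)) - hir_batch).mean()
    backward = np.abs(G_hm(G_mh(mir_batch)) - mir_batch).mean()
    return forward + backward
```

In the full training objective this term is added to the adversarial losses of both discriminators; it constrains the generators so that anatomical content is preserved while only the reconstruction-style characteristics (noise, texture) are translated.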
RESULTS
The total reconstruction times for the HIR, MIR, and proposed sMIR images were approximately 2.5, 15, and 3.1 min, respectively. The quality of sMIR images was close to that of MIR images and superior to that of HIR images, with noise reduced by 45-77% and contrast-to-noise ratio improved by 91-296%. The concordance correlation coefficient (CCC) of radiomic features improved from 0.89 ± 0.15 for HIR to 0.97 ± 0.07 for the proposed sMIR. The percentage of reproducible features (CCC ≥ 0.85) increased from 76.08% for HIR to 95.86% for sMIR, an improvement of 19.78%.
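The reproducibility metric used above, Lin's concordance correlation coefficient, has a closed form that can be computed directly; a minimal sketch (the function name is illustrative, not from the paper):

```python
import numpy as np

def concordance_correlation(x, y):
    """Lin's concordance correlation coefficient between two sets of
    measurements of the same radiomic feature (e.g. the feature values
    from two reconstruction algorithms across the same patients).

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()              # population variances
    cov = ((x - mx) * (y - my)).mean()     # population covariance
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)
```

Unlike the Pearson correlation, the CCC penalizes both location and scale shifts between the two measurement sets, so it reaches 1 only when the two reconstructions yield identical feature values.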
CONCLUSION
Compared with the existing HIR algorithm, the proposed method improves the image quality and radiomics reproducibility of 4DCT images under high-speed reconstruction. It is computationally efficient and has the potential to be integrated into any CT system.
Identifiers
pubmed: 35257852
pii: S0167-8140(22)00119-0
doi: 10.1016/j.radonc.2022.02.034
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination
184-189
Copyright information
Copyright © 2022 Elsevier B.V. All rights reserved.