Deep learning-based segmentation of multisite disease in ovarian cancer.
Deep learning
Omentum
Ovarian Neoplasms
Pelvis
Tomography (x-ray computed)
Journal
European radiology experimental
ISSN: 2509-9280
Abbreviated title: Eur Radiol Exp
Country: England
NLM ID: 101721752
Publication information
Publication date: 2023-12-07
History:
received: 2022-09-21
accepted: 2023-09-21
medline: 2023-12-11
pubmed: 2023-12-07
entrez: 2023-12-06
Status: epublish
Abstract
To determine whether pelvic/ovarian and omental lesions of ovarian cancer can be reliably segmented on computed tomography (CT) using fully automated deep learning-based methods. A deep learning model for the two most common disease sites of high-grade serous ovarian cancer lesions (pelvis/ovaries and omentum) was developed and compared against the well-established "no-new-Net" framework and unrevised trainee radiologist segmentations. A total of 451 CT scans collected from four different institutions were used for training (n = 276), evaluation (n = 104) and testing (n = 71) of the methods. Performance was evaluated using the Dice similarity coefficient (DSC) and compared using a Wilcoxon test. Our model outperformed no-new-Net for the pelvic/ovarian lesions in cross-validation and on the evaluation and test sets by a significant margin (p values being 4 × 10…).

Automated ovarian cancer segmentation on CT scans using deep neural networks is feasible and achieves performance close to that of a trainee-level radiologist for pelvic/ovarian lesions. Automated segmentation of ovarian cancer may be used by clinicians for CT-based volumetric assessments and by researchers for building complex analysis pipelines.

• The first automated approach for pelvic/ovarian and omental ovarian cancer lesion segmentation on CT images has been presented.
• Automated segmentation of ovarian cancer lesions can be comparable with manual segmentation by trainee radiologists.
• Careful hyperparameter tuning can yield models that significantly outperform strong state-of-the-art baselines.
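The evaluation described in the abstract rests on two standard tools: the Dice similarity coefficient for scoring each predicted mask against a reference mask, and a Wilcoxon signed-rank test for the paired comparison of per-case scores between methods. The sketch below is illustrative only, not the authors' code; the function name and the per-case score values are hypothetical, and the DSC definition when both masks are empty is an assumption.

```python
import numpy as np
from scipy.stats import wilcoxon


def dice_similarity_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient (DSC) between two binary masks.

    DSC = 2 * |A ∩ B| / (|A| + |B|); by convention here, 1.0 when both masks are empty.
    """
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0
    return 2.0 * np.logical_and(pred, truth).sum() / denom


# Hypothetical per-case DSC scores for two segmentation methods on the same test cases.
dsc_method_a = np.array([0.82, 0.75, 0.88, 0.69, 0.91, 0.80, 0.77])
dsc_method_b = np.array([0.78, 0.71, 0.85, 0.66, 0.89, 0.79, 0.72])

# Paired, non-parametric comparison of the per-case scores.
statistic, p_value = wilcoxon(dsc_method_a, dsc_method_b)
print(f"Wilcoxon statistic = {statistic:.2f}, p = {p_value:.4f}")
```

In practice the per-case DSC values would be aggregated (e.g., as a median over the test set) for reporting, with the paired test applied to the full set of per-case scores.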
Identifiers
pubmed: 38057616
doi: 10.1186/s41747-023-00388-z
pii: 10.1186/s41747-023-00388-z
pmc: PMC10700248
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Research Support, N.I.H., Extramural
Languages
eng
Citation subsets
IM
Pagination
77
Grants
Agency: Wellcome Trust
ID: RG98755
Country: United Kingdom
Agency: Cancer Research UK
ID: 15601
Country: United Kingdom
Agency: Wellcome Trust
Country: United Kingdom
Agency: Cancer Research UK
ID: 22905
Country: United Kingdom
Agency: Wellcome Trust
ID: EP/R511675/1
Country: United Kingdom
Agency: NCI NIH HHS
ID: 75N91019D00024
Country: United States
Copyright information
© 2023. The Author(s).