The nnU-Net based method for automatic segmenting fetal brain tissues.
FeTA challenge
Fetal brain tissue segmentation
Image automatic segmentation
Magnetic resonance image segmentation
nnU-Net
Journal
Health information science and systems
ISSN: 2047-2501
Abbreviated title: Health Inf Sci Syst
Country: England
ID NLM: 101638060
Publication information
Publication date: Dec 2023
History:
received: 24 Dec 2022
accepted: 09 Mar 2023
pmc-release: 01 Dec 2024
medline: 01 Apr 2023
entrez: 31 Mar 2023
pubmed: 01 Apr 2023
Status: epublish
Abstract
Magnetic resonance (MR) images of fetuses make it possible for doctors to detect pathological fetal brains at early stages. Brain tissue segmentation is a prerequisite for analyses of brain morphology and volume. nnU-Net is an automatic segmentation method based on deep learning that configures itself adaptively, adjusting its preprocessing, network architecture, training, and post-processing to a specific task. We therefore adapt nnU-Net to segment seven types of fetal brain tissue: external cerebrospinal fluid, gray matter, white matter, ventricles, cerebellum, deep gray matter, and brainstem. In view of the characteristics of the FeTA 2021 data, some adjustments are made to the original nnU-Net so that it segments the seven tissue types as precisely as possible. Average segmentation results on the FeTA 2021 training data demonstrate that our advanced nnU-Net is superior to peer methods including SegNet, CoTr, AC U-Net, and ResUnet, achieving 0.842, 11.759, and 0.957 in terms of the Dice, HD95, and VS criteria. Moreover, experimental results on the FeTA 2021 test data further demonstrate good segmentation performance of 0.774, 14.699, and 0.875 in terms of Dice, HD95, and VS, ranking third in the FeTA 2021 challenge. Our advanced nnU-Net accomplishes the task of segmenting fetal brain tissues from MR images of different gestational ages, which can help doctors make correct and timely diagnoses.
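Two of the three criteria reported in the abstract, Dice and volume similarity (VS), can be illustrated with a short per-label sketch. This is a plain reference implementation under the standard definitions, not the FeTA challenge's evaluation code; the function names are ours, and the segmentations are flattened label arrays. (HD95, the third criterion, is a boundary-distance metric and is usually computed with a dedicated imaging library.)

```python
def dice(pred, gt, label):
    """Dice overlap for one tissue label: 2|P∩G| / (|P|+|G|); 1.0 = perfect."""
    p = [v == label for v in pred]
    g = [v == label for v in gt]
    inter = sum(1 for a, b in zip(p, g) if a and b)
    denom = sum(p) + sum(g)
    # Convention: if the label is absent from both volumes, score 1.0.
    return 2.0 * inter / denom if denom else 1.0

def volume_similarity(pred, gt, label):
    """VS = 1 - |Vp - Vg| / (Vp + Vg); compares volumes only, not overlap."""
    vp = sum(1 for v in pred if v == label)
    vg = sum(1 for v in gt if v == label)
    return 1.0 - abs(vp - vg) / (vp + vg) if (vp + vg) else 1.0

# Toy example with two tissue labels (1 and 2) over four voxels.
gt = [1, 1, 2, 0]
pred = [1, 0, 2, 0]
print(dice(pred, gt, 1))               # 2*1 / (1+2) = 0.666...
print(volume_similarity(pred, gt, 1))  # 1 - 1/3    = 0.666...
```

The per-tissue scores are typically averaged over the seven labels to obtain the aggregate figures such as those quoted above. Note that VS can be 1.0 even when Dice is low, since it only compares segment volumes, not their spatial agreement.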
Identifiers
pubmed: 36998806
doi: 10.1007/s13755-023-00220-3
pii: 220
pmc: PMC10043149
Publication types
Journal Article
Languages
eng
Pagination: 17
Copyright information
© The Author(s), under exclusive licence to Springer Nature Switzerland AG 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
Conflict of interest statement
Conflict of interest: There is no conflict of interest in the manuscript.
References
Health Inf Sci Syst. 2019 Oct 12;7(1):21
pubmed: 31656594
Sci Data. 2021 Jul 6;8(1):167
pubmed: 34230489
Med Image Anal. 2012 Dec;16(8):1550-64
pubmed: 22939612
Front Neurosci. 2020 Dec 02;14:591683
pubmed: 33343286
Med Image Comput Comput Assist Interv. 2008;11(Pt 1):351-8
pubmed: 18979766
Neuroimage. 2010 Nov 1;53(2):460-70
pubmed: 20600970
Magn Reson Imaging. 2019 Dec;64:77-89
pubmed: 31181246
IEEE Trans Med Imaging. 2021 Apr;40(4):1123-1133
pubmed: 33351755
Neuroimage. 2015 Sep;118:584-97
pubmed: 26072252
Nat Methods. 2021 Feb;18(2):203-211
pubmed: 33288961
Health Inf Sci Syst. 2020 Oct 8;8(1):32
pubmed: 33088488
Health Inf Sci Syst. 2022 May 20;10(1):9
pubmed: 35607433
Neuroimage. 2018 Apr 15;170:231-248
pubmed: 28666878
Health Inf Sci Syst. 2020 Oct 12;8(1):33
pubmed: 33088489
J Community Genet. 2018 Oct;9(4):335-340
pubmed: 30229538
Nat Methods. 2019 Jan;16(1):67-70
pubmed: 30559429
Med Image Anal. 2023 Aug;88:102833
pubmed: 37267773
IEEE Trans Pattern Anal Mach Intell. 2017 Dec;39(12):2481-2495
pubmed: 28060704
IEEE Trans Pattern Anal Mach Intell. 2018 Apr;40(4):834-848
pubmed: 28463186