Kidney Segmentation from Dynamic Contrast-Enhanced Magnetic Resonance Imaging Integrating Deep Convolutional Neural Networks and Level Set Methods.
DCE-MRI
U-Net
kidney segmentation
level set
Journal
Bioengineering (Basel, Switzerland)
ISSN: 2306-5354
Abbreviated title: Bioengineering (Basel)
Country: Switzerland
NLM ID: 101676056
Publication information
Publication date: 24 Jun 2023
History:
received: 16 May 2023
revised: 20 Jun 2023
accepted: 21 Jun 2023
medline: 29 Jul 2023
pubmed: 29 Jul 2023
entrez: 29 Jul 2023
Status: epublish
Abstract
The dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) technique plays a significant and growing role in the diagnosis and treatment of patients with chronic kidney disease. Accurate segmentation of the kidneys from DCE-MRI scans is an essential early step towards the evaluation of kidney function. Recently, deep convolutional neural networks have gained popularity in medical image segmentation. To this end, in this paper, we propose a new, fully automated two-phase approach that integrates convolutional neural networks and level set methods to delineate kidneys in DCE-MRI scans. We first develop two convolutional neural networks based on the U-Net architecture (UNT) to predict a kidney probability map for DCE-MRI scans. Then, to improve segmentation performance, the pixel-wise kidney probability map predicted by the deep model is combined with shape prior information in a level set method to guide the contour evolution towards the target kidney. Real DCE-MRI datasets of 45 subjects are used for training, validating, and testing the proposed approach. The evaluation results demonstrate the high performance of the two-phase approach, which achieves a Dice similarity coefficient of 0.95 ± 0.02, an intersection over union of 0.91 ± 0.03, and a 95% Hausdorff distance of 1.54 ± 1.6. Our extensive experiments confirm the potential and effectiveness of this approach over both UNT models and numerous recent level set-based methods.
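The second phase described above, in which a CNN-predicted probability map drives a level set contour, can be sketched in simplified form. The snippet below is a hypothetical illustration, not the paper's exact formulation (the authors additionally incorporate a learned shape prior): a level set function is evolved under a regional force derived from the probability map plus a smoothness term, and the reported Dice metric is computed on the result. All function names, parameter values, and the synthetic probability map are assumptions for illustration.

```python
import numpy as np

def evolve_level_set(prob_map, n_iters=100, dt=0.5, lam=1.0, mu=0.2):
    """Toy level set evolution guided by a pixel-wise kidney probability map.

    The regional force pushes the contour outward where the CNN assigns
    high kidney probability (> 0.5) and inward elsewhere; a Laplacian
    term approximates curvature regularization for a smooth boundary.
    """
    # Initialize the level set from the thresholded probability map.
    phi = np.where(prob_map > 0.5, 1.0, -1.0)
    for _ in range(n_iters):
        # Regional force: positive where kidney probability is high.
        force = lam * (prob_map - 0.5)
        # Smoothness term via a simple 5-point Laplacian approximation.
        lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
               + np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4.0 * phi)
        phi = phi + dt * (force + mu * lap)
    return phi > 0  # final binary segmentation mask

def dice_coefficient(a, b):
    """Dice similarity coefficient between two binary masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

# Toy usage with a synthetic Gaussian "probability map" standing in for
# the CNN output on a 64x64 scan.
yy, xx = np.mgrid[0:64, 0:64]
prob = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 10.0 ** 2))
mask = evolve_level_set(prob)
reference = prob > 0.5  # stand-in ground truth for the toy example
score = dice_coefficient(mask, reference)
```

In this simplified setting the contour simply converges to the high-probability region; the paper's contribution lies in coupling this evolution with a shape prior so that the contour is also attracted towards anatomically plausible kidney shapes.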
Identifiers
pubmed: 37508782
pii: bioengineering10070755
doi: 10.3390/bioengineering10070755
pmc: PMC10375962
Publication types
Journal Article
Languages
eng
Grants
Agency: This research is supported by the Science and Technology Development Fund (STDF), Egypt (grant USC 17:253), and partially funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number PNURSP2023R40.
ID: Grant USC 17:253 and Project number PNURSP2023R40