Boundary determination of foot ulcer images by applying the associative hierarchical random field framework.

Keywords: conditional random field; diabetic foot ulcer; wound boundary determination; wound image analysis

Journal

Journal of medical imaging (Bellingham, Wash.)
ISSN: 2329-4302
Abbreviated title: J Med Imaging (Bellingham)
Country: United States
ID NLM: 101643461

Publication information

Publication date:
Apr 2019
History:
received: 2018-04-16
accepted: 2019-03-29
entrez: 2019-05-01
pubmed: 2019-05-01
medline: 2019-05-01
Status: ppublish

Abstract

As traditional visual-examination-based methods provide neither reliable nor consistent wound assessment, several computer-based approaches for quantitative wound image analysis have been proposed in recent years. However, these methods require either some level of human interaction for proper image processing or that images be captured under controlled conditions. To become a practical wound-management tool for diabetic patients, a wound image algorithm must correctly locate and delineate the wound boundary in images acquired under less-constrained conditions, where illumination and camera angle can vary within reasonable bounds. We present a wound boundary determination method that is robust to lighting and camera orientation perturbations by applying the associative hierarchical random field (AHRF) framework, an improved conditional random field (CRF) model originally applied to multiscale analysis of natural images. To validate the robustness of the AHRF framework for wound boundary recognition, we tested the method on two image datasets: (1) foot and leg ulcer images (from patients we have tracked for 2 years) captured under one of two conditions: 70% of the dataset was captured with an image capture box to ensure consistent lighting and range, and the remaining 30% was captured with a handheld camera under varied conditions of lighting, incident angle, and range; and (2) moulage wound images captured under similarly varied conditions. Compared to other CRF-based machine learning strategies, our method provides boundary determination accuracy with the best global performance rates (specificity:
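The AHRF builds on the standard CRF energy, which combines per-pixel (unary) label costs with pairwise smoothness penalties between neighboring pixels. As a rough illustration only (not the paper's actual multiscale model, and with hypothetical function and parameter names), a minimal Potts-style CRF energy for a binary wound/background labeling on a pixel grid might be sketched as:

```python
import numpy as np

def crf_energy(labels, unary, pairwise_weight=1.0):
    """Energy of a binary labeling under a simple grid CRF:
    the sum of unary costs for the chosen labels, plus a Potts
    penalty for every 4-connected neighbor pair whose labels differ.

    labels : (H, W) int array of {0, 1} (0 = background, 1 = wound)
    unary  : (H, W, 2) array; unary[i, j, k] is the cost of label k
    """
    h, w = labels.shape
    ii, jj = np.indices((h, w))
    # Unary term: cost of the label actually assigned at each pixel.
    energy = unary[ii, jj, labels].sum()
    # Pairwise Potts term: count label disagreements between
    # horizontal and vertical neighbors.
    energy += pairwise_weight * np.count_nonzero(labels[:, 1:] != labels[:, :-1])
    energy += pairwise_weight * np.count_nonzero(labels[1:, :] != labels[:-1, :])
    return float(energy)
```

Inference then seeks the labeling minimizing this energy (e.g., via graph cuts); the AHRF extends this idea with auxiliary variables over a hierarchy of image segmentations, so that smoothness is enforced at multiple scales rather than only between adjacent pixels.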

Identifiers

pubmed: 31037245
doi: 10.1117/1.JMI.6.2.024002
pii: 18076R
pmc: PMC6475526

Publication types

Journal Article

Languages

eng

Pagination

024002

References

IEEE Trans Med Imaging. 2000 Dec;19(12):1202-10
pubmed: 11212368
Biochim Biophys Acta. 1975 Oct 20;405(2):442-51
pubmed: 1180967
IEEE Trans Med Imaging. 2010 Feb;29(2):410-27
pubmed: 19825516
IEEE Trans Med Imaging. 2011 Feb;30(2):315-26
pubmed: 20875969
IEEE Trans Pattern Anal Mach Intell. 1984 Jun;6(6):721-41
pubmed: 22499653
IEEE Trans Pattern Anal Mach Intell. 2012 Nov;34(11):2274-82
pubmed: 22641706
IEEE Trans Biomed Eng. 2015 Feb;62(2):477-88
pubmed: 25248175
IEEE Trans Pattern Anal Mach Intell. 2014 Jun;36(6):1056-77
pubmed: 26353271
IEEE Trans Biomed Eng. 2017 Sep;64(9):2098-2109
pubmed: 27893380

Authors

Lei Wang (L)

Worcester Polytechnic Institute, Department of Electrical and Computer Engineering, Worcester, Massachusetts, United States.

Peder C Pedersen (PC)

Worcester Polytechnic Institute, Department of Electrical and Computer Engineering, Worcester, Massachusetts, United States.

Emmanuel Agu (E)

Worcester Polytechnic Institute, Department of Computer Science, Worcester, Massachusetts, United States.

Diane Strong (D)

Worcester Polytechnic Institute, Foisie School of Business, Worcester, Massachusetts, United States.

Bengisu Tulu (B)

Worcester Polytechnic Institute, Foisie School of Business, Worcester, Massachusetts, United States.

MeSH classifications