3D point cloud
anthropometry
deep learning
neural networks
structure from motion
Journal
Journal of Imaging
ISSN: 2313-433X
Abbreviated title: J Imaging
Country: Switzerland
NLM ID: 101698819
Publication information
Publication date: 11 Sep 2020
History:
received: 31 Jul 2020
revised: 23 Aug 2020
accepted: 31 Aug 2020
entrez: 30 Aug 2021
pubmed: 31 Aug 2021
medline: 31 Aug 2021
Status: epublish
Abstract
Current photogrammetry-based point cloud extraction methods generate large numbers of spurious detections that hamper useful 3D mesh reconstruction or, worse, preclude adequate measurements. Moreover, noise-removal methods for point clouds are complex, slow, and incapable of coping with semantic noise. In this work, we present body2vec, a model-based body segmentation tool that uses a specifically trained neural network architecture. Body2vec performs human body point cloud reconstruction from videos taken on hand-held devices (smartphones or tablets), achieving high-quality anthropometric measurements. The main contribution of the proposed workflow is a background-removal step, which avoids the spurious point generation that is usual in photogrammetric reconstruction. A group of 60 persons was filmed with a smartphone, and the corresponding point clouds were obtained automatically with standard photogrammetric methods. As a 3D silver standard, we used clean meshes obtained at the same time with LiDAR sensors, post-processed and noise-filtered by expert anthropological biologists. As a gold standard, we used anthropometric measurements of the waist and hip of the same people, taken by expert anthropometrists. Applying our method to the raw videos significantly enhanced the quality of the resulting point cloud as compared with the LiDAR-based mesh, and of the anthropometric measurements as compared with the actual hip and waist perimeters measured by the anthropometrists. In both contexts, the quality of body2vec is equivalent to that of the LiDAR reconstruction.
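The workflow's core step, as described above, is to mask out the background in each video frame before the photogrammetric (structure-from-motion) reconstruction, so that no background pixels can spawn spurious points. The following is a minimal sketch of that masking stage, not the paper's implementation: `dummy_person_mask` is a hypothetical stand-in for the trained segmentation network, and the SfM stage itself is omitted.

```python
import numpy as np

def mask_background(frame: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Zero out background pixels so SfM feature extraction only
    sees the body.

    frame: H x W x 3 uint8 image; mask: H x W boolean (True = person).
    """
    out = frame.copy()
    out[~mask] = 0  # background pixels removed before reconstruction
    return out

def dummy_person_mask(frame: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for the trained segmentation network:
    here we simply mark the central region of the frame as 'person'."""
    h, w = frame.shape[:2]
    mask = np.zeros((h, w), dtype=bool)
    mask[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4] = True
    return mask

# Example: mask one synthetic 8x8 gray frame
frame = np.full((8, 8, 3), 200, dtype=np.uint8)
mask = dummy_person_mask(frame)
clean = mask_background(frame, mask)
```

In the actual pipeline, each masked frame would then be passed unchanged to the standard photogrammetric reconstruction, which only triangulates points from the surviving (body) pixels.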
Identifiers
pubmed: 34460751
pii: jimaging6090094
doi: 10.3390/jimaging6090094
pmc: PMC8321063
Publication types
Journal Article
Languages
eng
Grants
Agency: Consejo Nacional de Investigaciones Científicas y Técnicas
ID: 11220150100878CO
References
Br J Nutr. 1999 Sep;82(3):165-77
pubmed: 10655963
Comput Biol Med. 2018 Oct 1;101:112-119
pubmed: 30125785
PLoS One. 2014 May 15;9(5):e97846
pubmed: 24830292
Sensors (Basel). 2015 Feb 04;15(2):3593-609
pubmed: 25658392
Am J Hum Biol. 1992;4(2):253-263
pubmed: 28524349
Am J Hum Biol. 2020 Mar;32(2):e23323
pubmed: 31506993
Biomed Res Int. 2015;2015:404261
pubmed: 26413519
Am J Hum Biol. 2014 Mar-Apr;26(2):156-63
pubmed: 24554284
PLoS One. 2016 Dec 28;11(12):e0168585
pubmed: 28030627
PLoS One. 2014 Sep 17;9(9):e107212
pubmed: 25229394
Z Orthop Ihre Grenzgeb. 2002 Nov-Dec;140(6):632-6
pubmed: 12476386
Obes Open Access. 2016 Nov;2(3):
pubmed: 28042607
Am J Hum Biol. 2019 Sep;31(5):e23278
pubmed: 31237064
PLoS One. 2015 Mar 06;10(3):e0119430
pubmed: 25749283
Ann Med Health Sci Res. 2016 Jan-Feb;6(1):62-3
pubmed: 27144080
World Health Organ Tech Rep Ser. 2000;894:i-xii, 1-253
pubmed: 11234459
Eur J Clin Nutr. 2016 Nov;70(11):1265-1270
pubmed: 27329614