Multi-modal analysis of infant cry types characterization: Acoustics, body language and brain signals.


Journal

Computers in biology and medicine
ISSN: 1879-0534
Abbreviated title: Comput Biol Med
Country: United States
NLM ID: 1250250

Publication information

Publication date:
Dec 2023
History:
received: 2023-07-04
revised: 2023-09-14
accepted: 2023-10-23
medline: 2023-11-27
pubmed: 2023-11-03
entrez: 2023-11-02
Status: ppublish

Abstract

Infant crying is a baby's first means of communication during the initial months of life. Misunderstanding the cry message can compromise infant care and the infant's future neurodevelopment. An exploratory study was conducted collecting multimodal data (i.e., crying, electroencephalography (EEG), near-infrared spectroscopy (NIRS), facial expressions, and body movements) from 38 healthy full-term newborns. Cry types were defined based on different conditions (i.e., hunger, sleepiness, fussiness, need to burp, and distress). Statistical analysis, Machine Learning (ML), and Deep Learning (DL) techniques were used to identify relevant features for cry type classification and to evaluate a robust DL algorithm named Acoustic MultiStage Interpreter (AMSI). Significant differences were found across cry types based on acoustics, EEG, NIRS, facial expressions, and body movements. Acoustics and body language were identified as the most relevant ML features for determining the cause of crying. The DL AMSI algorithm achieved an accuracy rate of 92%. This study sets a precedent for cry analysis research by highlighting the complexity of newborn cry expression and strengthening the potential use of infant cry analysis as an objective, reliable, accessible, and non-invasive tool for cry interpretation, improving the infant-parent relationship and supporting family well-being.


Identifiers

pubmed: 37918262
pii: S0010-4825(23)01091-0
doi: 10.1016/j.compbiomed.2023.107626

Publication types

Journal Article

Languages

eng

Citation subsets

IM

Pagination

107626

Copyright information

Copyright © 2023 The Authors. Published by Elsevier Ltd. All rights reserved.

Conflict of interest statement

Declaration of competing interest: The authors declare competing interests (funding, employment, or confidentiality interests) in relation to the work described herein. Ana Laguna, Sandra Pusil, Àngel Bazán and Paolo Piras are employed by Zoundream AG. Ana Laguna is also a co-founder of the company and owns stock in Zoundream AG. Silvia Orlandi, Alexandra Pardos Véglia and Jonathan Adrian Zegarra-Valdivia receive compensation for their collaboration as members of the scientific advisory board of Zoundream AG. Clàudia Palomares' salary is funded by Zoundream AG through Fundació Clínic. Anna Lucia Paltrinieri and Oscar Garcia-Algar declare no potential conflict of interest.

Authors

Ana Laguna (A)

Zoundream AG, Switzerland. Electronic address: ana.laguna@zoundream.com.

Sandra Pusil (S)

Zoundream AG, Switzerland.

Àngel Bazán (À)

Zoundream AG, Switzerland.

Jonathan Adrián Zegarra-Valdivia (JA)

Global Brain Health Institute, University of California, San Francisco, CA, USA; Achucarro Basque Center for Neuroscience, Leioa, Spain; Universidad Señor de Sipán, Chiclayo, Peru.

Anna Lucia Paltrinieri (AL)

Neonatology Unit, Hospital Clínic-Maternitat, ICGON, BCNatal, 08028, Barcelona, Spain.

Paolo Piras (P)

Zoundream AG, Switzerland.

Clàudia Palomares I Perera (C)

Neonatology Unit, Hospital Clínic-Maternitat, ICGON, BCNatal, 08028, Barcelona, Spain.

Alexandra Pardos Véglia (A)

Centro de Neuropsicología Alexandra Pardos, Madrid, Spain.

Oscar Garcia-Algar (O)

Neonatology Unit, Hospital Clínic-Maternitat, ICGON, BCNatal, 08028, Barcelona, Spain; Department de Cirurgia I Especialitats Mèdico-quirúrgiques, Universitat de Barcelona, 08036, Barcelona, Spain.

Silvia Orlandi (S)

Department of Electrical, Electronic and Information Engineering "Guglielmo Marconi"(DEI), University of Bologna, Bologna, Italy; Health Sciences and Technologies Interdepartmental Center for Industrial Research (CIRI-SDV), University of Bologna, Bologna, Italy.

Similar articles

[Redispensing of expensive oral anticancer medicines: a practical application].

Lisanne N van Merendonk, Kübra Akgöl, Bastiaan Nuijen

Smoking Cessation and Incident Cardiovascular Disease.

Jun Hwan Cho, Seung Yong Shin, Hoseob Kim et al.

MeSH classifications