Children with autism spectrum disorder produce more ambiguous and less socially meaningful facial expressions: an experimental study using random forest classifiers.
Keywords
Algorithm
Autism spectrum disorder
Emotion
Facial expressions
Journal
Molecular autism
ISSN: 2040-2392
Abbreviated title: Mol Autism
Country: England
NLM ID: 101534222
Publication information
Publication date: 2020
History:
received: 2019-08-08
accepted: 2020-01-01
entrez: 2020-01-21
pubmed: 2020-01-21
medline: 2021-01-01
Status: epublish
Abstract
Background
Computer vision combined with human annotation could offer a novel method for exploring facial expression (FE) dynamics in children with autism spectrum disorder (ASD).
Methods
We recruited 157 children with typical development (TD) and 36 children with ASD in Paris and Nice to perform two experimental tasks designed to elicit FEs with emotional valence. FEs were assessed both through judges' ratings and with random forest (RF) classifiers. To do so, we located a set of 49 facial landmarks in the task videos, generated a set of geometric and appearance features, and used RF classifiers to explore how children with ASD differed from TD children when producing FEs.
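The landmark-to-classifier pipeline described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: it uses synthetic 49-point landmark sets in place of the real task videos, pairwise inter-landmark distances as a stand-in for the geometric features (the appearance features are omitted), and scikit-learn's `RandomForestClassifier`.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
N_LANDMARKS = 49  # as in the study's landmark set

def geometric_features(landmarks):
    """Pairwise Euclidean distances between landmarks: one simple
    geometric feature set (the paper's exact features may differ)."""
    diffs = landmarks[:, None, :] - landmarks[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    iu = np.triu_indices(N_LANDMARKS, k=1)
    return dists[iu]  # 49*48/2 = 1176 features

# Simulate two expression classes by rescaling a base face shape.
base = rng.normal(size=(N_LANDMARKS, 2))
X, y = [], []
for label, scale in [(0, 1.0), (1, 1.3)]:
    for _ in range(100):
        face = base * scale + rng.normal(scale=0.05, size=(N_LANDMARKS, 2))
        X.append(geometric_features(face))
        y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
```

On these cleanly separable synthetic classes the classifier reaches near-perfect accuracy; with real FE videos, class overlap is what makes metrics such as per-emotion accuracy and confusion matrices informative.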
Results
In multivariate models that included other factors known to predict FEs (age, gender, intelligence quotient, emotion subtype, cultural background), ratings from expert raters showed that children with ASD had more difficulty producing FEs than TD children. In addition, when we examined RF classifier performance, we found that the classification tasks, except the one for sadness, were highly accurate, and that RF classifiers needed more facial landmarks to achieve their best classification for children with ASD. Confusion matrices showed that when RF classifiers were tested on children with ASD, anger was often confused with happiness.
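For illustration only, here is how such an anger/happiness confusion shows up in a confusion matrix. The labels and counts below are made up, not the paper's data; the matrix is computed with scikit-learn's `confusion_matrix`.

```python
from sklearn.metrics import confusion_matrix

labels = ["happiness", "anger", "sadness"]
# Hypothetical predictions: true anger is frequently predicted as happiness.
y_true = ["anger"] * 10 + ["happiness"] * 10 + ["sadness"] * 10
y_pred = (["happiness"] * 4 + ["anger"] * 6      # 4/10 anger clips misread
          + ["happiness"] * 9 + ["anger"] * 1
          + ["sadness"] * 7 + ["happiness"] * 3)

cm = confusion_matrix(y_true, y_pred, labels=labels)
# Row i = true label, column j = predicted label (in `labels` order),
# so cm[1, 0] counts true-anger clips predicted as happiness (4 here).
print(cm)
```

Off-diagonal cells like `cm[1, 0]` are exactly what reveals the anger-to-happiness ambiguity reported for the ASD group.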
Limitations
The sample of children with ASD was smaller than the sample of TD children. By running several control calculations, we attempted to compensate for this limitation.
Conclusion
Children with ASD have more difficulty producing socially meaningful FEs. The computer vision methods we used to explore FE dynamics also highlight that the production of FEs in children with ASD carries more ambiguity.
Identifiers
pubmed: 31956394
doi: 10.1186/s13229-020-0312-2
pii: 312
pmc: PMC6958757
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination
5
Copyright information
© The Author(s). 2020.
Conflict of interest statement
Competing interests: The authors declare that they have no competing interests.