Classification of emotional states via transdermal cardiovascular spatiotemporal facial patterns using multispectral face videos.
Journal
Scientific Reports
ISSN: 2045-2322
Abbreviated title: Sci Rep
Country: England
NLM ID: 101563288
Publication information
Publication date: 2022-07-01
History:
Received: 2021-08-05
Accepted: 2022-06-13
Entrez: 2022-07-01
PubMed: 2022-07-02
MEDLINE: 2022-07-07
Status: epublish
Abstract
We describe a new method for remote emotional state assessment using multispectral face videos, and present our findings: unique transdermal, cardiovascular and spatiotemporal facial patterns associated with different emotional states. The method does not rely on stereotypical facial expressions but utilizes different wavelength sensitivities (visible spectrum, near-infrared, and long-wave infrared) to gauge correlates of autonomic nervous system activity spatially and temporally distributed across the human face (e.g., blood flow, hemoglobin concentration, and temperature). We conducted an experiment where 110 participants viewed 150 short emotion-eliciting videos and reported their emotional experience, while three cameras recorded facial videos with multiple wavelengths. Spatiotemporal multispectral features from the multispectral videos were used as inputs to a machine learning model that was able to classify participants' emotional state (i.e., amusement, disgust, fear, sexual arousal, or no emotion) with satisfactory results (average ROC AUC score of 0.75), while providing feature importance analysis that allows the examination of facial occurrences per emotional state. We discuss findings concerning the different spatiotemporal patterns associated with different emotional states as well as the different advantages of the current method over existing approaches to emotion detection.
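The classification performance above is reported as an average ROC AUC of 0.75 over five emotional states. As an illustrative sketch only (not the authors' code; the class names and helper functions below are hypothetical), a macro-averaged one-vs-rest ROC AUC of the kind reported can be computed from per-class scores using the rank-sum (Mann-Whitney U) formulation:

```python
# Illustrative sketch, not the paper's implementation: macro-averaged
# one-vs-rest ROC AUC, the kind of metric the abstract reports for
# 5-way emotion classification. Pure-Python, no external dependencies.

def roc_auc_binary(labels, scores):
    """ROC AUC for binary labels via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs where the positive
    example receives the higher score (ties count as 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def macro_ovr_auc(y_true, score_matrix, classes):
    """Average one-vs-rest AUC across classes. score_matrix[i][k] is
    the model's score for sample i belonging to classes[k]."""
    aucs = []
    for k, cls in enumerate(classes):
        binary = [1 if y == cls else 0 for y in y_true]
        col = [row[k] for row in score_matrix]
        aucs.append(roc_auc_binary(binary, col))
    return sum(aucs) / len(aucs)

# Toy usage with the paper's five states (scores are made up):
classes = ["amusement", "disgust", "fear", "sexual arousal", "no emotion"]
```

A larger evaluation would typically use `sklearn.metrics.roc_auc_score(..., multi_class="ovr")`, which implements the same one-vs-rest averaging.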
Identifiers
pubmed: 35778591
doi: 10.1038/s41598-022-14808-4
pii: 10.1038/s41598-022-14808-4
pmc: PMC9249872
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination
11188
Comments and corrections
Type: ErratumIn
Copyright information
© 2022. The Author(s).
References
Cogn Emot. 2016 Aug;30(5):827-56
pubmed: 25929696
Biomed Res Int. 2017;2017:8317357
pubmed: 28900626
Proc Natl Acad Sci U S A. 2017 Sep 19;114(38):E7900-E7909
pubmed: 28874542
Biol Lett. 2012 Oct 23;8(5):864-7
pubmed: 22647931
Psychol Sci. 2008 May;19(5):508-14
pubmed: 18466413
Photochem Photobiol. 2017 Nov;93(6):1449-1461
pubmed: 28471473
Sci Rep. 2015 Oct 06;5:14637
pubmed: 26440644
Sensors (Basel). 2019 May 13;19(9):
pubmed: 31086110
Sci Rep. 2018 Jul 12;8(1):10588
pubmed: 30002447
IEEE Trans Biomed Eng. 2013 Oct;60(10):2878-86
pubmed: 23744659
Annu Int Conf IEEE Eng Med Biol Soc. 2007;2007:247-9
pubmed: 18001936
Psychophysiology. 2014 Oct;51(10):951-63
pubmed: 24961292
PLoS One. 2015 Mar 04;10(3):e0118432
pubmed: 25738806
Psychol Sci Public Interest. 2019 Jul;20(1):1-68
pubmed: 31313636
Annu Int Conf IEEE Eng Med Biol Soc. 2020 Jul;2020:4414-4420
pubmed: 33018974
Phys Med Biol. 2001 Aug;46(8):2227-37
pubmed: 11512621
Phys Med Biol. 2006 Mar 7;51(5):N91-8
pubmed: 16481677
Int J Psychophysiol. 2004 Jan;51(2):143-53
pubmed: 14693364
Sci Rep. 2018 May 31;8(1):8501
pubmed: 29855610
J Invest Dermatol. 1981 Jul;77(1):13-9
pubmed: 7252245
Annu Int Conf IEEE Eng Med Biol Soc. 2017 Jul;2017:2333-2336
pubmed: 29060365