Facial Signals and Social Actions in Multimodal Face-to-Face Interaction.
Keywords: conversation; facial signals; intentions; multimodal communication; questions; responses; social actions; turn-taking
Journal
Brain sciences
ISSN: 2076-3425
Abbreviated title: Brain Sci
Country: Switzerland
ID NLM: 101598646
Publication information
Publication date: 30 Jul 2021
History:
received: 28 May 2021
revised: 07 Jul 2021
accepted: 26 Jul 2021
entrez: 27 Aug 2021
pubmed: 28 Aug 2021
medline: 28 Aug 2021
Status: epublish
Abstract
In a conversation, recognising the speaker's social action (e.g., a request) early may help potential next speakers understand the intended message quickly and plan a timely response. Human language is multimodal, and several studies have demonstrated the contribution of the body to communication. However, comparatively few studies have investigated (non-emotional) conversational facial signals, and very little is known about how they contribute to the communication of social actions. We therefore investigated how facial signals map onto the expression of two fundamental social actions in conversation: asking questions and providing responses. We studied the distribution and timing of 12 facial signals across 6778 questions and 4553 responses, annotated holistically in a corpus of 34 dyadic face-to-face Dutch conversations. Moreover, we analysed facial signal clustering to find out whether there are specific combinations of facial signals within questions or responses. Results showed a high proportion of facial signals, with a qualitatively different distribution in questions versus responses. Additionally, clusters of facial signals were identified. Most facial signals occurred early in the utterance and had earlier onsets in questions. Thus, facial signals may critically contribute to the communication of social actions in conversation by providing social action-specific visual information.
Identifiers
pubmed: 34439636
pii: brainsci11081017
doi: 10.3390/brainsci11081017
pmc: PMC8392358
Publication types
Journal Article
Languages
eng
Grants
Agency: European Research Council
ID: 773079
Country: International