Audiovisual synchrony perception in observing human motion to music.


Journal

PLoS One
ISSN: 1932-6203
Abbreviated title: PLoS One
Country: United States
NLM ID: 101285081

Publication information

Publication date:
2019
History:
received: 2019-04-17
accepted: 2019-08-09
entrez: 2019-08-28
pubmed: 2019-08-28
medline: 2020-03-04
Status: epublish

Abstract

To examine how individuals perceive synchrony between music and body motion, we investigated the characteristics of synchrony perception during observation of a Japanese Radio Calisthenics routine. Using the method of constant stimuli, we presented video clips of an individual performing an exercise routine, generated stimuli with a range of temporal shifts between the visual and auditory streams, and asked participants to make synchrony judgments. We then examined which movement-feature points coincided with music beats when participants perceived synchrony. We found that the extremities (e.g., hands and feet) reached the movement endpoint, or moved through the lowest position, at music beats associated with synchrony; movement onsets never coincided with music beats. To investigate whether visual information about these feature points was necessary for synchrony perception, we conducted a second experiment in which only limited portions of the video clips were presented. Participants judged synchrony consistently even when the video image did not contain the critical feature points, suggesting that a prediction mechanism contributes to synchrony perception. To interpret the role of these feature points in synchrony perception, we examined the temporal relationship between the motion of body parts and the ground reaction force (GRF) of the exercise performers, which reflects the total force acting on the performer. Interestingly, the vertical GRF showed local peaks consistently synchronized with music beats for most exercises, with timing closely correlated with that of the movement feature points. This result suggests that synchrony perception in humans is based on a global variable anticipated from visual information, rather than on feature points in the motion of individual body parts.
In summary, the present results indicate that synchrony perception during observation of human motion to music depends largely on spatiotemporal prediction of the performer's motion.

Identifiers

pubmed: 31454393
doi: 10.1371/journal.pone.0221584
pii: PONE-D-19-10942
pmc: PMC6711538

Databases

Dryad: 10.5061/dryad.sq25h1q

Publication types

Journal Article; Research Support, Non-U.S. Gov't

Languages

eng

Citation subsets

IM

Pagination

e0221584

Conflict of interest statement

The authors have declared that no competing interests exist.


Authors

Akira Takehana (A)

Department of Mechanical Engineering and Intelligent Systems, Graduate School of Informatics and Engineering, University of Electro-Communications, Chofu, Tokyo, Japan.

Tsukasa Uehara (T)

Department of Mechanical Engineering and Intelligent Systems, Graduate School of Informatics and Engineering, University of Electro-Communications, Chofu, Tokyo, Japan.

Yutaka Sakaguchi (Y)

Department of Mechanical Engineering and Intelligent Systems, Graduate School of Informatics and Engineering, University of Electro-Communications, Chofu, Tokyo, Japan.
Research Center for Performance Art Science, University of Electro-Communications, Chofu, Tokyo, Japan.


MeSH classifications