Marble melancholy: using crossmodal correspondences of shapes, materials, and music to predict music-induced emotions.

crossmodal correspondences; machine learning; materials; music-induced emotions; random forests; sensory interactions; shapes

Journal

Frontiers in psychology
ISSN: 1664-1078
Abbreviated title: Front Psychol
Country: Switzerland
NLM ID: 101550902

Publication information

Publication date:
2023
History:
received: 17 02 2023
accepted: 08 08 2023
medline: 18 09 2023
pubmed: 18 09 2023
entrez: 18 09 2023
Status: epublish

Abstract

Music is known to elicit strong emotions in listeners and, if primed appropriately, can give rise to specific and observable crossmodal correspondences. This study pursued two primary objectives: (1) identifying crossmodal correspondences emerging from music-induced emotions, and (2) examining the predictability of music-induced emotions based on the association of music with visual shapes and materials. To achieve this, 176 participants in an online experiment were asked to associate visual shapes and materials with the emotion classes of the Geneva Music-Induced Affect Checklist (GEMIAC) elicited by a set of musical excerpts. Our findings reveal that music-induced emotions and their underlying core affect (i.e., valence and arousal) can be accurately predicted from the joint information of the musical excerpt and the features of the visual shapes and materials associated with these emotions. Interestingly, valence and arousal induced by music are more predictable than discrete GEMIAC emotions. These results demonstrate the relevance of crossmodal correspondences for studying music-induced emotions. Potential applications of these findings in sensory interaction design, multisensory experiences and art, and digital and sensory marketing are briefly discussed.
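The prediction approach summarized above can be illustrated with a minimal, purely hypothetical sketch. The feature names, synthetic data, and labeling rule below are invented for illustration only; they do not reproduce the study's actual features, data, or model configuration. The sketch assumes only that a random forest (as named in the keywords) classifies an emotion-related label from shape/material features plus an excerpt identifier:

```python
# Hypothetical sketch: classifying a binary valence label from invented
# crossmodal features (shape angularity, shape symmetry, material roughness,
# material gloss) plus a musical-excerpt identifier, using a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic data: 200 trials, four shape/material features scaled 0-1,
# and an integer excerpt identifier (8 hypothetical excerpts).
n = 200
features = rng.random((n, 4))
excerpt_id = rng.integers(0, 8, size=n).reshape(-1, 1)
X = np.hstack([features, excerpt_id])

# Invented labeling rule for illustration: rounded, glossy, smooth
# associations -> positive valence (label 1).
y = ((1 - X[:, 0]) + X[:, 3] - X[:, 2] > 0.5).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
print(round(clf.score(X, y), 2))  # training accuracy on the synthetic data
```

In the same spirit, predicting continuous valence and arousal ratings would swap the classifier for a random-forest regressor; the study itself reports higher predictability for valence/arousal than for discrete GEMIAC emotion classes.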

Identifiers

pubmed: 37720661
doi: 10.3389/fpsyg.2023.1168258
pmc: PMC10502175

Publication types

Journal Article

Languages

eng

Pagination

1168258

Copyright information

Copyright © 2023 Mesz, Tedesco, Reinoso-Carvalho, Ter Horst, Molina, Gunn and Küssner.

Conflict of interest statement

Author GM was employed by the company Bayesian Solutions LLC, Charlotte, NC, United States. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Authors

Bruno Mesz (B)

Instituto de Investigación en Arte y Cultura, Universidad Nacional de Tres de Febrero, Sáenz Peña, Argentina.
Programa de Investigación STSEAS, EUdA, UNQ, Bernal, Argentina.

Sebastián Tedesco (S)

Instituto de Investigación en Arte y Cultura, Universidad Nacional de Tres de Febrero, Sáenz Peña, Argentina.

Felipe Reinoso-Carvalho (F)

Universidad de los Andes School of Management, Bogotá, Colombia.

Enrique Ter Horst (E)

Universidad de los Andes School of Management, Bogotá, Colombia.

German Molina (G)

Bayesian Solutions LLC, Charlotte, NC, United States.

Laura H Gunn (LH)

Department of Public Health Sciences, University of North Carolina at Charlotte, Charlotte, NC, United States.
School of Data Science, University of North Carolina at Charlotte, Charlotte, NC, United States.
Faculty of Medicine, Department of Primary Care and Public Health, Imperial College London, London, United Kingdom.

Mats B Küssner (MB)

Department of Musicology and Media Studies, Humboldt-Universität zu Berlin, Berlin, Germany.

MeSH classifications