Exploring sex differences in auditory saliency: the role of acoustic characteristics in bottom-up attention.
Acoustic characteristics
Auditory attention
Bottom-up attention
Saliency judgments
Sex differences
Journal
BMC Neuroscience
ISSN: 1471-2202
Abbreviated title: BMC Neurosci
Country: England
NLM ID: 100966986
Publication information
Publication date: 24 Oct 2024
History:
Received: 13 May 2024
Accepted: 16 Oct 2024
MEDLINE: 25 Oct 2024
PubMed: 25 Oct 2024
Entrez: 25 Oct 2024
Status: epublish
Abstract
BACKGROUND
Several cognitive functions are related to sex. However, the relationship between auditory attention and sex remains unclear. The present study aimed to explore sex differences in auditory saliency judgments, with a particular focus on bottom-up type auditory attention.
METHODS
Forty-five typical adults (mean age: 21.5 ± 0.64 years) with no known hearing deficits, intellectual abnormalities, or attention deficits were enrolled in this study. They were tasked with annotating attention-capturing sounds in five audio clips played in a soundproof room. Each stimulus contained ten salient sounds randomly placed within a 1-min natural soundscape. We conducted a generalized linear mixed model (GLMM) analysis with the number of responses to salient sounds as the dependent variable; sex as the between-subjects factor; the duration, maximum loudness, and maximum spectrum of each sound as within-subjects factors; and sound event and participant as random effects.
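As a rough illustration of the model structure described above (not the authors' code), the following Python sketch fits a comparable binomial GLMM with statsmodels. It assumes a hypothetical long-format table with one binary response per participant and sound event; the file name and column names are placeholders.

import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical long-format data: one row per participant x sound event,
# with a binary `response` column (1 = sound marked as salient, 0 = not marked).
df = pd.read_csv("saliency_responses.csv")

model = BinomialBayesMixedGLM.from_formula(
    "response ~ sex + duration + loudness + spectrum",  # fixed effects
    vc_formulas={
        "sound_event": "0 + C(sound_event)",   # random intercept per sound event
        "participant": "0 + C(participant)",   # random intercept per participant
    },
    data=df,
)
result = model.fit_vb()   # approximate (variational Bayes) fit
print(result.summary())

An equivalent lme4-style specification would be response ~ sex + duration + loudness + spectrum + (1 | sound_event) + (1 | participant) with a binomial family.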
RESULTS
No significant differences were found between the male and female groups in age, hearing threshold, intellectual function, or attention function (all p > 0.05). The analysis confirmed 77 distinct sound events, with individual response rates of 4.0-100%. In the GLMM analysis, the main effect of sex was not statistically significant (p = 0.458). Duration and spectrum had significant effects on response rate (p = 0.006 and p < 0.001, respectively), whereas the effect of loudness was not statistically significant (p = 0.13).
CONCLUSIONS
The results suggest that male and female listeners do not differ significantly in their auditory saliency judgments based on the acoustic characteristics studied. This finding challenges the notion of inherent sex differences in bottom-up auditory attention and highlights the need for further research to explore other potential factors or conditions under which such differences might emerge.
Identifiers
pubmed: 39448936
doi: 10.1186/s12868-024-00909-5
pii: 10.1186/s12868-024-00909-5
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Pagination
54
Copyright information
© 2024. The Author(s).