Keypoint-MoSeq: parsing behavior by linking point tracking to pose dynamics.
Journal
Nature Methods
ISSN: 1548-7105
Abbreviated title: Nat Methods
Country: United States
NLM ID: 101215604
Publication information
Publication date: Jul 2024
History:
Received: 5 April 2023
Accepted: 22 May 2024
MEDLINE: 13 July 2024
PubMed: 13 July 2024
Entrez: 12 July 2024
Status: ppublish
Abstract
Keypoint tracking algorithms can flexibly quantify animal movement from videos obtained in a wide variety of settings. However, it remains unclear how to parse continuous keypoint data into discrete actions. This challenge is particularly acute because keypoint data are susceptible to high-frequency jitter that clustering algorithms can mistake for transitions between actions. Here we present keypoint-MoSeq, a machine learning-based platform for identifying behavioral modules ('syllables') from keypoint data without human supervision. Keypoint-MoSeq uses a generative model to distinguish keypoint noise from behavior, enabling it to identify syllables whose boundaries correspond to natural sub-second discontinuities in pose dynamics. Keypoint-MoSeq outperforms commonly used alternative clustering methods at identifying these transitions, at capturing correlations between neural activity and behavior and at classifying either solitary or social behaviors in accordance with human annotations. Keypoint-MoSeq also works in multiple species and generalizes beyond the syllable timescale, identifying fast sniff-aligned movements in mice and a spectrum of oscillatory behaviors in fruit flies. Keypoint-MoSeq, therefore, renders accessible the modular structure of behavior through standard video recordings.
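The abstract's central claim, that a generative model can separate tracking noise from behavior, can be made concrete with a schematic sketch. The Python block below samples from the model class the paper describes: a sticky hidden Markov model over syllables, each with autoregressive pose dynamics, plus an explicit keypoint-noise layer. All variable names and parameter values here are illustrative assumptions, not the library's API; the actual model and inference code are in the dattalab/keypoint-moseq repository cited in the references.

```python
# Schematic of the generative model class described in the abstract:
# sticky HMM over syllables -> per-syllable autoregressive pose dynamics
# -> observed keypoints = latent pose + high-frequency tracking jitter.
# Illustrative sketch only (numpy, made-up parameters).
import numpy as np

rng = np.random.default_rng(0)
K, D, T = 5, 8, 1000      # syllables, pose-state dimensions, video frames
kappa = 50.0              # "stickiness" prior: suppresses spurious fast switching

# Sticky transition matrix: extra probability mass on self-transitions gives
# syllables their characteristic sub-second (rather than per-frame) durations.
P = np.stack([rng.dirichlet(np.ones(K) + kappa * np.eye(K)[k]) for k in range(K)])

# Per-syllable AR(1) pose dynamics: x_t = A_k x_{t-1} + b_k + dynamics noise.
A = np.stack([0.95 * np.eye(D) + 0.01 * rng.standard_normal((D, D)) for _ in range(K)])
b = 0.1 * rng.standard_normal((K, D))

z = np.zeros(T, dtype=int)   # latent syllable labels
x = np.zeros((T, D))         # latent pose trajectory
y = np.zeros((T, D))         # observed keypoints
for t in range(1, T):
    z[t] = rng.choice(K, p=P[z[t - 1]])
    x[t] = A[z[t]] @ x[t - 1] + b[z[t]] + 0.05 * rng.standard_normal(D)
    y[t] = x[t] + 0.2 * rng.standard_normal(D)   # keypoint jitter on top of pose

# Fitting inverts this process: syllable boundaries are inferred from y while the
# noise term absorbs jitter that clustering methods would mistake for transitions.
```

Under this view, a high-frequency jump in the observed keypoints that the dynamics can explain as noise does not force a syllable switch, which is the property the abstract credits for keypoint-MoSeq's advantage over clustering-based methods.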
Identifiers
pubmed: 38997595
doi: 10.1038/s41592-024-02318-2
pii: 10.1038/s41592-024-02318-2
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Pagination
1329-1339
Grants
Agency: U.S. Department of Health & Human Services | National Institutes of Health (NIH)
ID: RF1AG073625
Agency: U.S. Department of Health & Human Services | National Institutes of Health (NIH)
ID: R01NS114020
Agency: U.S. Department of Health & Human Services | National Institutes of Health (NIH)
ID: U24NS109520
Agency: U.S. Department of Health & Human Services | National Institutes of Health (NIH)
ID: F31NS113385
Agency: U.S. Department of Health & Human Services | National Institutes of Health (NIH)
ID: F31NS122155
Copyright information
© 2024. The Author(s).
References
Tinbergen, N. The Study of Instinct (Clarendon Press, 1951).
Dawkins, R. in Growing Points in Ethology (eds Bateson, P. P. G. & Hinde, R. A.) Ch. 1 (Cambridge University Press, 1976).
Baerends, G. P. The functional organization of behaviour. Anim. Behav. 24, 726–738 (1976).
doi: 10.1016/S0003-3472(76)80002-4
Pereira, T. D. et al. SLEAP: a deep learning system for multi-animal pose tracking. Nat. Methods 19, 486–495 (2022).
doi: 10.1038/s41592-022-01426-1
pubmed: 35379947
pmcid: 9007740
Mathis, A. et al. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat. Neurosci. 21, 1281–1289 (2018).
doi: 10.1038/s41593-018-0209-y
pubmed: 30127430
Sun, J. J. et al. Self-supervised keypoint discovery in behavioral videos. Proc. IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recognit. 2022, 2161–2170 (2022).
pubmed: 36628357
pmcid: 9829414
Graving, J. M. et al. DeepPoseKit, a software toolkit for fast and robust animal pose estimation using deep learning. eLife 8, e47994 (2019).
doi: 10.7554/eLife.47994
pubmed: 31570119
pmcid: 6897514
Mathis, A., Schneider, S., Lauer, J. & Mathis, M. W. A primer on motion capture with deep learning: principles, pitfalls, and perspectives. Neuron 108, 44–65 (2020).
doi: 10.1016/j.neuron.2020.09.017
pubmed: 33058765
Datta, S. R., Anderson, D. J., Branson, K., Perona, P. & Leifer, A. Computational neuroethology: a call to action. Neuron 104, 11–24 (2019).
doi: 10.1016/j.neuron.2019.09.038
pubmed: 31600508
pmcid: 6981239
Anderson, D. J. & Perona, P. Toward a science of computational ethology. Neuron 84, 18–31 (2014).
doi: 10.1016/j.neuron.2014.09.005
pubmed: 25277452
Pereira, T. D., Shaevitz, J. W. & Murthy, M. Quantifying behavior to understand the brain. Nat. Neurosci. 23, 1537–1549 (2020).
doi: 10.1038/s41593-020-00734-z
pubmed: 33169033
pmcid: 7780298
Hsu, A. I. & Yttri, E. A. B-SOiD, an open-source unsupervised algorithm for identification and fast prediction of behaviors. Nat. Commun. 12, 5188 (2021).
doi: 10.1038/s41467-021-25420-x
pubmed: 34465784
pmcid: 8408193
Luxem, K. et al. Identifying behavioral structure from deep variational embeddings of animal motion. Commun. Biol. 5, 1267 (2022).
doi: 10.1038/s42003-022-04080-7
pubmed: 36400882
pmcid: 9674640
Marques, J. C., Lackner, S., Félix, R. & Orger, M. B. Structure of the zebrafish locomotor repertoire revealed with unsupervised behavioral clustering. Curr. Biol. 28, 181–195 (2018).
doi: 10.1016/j.cub.2017.12.002
pubmed: 29307558
Todd, J. G., Kain, J. S. & de Bivort, B. L. Systematic exploration of unsupervised methods for mapping behavior. Phys. Biol. 14, 015002 (2017).
doi: 10.1088/1478-3975/14/1/015002
pubmed: 28166059
Wiltschko, A. B. et al. Mapping sub-second structure in mouse behavior. Neuron 88, 1121–1135 (2015).
doi: 10.1016/j.neuron.2015.11.031
pubmed: 26687221
pmcid: 4708087
Berman, G. J., Choi, D. M., Bialek, W. & Shaevitz, J. W. Mapping the stereotyped behaviour of freely moving fruit flies. J. R. Soc. Interface https://doi.org/10.1098/rsif.2014.0672 (2014).
Batty, E. et al. BehaveNet: nonlinear embedding and Bayesian neural decoding of behavioral videos. in Advances in Neural Information Processing Systems 32 (eds Larochelle, H. et al.) 15706–15717 (Curran Associates, 2019).
Costacurta, J. C. et al. Distinguishing discrete and continuous behavioral variability using warped autoregressive HMMs. in Advances in Neural Information Processing Systems 35 (eds Koyejo, S. et al.) 23838–23850 (Curran Associates, 2022).
Jia, Y. et al. Selfee, self-supervised features extraction of animal behaviors. eLife 11, e76218 (2022).
doi: 10.7554/eLife.76218
pubmed: 35708244
pmcid: 9296132
Findley, T. M. et al. Sniff-synchronized, gradient-guided olfactory search by freely moving mice. eLife 10, e58523 (2021).
doi: 10.7554/eLife.58523
pubmed: 33942713
pmcid: 8169121
Markowitz, J. E. et al. Spontaneous behaviour is structured by reinforcement without explicit reward. Nature 614, 108–117 (2023).
doi: 10.1038/s41586-022-05611-2
pubmed: 36653449
pmcid: 9892006
Markowitz, J. E. et al. The striatum organizes 3D behavior via moment-to-moment action selection. Cell 174, 44–58 (2018).
doi: 10.1016/j.cell.2018.04.019
pubmed: 29779950
pmcid: 6026065
Wiltschko, A. B. et al. Revealing the structure of pharmacobehavioral space through motion sequencing. Nat. Neurosci. https://doi.org/10.1038/s41593-020-00706-3 (2020).
doi: 10.1038/s41593-020-00706-3
pubmed: 32958923
pmcid: 7606807
Lin, S. et al. Characterizing the structure of mouse behavior using motion sequencing. Preprint at https://arxiv.org/abs/2211.08497 (2022).
Wu, A. et al. Deep Graph Pose: a semi-supervised deep graphical model for improved animal pose tracking. in Proceedings of the 34th International Conference on Neural Information Processing Systems (Curran Associates, 2020).
Murphy, K. P. Machine Learning: A Probabilistic Perspective (MIT Press, 2012).
Linderman, S. et al. in Proceedings of the 20th International Conference on Artificial Intelligence and Statistics Vol. 54 (eds Singh, A. et al.) 914–922 (PMLR, 2017).
Zhang, L., Dunn, T., Marshall, J., Ölveczky, B. & Linderman, S. in Proceedings of the 24th International Conference on Artificial Intelligence and Statistics Vol. 130 (eds Banerjee, A. & Fukumizu, K.) 2800–2808 (PMLR, 2021).
Klibaite, U. et al. Deep phenotyping reveals movement phenotypes in mouse neurodevelopmental models. Mol. Autism 13, 12 (2022).
doi: 10.1186/s13229-022-00492-8
pubmed: 35279205
pmcid: 8917660
Bohnslav, J. P. et al. DeepEthogram, a machine learning pipeline for supervised behavior classification from raw pixels. eLife 10, e63377 (2021).
doi: 10.7554/eLife.63377
pubmed: 34473051
pmcid: 8455138
Sun, J. J. et al. Caltech mouse social interactions (CalMS21) dataset. https://doi.org/10.22002/D1.1991 (2021).
Ye, S., Mathis, A. & Mathis, M. W. Panoptic animal pose estimators are zero-shot performers. Preprint at https://arxiv.org/abs/2203.07436 (2022).
Marshall, J. D. et al. Continuous whole-body 3D kinematic recordings across the rodent behavioral repertoire. Neuron 109, 420–437 (2021).
doi: 10.1016/j.neuron.2020.11.016
pubmed: 33340448
Moore, J. D. et al. Hierarchy of orofacial rhythms revealed through whisking and breathing. Nature 497, 205–210 (2013).
doi: 10.1038/nature12076
pubmed: 23624373
pmcid: 4159559
Kurnikova, A., Moore, J. D., Liao, S.-M., Deschênes, M. & Kleinfeld, D. Coordination of orofacial motor actions into exploratory behavior by rat. Curr. Biol. 27, 688–696 (2017).
doi: 10.1016/j.cub.2017.01.013
pubmed: 28216320
pmcid: 5653531
McAfee, S. S. et al. Minimally invasive highly precise monitoring of respiratory rhythm in the mouse using an epithelial temperature probe. J. Neurosci. Methods 263, 89–94 (2016).
doi: 10.1016/j.jneumeth.2016.02.007
pubmed: 26868731
pmcid: 4801653
DeAngelis, B. D., Zavatone-Veth, J. A. & Clark, D. A. The manifold structure of limb coordination in walking Drosophila. eLife https://doi.org/10.7554/eLife.46409 (2019).
Pereira, T. D. et al. Fast animal pose estimation using deep neural networks. Nat. Methods 16, 117–125 (2019).
doi: 10.1038/s41592-018-0234-5
pubmed: 30573820
Biderman, D. et al. Lightning Pose: improved animal pose estimation via semi-supervised learning, Bayesian ensembling, and cloud-native open-source tools. Preprint at bioRxiv https://doi.org/10.1101/2023.04.28.538703 (2023).
Batty, E. et al. in Advances in Neural Information Processing Systems Vol. 32 (eds Wallach, H. et al.) (Curran Associates, 2019).
Berman, G. J., Bialek, W. & Shaevitz, J. W. Predictability and hierarchy in Drosophila behavior. Proc. Natl Acad. Sci. USA 113, 11943–11948 (2016).
doi: 10.1073/pnas.1607601113
pubmed: 27702892
pmcid: 5081631
Berman, G. J. Measuring behavior across scales. BMC Biol. 16, 23 (2018).
doi: 10.1186/s12915-018-0494-7
pubmed: 29475451
pmcid: 5824583
Zhou, Z. et al. UNet++: a nested U-Net architecture for medical image segmentation. in Deep Learning in Medical Image Analysis and Multimodal Learning for Clinical Decision Support (eds Stoyanov, D. et al.) Lecture Notes in Computer Science Vol. 11045, 3–11 (Springer, 2018).
doi: 10.1007/978-3-030-00889-5_1
Sun, K., Xiao, B., Liu, D. & Wang, J. Deep high-resolution representation learning for human pose estimation. in 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 5686–5696 (IEEE, 2019).
Nath, T. et al. Using DeepLabCut for 3D markerless pose estimation across species and behaviors. Nat. Protoc. 14, 2152–2176 (2019).
doi: 10.1038/s41596-019-0176-0
pubmed: 31227823
Ye, S. et al. SuperAnimal pretrained pose estimation models for behavioral analysis. Preprint at https://arxiv.org/abs/2203.07436 (2023).
Ackerson, G. A. & Fu, K.-S. On state estimation in switching environments. IEEE Trans. Autom. Control. 15, 10–17 (1970).
doi: 10.1109/TAC.1970.1099359
Fox, E. B., Sudderth, E. B., Jordan, M. I. & Willsky, A. S. A sticky HDP-HMM with application to speaker diarization. Ann. Appl. Stat. 5, 1020–1056 (2011).
Andreella, A. & Finos, L. Procrustes analysis for high-dimensional data. Psychometrika 87, 1422–1438 (2022).
doi: 10.1007/s11336-022-09859-5
pubmed: 35583747
pmcid: 9636303
Marshall, J. D. et al. Rat 7M. figshare https://doi.org/10.6084/m9.figshare.c.5295370.v3 (2021).
Weinreb, C. et al. Keypoint-MoSeq: parsing behavior by linking point tracking to pose dynamics. Zenodo https://doi.org/10.5281/zenodo.10636983 (2024).
Weinreb, C. et al. dattalab/keypoint-moseq: Keypoint MoSeq 0.4.3. Zenodo https://doi.org/10.5281/zenodo.10524840 (2024).
Weinreb, C. et al. dattalab/jax-moseq: JAX MoSeq 0.2.1. Zenodo https://doi.org/10.5281/zenodo.10403244 (2023).