Quantification of Oculomotor Responses and Accommodation through Instrumentation and Analysis Toolboxes.
Journal
Journal of visualized experiments : JoVE
ISSN: 1940-087X
Abbreviated title: J Vis Exp
Country: United States
NLM ID: 101313252
Publication information
Publication date: 2023-03-03
History:
pmc-release: 2024-03-03
entrez: 2023-03-20
pubmed: 2023-03-21
medline: 2023-03-23
Status: epublish
Abstract
Through the purposeful stimulation and recording of eye movements, the fundamental characteristics of their underlying neural mechanisms can be observed. VisualEyes2020 (VE2020) was developed to address the lack of customizable, software-based visual stimulation available to researchers that does not rely on motors or actuators within a traditional haploscope. This new instrument and methodology were developed for a novel haploscope configuration utilizing both eye-tracking and autorefractor systems. Analysis software that enables the synchronized analysis of eye movements and accommodative responses provides vision researchers and clinicians with a reproducible environment and a shareable tool. The Vision and Neural Engineering Laboratory's (VNEL) Eye Movement Analysis Program (VEMAP) was established to process recordings produced by VE2020's eye trackers, while the Accommodative Movement Analysis Program (AMAP) was created to process the recording outputs from the corresponding autorefractor system. The VNEL studies three primary stimuli: accommodation (blur-driven changes in the convexity of the intraocular lens), vergence (inward, convergent rotation and outward, divergent rotation of the eyes), and saccades (conjugate eye movements). The VEMAP and AMAP utilize similar data flow processes, with manual operator interactions and interventions where necessary; however, these analysis platforms advance the establishment of an objective software suite that minimizes operator reliance. The graphical interface and its corresponding algorithms allow a broad range of visual experiments to be conducted with minimal prior coding experience required of the operator(s).
Identifiers
pubmed: 36939267
doi: 10.3791/64808
pmc: PMC10375222
mid: NIHMS1916539
Publication types
Journal Article
Video-Audio Media
Research Support, N.I.H., Extramural
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Grants
Agency: NEI NIH HHS
ID: R01 EY023261
Country: United States
References
Optom Vis Sci. 2017 Jan;94(1):74-88
pubmed: 27464574
Comput Methods Programs Biomed. 1985 Nov;21(2):81-8
pubmed: 3853489
Optometry. 2003 Jan;74(1):25-34
pubmed: 12539890
Sci Rep. 2015 Jan 23;5:7976
pubmed: 25613165
J Neurotrauma. 2019 Jul 15;36(14):2200-2212
pubmed: 30829134
Sci Rep. 2021 Mar 22;11(1):6545
pubmed: 33753864
Ophthalmic Physiol Opt. 2019 Jul;39(4):253-259
pubmed: 31236979
Cochrane Database Syst Rev. 2020 Dec 2;12:CD006768
pubmed: 33263359
Vision Res. 2010 Aug 6;50(17):1728-39
pubmed: 20561972
Annu Int Conf IEEE Eng Med Biol Soc. 2019 Jul;2019:104-109
pubmed: 31945855
Ophthalmic Epidemiol. 2020 Feb;27(1):52-72
pubmed: 31640452
Optom Vis Sci. 1999 Apr;76(4):221-8
pubmed: 10333184
Am J Optom Arch Am Acad Optom. 1965 Jan;42:3-8
pubmed: 14253905
J Optom. 2019 Jan - Mar;12(1):22-29
pubmed: 29580938
Exp Brain Res. 2011 Jul;212(2):267-78
pubmed: 21594645
Behav Res Methods. 2017 Dec;49(6):2146-2162
pubmed: 28130727
J Vis Exp. 2011 Mar 25;(49):
pubmed: 21490568
J Vis. 2018 Jun 1;18(6):2
pubmed: 30029212
Arch Ophthalmol. 2008 Oct;126(10):1336-49
pubmed: 18852411
Vision Res. 2021 Aug;185:58-67
pubmed: 33895648