Blink-To-Live eye-based communication system for users with speech impairments.
Journal: Scientific Reports
ISSN: 2045-2322
Abbreviated title: Sci Rep
Country: England
NLM ID: 101563288
Publication information
Publication date: 17 May 2023
History:
received: 10 Nov 2022
accepted: 27 Apr 2023
medline: 19 May 2023
pubmed: 18 May 2023
entrez: 17 May 2023
Status: epublish
Abstract
Eye-based communication languages such as Blink-To-Speak play a key role in expressing the needs and emotions of patients with motor neuron disorders. Most existing eye-tracking systems are complex and not affordable in low-income countries. Blink-To-Live is an eye-tracking system based on a modified Blink-To-Speak language and computer vision for patients with speech impairments. A mobile phone camera tracks the patient's eyes by sending real-time video frames to computer vision modules for facial landmark detection, eye identification, and tracking. The Blink-To-Live eye-based communication language defines four key alphabets: Left, Right, Up, and Blink. These eye gestures encode more than 60 daily-life commands, each expressed as a sequence of three eye movement states. Once the eye-gesture-encoded sentences are generated, the translation module displays the phrases in the patient's native language on the phone screen, and a synthesized voice reads them aloud. A prototype of the Blink-To-Live system was evaluated with unimpaired participants of different demographic characteristics. Unlike sensor-based eye-tracking systems, Blink-To-Live is simple, flexible, and cost-efficient, with no dependency on specific software or hardware requirements. The software and its source code are available from the GitHub repository ( https://github.com/ZW01f/Blink-To-Live ).
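The encoding scheme described above, four eye states combined into sequences of three, yields 4³ = 64 possible codes, which is consistent with the "more than 60 daily-life commands" claim. A minimal sketch of such a gesture decoder is shown below; the phrase table entries are invented for illustration (the actual mapping is defined in the project's GitHub repository), and only the alphabet and sequence length come from the abstract.

```python
# Illustrative sketch of a Blink-To-Live-style gesture decoder.
# Assumption: the four eye states and the three-state sequence length come
# from the abstract; the specific phrases below are hypothetical examples.

GESTURE_ALPHABET = {"Left", "Right", "Up", "Blink"}

# Hypothetical phrase table: 4^3 = 64 possible three-state codes,
# matching "more than 60 daily life commands".
PHRASE_TABLE = {
    ("Blink", "Blink", "Blink"): "I need help",
    ("Left", "Up", "Right"): "I am thirsty",
    ("Up", "Up", "Blink"): "Turn off the light",
}

def decode(sequence):
    """Translate a three-state eye-gesture sequence into a phrase."""
    if len(sequence) != 3 or any(s not in GESTURE_ALPHABET for s in sequence):
        raise ValueError("expected a sequence of exactly three valid eye states")
    return PHRASE_TABLE.get(tuple(sequence), "<unknown command>")

print(decode(["Left", "Up", "Right"]))  # I am thirsty
```

In a full pipeline, each state in the sequence would be produced by the computer vision modules (facial landmark detection followed by eye tracking) before being passed to a decoder like this one.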
Identifiers
pubmed: 37198193
doi: 10.1038/s41598-023-34310-9
pii: 10.1038/s41598-023-34310-9
pmc: PMC10192441
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination
7961
Copyright information
© 2023. The Author(s).