Advances in the development and application of non-contact intraoperative image access systems.
Keywords: Gesture control; Human–computer interaction; Non-contact interaction
Journal
BioMedical Engineering OnLine
ISSN: 1475-925X
Abbreviated title: Biomed Eng Online
Country: England
NLM ID: 101147518
Publication information
Publication date: 30 Oct 2024
History:
received: 14 Jul 2024
accepted: 25 Oct 2024
medline: 31 Oct 2024
pubmed: 31 Oct 2024
entrez: 31 Oct 2024
Status: epublish
Abstract
This article provides an overview of recent progress in achieving non-contact intraoperative image control through vision and sensor technologies in operating room (OR) environments. It also discusses approaches to improving and optimizing the associated technologies, and surveys key challenges and directions for future development aimed at improving the use of non-contact intraoperative image access systems.
Identifiers
pubmed: 39478601
doi: 10.1186/s12938-024-01304-1
pii: 10.1186/s12938-024-01304-1
Publication types
Journal Article
Review
Languages
eng
Citation subsets
IM
Pagination
108
Grants
Agency: Hainan Provincial Department of Science and Technology
ID: ZDYF2021GXJS004
Copyright information
© 2024. The Author(s).
References
Nishikawa A, Hosoi T, Koara K, Negoro D, Hikita A, Asano S, et al. FAce MOUSe: a novel human-machine interface for controlling the position of a laparoscope. IEEE Trans Robot Automat. 2003;19:825–41. https://doi.org/10.1109/TRA.2003.817093 .
doi: 10.1109/TRA.2003.817093
Wang H, Ru B, Miao X, Gao Q, Habib M, Liu L, et al. MEMS devices-based hand gesture recognition via wearable computing. Micromachines. 2023;14:947. https://doi.org/10.3390/mi14050947 .
doi: 10.3390/mi14050947
Wachs JP, Stern HI, Edan Y, Gillam M, Handler J, Feied C, et al. A gesture-based tool for sterile browsing of radiology images. J Am Med Inform Assoc. 2008;15:321–3. https://doi.org/10.1197/jamia.M2410 .
doi: 10.1197/jamia.M2410
Oshiro Y, Ohuchida K, Okada T, Hashizume M, Ohkohchi N. Novel imaging using a touchless display for computer-assisted hepato-biliary surgery. Surg Today. 2017;47:1512–8. https://doi.org/10.1007/s00595-017-1541-7 .
doi: 10.1007/s00595-017-1541-7
Kurillo G, Hemingway E, Cheng M-L, Cheng L. Evaluating the accuracy of the Azure Kinect and Kinect v2. Sensors. 2022;22:2469. https://doi.org/10.3390/s22072469 .
doi: 10.3390/s22072469
Ruppert GCS, Reis LO, Amorim PHJ, De Moraes TF, Da Silva JVL. Touchless gesture user interface for interactive image visualization in urological surgery. World J Urol. 2012;30:687–91. https://doi.org/10.1007/s00345-012-0879-0 .
doi: 10.1007/s00345-012-0879-0
Tan JH, Chao C, Zawaideh M, Roberts AC, Kinney TB. Informatics in radiology: developing a touchless user interface for intraoperative image control during interventional radiology procedures. Radiographics. 2013;33:E61-70. https://doi.org/10.1148/rg.332125101 .
doi: 10.1148/rg.332125101
Yoshimitsu K, Muragaki Y, Maruyama T, Yamato M, Iseki H. Development and initial clinical testing of “Opect”: an innovative device for fully intangible control of the intraoperative image-displaying monitor by the surgeon. Oper Neurosurg. 2014;10:46–50. https://doi.org/10.1227/NEU.0000000000000214 .
doi: 10.1227/NEU.0000000000000214
Gobhiran A, Wongjunda D, Kiatsoontorn K, Charoenpong T. Hand movement-controlled image viewer in an operating room by using hand movement pattern code. Wirel Pers Commun. 2022;123:103–21. https://doi.org/10.1007/s11277-021-09121-8 .
doi: 10.1007/s11277-021-09121-8
Liu J, Tateyama T, Iwamoto Y, Chen Y-W. A preliminary study of Kinect-based real-time hand gesture interaction systems for touchless visualizations of hepatic structures in surgery. Med Imag Infor Sci. 2019;36:128–35. https://doi.org/10.11318/mii.36.128 .
doi: 10.11318/mii.36.128
Glinkowski WM, Miścior T, Sitnik R. Remote, touchless interaction with medical images and telementoring in the operating room using a Kinect-based application—a usability study. Appl Sci. 2023;13:11982. https://doi.org/10.3390/app132111982 .
doi: 10.3390/app132111982
Weichert F, Bachmann D, Rudak B, Fisseler D. Analysis of the accuracy and robustness of the leap motion controller. Sensors. 2013;13:6380–93. https://doi.org/10.3390/s130506380 .
doi: 10.3390/s130506380
Vysocký A, Grushko S, Oščádal P, Kot T, Babjak J, Jánoš R, et al. Analysis of precision and stability of hand tracking with leap motion sensor. Sensors. 2020;20:4088. https://doi.org/10.3390/s20154088 .
doi: 10.3390/s20154088
Feng Y, Uchidiuno UA, Zahiri HR, George I, Park AE, Mentis H. Comparison of kinect and leap motion for intraoperative image interaction. Surg Innov. 2021;28:33–40. https://doi.org/10.1177/1553350620947206 .
doi: 10.1177/1553350620947206
Rosa GM, Elizondo ML. Use of a gesture user interface as a touchless image navigation system in dental surgery: case series report. Imaging Sci Dent. 2014;44:155. https://doi.org/10.5624/isd.2014.44.2.155 .
doi: 10.5624/isd.2014.44.2.155
Chiang P-Y, Chen C-C, Hsia C-H. A touchless interaction interface for observing medical imaging. J Vis Commun Image Represent. 2019;58:363–73. https://doi.org/10.1016/j.jvcir.2018.12.004 .
doi: 10.1016/j.jvcir.2018.12.004
Hatscher B, Mewes A, Pannicke E, Kägebein U, Wacker F, Hansen C, et al. Touchless scanner control to support MRI-guided interventions. Int J CARS. 2020;15:545–53. https://doi.org/10.1007/s11548-019-02058-1 .
doi: 10.1007/s11548-019-02058-1
Zhang X, Wang J, Dai X, Shen S, Chen X. A non-contact interactive system for multimodal surgical robots based on LeapMotion and visual tags. Front Neurosci. 2023;17:1287053. https://doi.org/10.3389/fnins.2023.1287053 .
doi: 10.3389/fnins.2023.1287053
Sa-nguannarm P, Charoenpong T, Chianrabutra C, Kiatsoontorn K. A Method of 3D Hand Movement Recognition by a Leap Motion Sensor for Controlling Medical Image in an Operating Room. 2019 First International Symposium on Instrumentation, Control, Artificial Intelligence, and Robotics (ICA-SYMP), Bangkok, Thailand: IEEE; 2019; 17–20. https://doi.org/10.1109/ICA-SYMP.2019.8645985 .
Cho Y, Lee A, Park J, Ko B, Kim N. Enhancement of gesture recognition for contactless interface using a personalized classifier in the operating room. Comput Methods Programs Biomed. 2018;161:39–44. https://doi.org/10.1016/j.cmpb.2018.04.003 .
doi: 10.1016/j.cmpb.2018.04.003
Ameur S, Ben Khalifa A, Bouhlel MS. Hand-Gesture-Based Touchless Exploration of Medical Images with Leap Motion Controller. 2020 17th International Multi-Conference on Systems, Signals & Devices (SSD), Monastir, Tunisia: IEEE; 2020; 6–11. https://doi.org/10.1109/SSD49366.2020.9364244 .
Cronin S, Doherty G. Touchless computer interfaces in hospitals: a review. Health Informatics J. 2019;25:1325–42. https://doi.org/10.1177/1460458217748342 .
doi: 10.1177/1460458217748342
Xue Z. Foresight interaction: from voice and gesture design to multimode convergence. Beijing: Publishing House of Electronics Industry; 2022.
di Tommaso L, Aubry S, Godard J, Katranji H, Pauchot J. [A new human–machine interface in neurosurgery: the Leap Motion®. Technical note on a new contactless human–machine interface]. Neurochirurgie. 2016;62:178–81. https://doi.org/10.1016/j.neuchi.2016.01.006 .
doi: 10.1016/j.neuchi.2016.01.006
Mewes A, Hensen B, Wacker F, Hansen C. Touchless interaction with software in interventional radiology and surgery: a systematic literature review. Int J CARS. 2017;12:291–305. https://doi.org/10.1007/s11548-016-1480-6 .
doi: 10.1007/s11548-016-1480-6
Ahmed S, Kallu KD, Ahmed S, Cho SH. Hand gestures recognition using radar sensors for human-computer-interaction: a review. Remote Sens. 2021;13:527. https://doi.org/10.3390/rs13030527 .
doi: 10.3390/rs13030527
Jalaliniya S, Smith J, Sousa M, Büthe L, Pederson T. Touch-less interaction with medical images using hand & foot gestures. Proceedings of the 2013 ACM conference on Pervasive and ubiquitous computing adjunct publication, Zurich Switzerland: ACM. 2013; 1265–74. https://doi.org/10.1145/2494091.2497332 .
Bigdelou A, Schwarz L, Navab N. An adaptive solution for intra-operative gesture-based human-machine interaction. Proceedings of the 2012 ACM international conference on Intelligent User Interfaces, Lisbon Portugal: ACM. 2012; 75–84. https://doi.org/10.1145/2166966.2166981 .
Esfandiari H, Troxler P, Hodel S, Suter D, Farshad M, Collaboration Group, et al. Introducing a brain-computer interface to facilitate intraoperative medical imaging control—a feasibility study. BMC Musculoskelet Disord. 2022;23:701. https://doi.org/10.1186/s12891-022-05384-9 .
doi: 10.1186/s12891-022-05384-9
Hettig J, Mewes A, Riabikin O, Skalej M, Preim B, Hansen C. Exploration of 3D Medical Image Data for Interventional Radiology using Myoelectric Gesture Control. Eurographics Workshop on Visual Computing for Biomedicine 2015.
Sánchez-Margallo FM, Sánchez-Margallo JA, Moyano-Cuevas JL, Pérez EM, Maestre J. Use of natural user interfaces for image navigation during laparoscopic surgery: initial experience. Minim Invasive Ther Allied Technol. 2017;26:253–61. https://doi.org/10.1080/13645706.2017.1304964 .
doi: 10.1080/13645706.2017.1304964
Liu Y, Peng X, Tan Y, Oyemakinde TT, Wang M, Li G, et al. A novel unsupervised dynamic feature domain adaptation strategy for cross-individual myoelectric gesture recognition. J Neural Eng. 2023;20:066044. https://doi.org/10.1088/1741-2552/ad184f .
doi: 10.1088/1741-2552/ad184f
Xu H, Xiong A. Advances and disturbances in sEMG-based intentions and movements recognition: a review. IEEE Sens J. 2021;21:13019–28. https://doi.org/10.1109/JSEN.2021.3068521 .
doi: 10.1109/JSEN.2021.3068521
Ha M-K, Phan T-L, Nguyen D, Quan N, Ha-Phan N-Q, Ching C, et al. Comparative analysis of audio processing techniques on doppler radar signature of human walking motion using CNN models. Sensors. 2023;23:8743. https://doi.org/10.3390/s23218743 .
doi: 10.3390/s23218743
Miller E, Li Z, Mentis H, Park A, Zhu T, Banerjee N. RadSense: enabling one hand and no hands interaction for sterile manipulation of medical images using Doppler radar. Smart Health. 2020;15:100089. https://doi.org/10.1016/j.smhl.2019.100089 .
doi: 10.1016/j.smhl.2019.100089
Yang K, Kim M, Jung Y, Lee S. Hand gesture recognition using FSK radar sensors. Sensors. 2024;24:349. https://doi.org/10.3390/s24020349 .
doi: 10.3390/s24020349
Stetco C, Muhlbacher-Karrer S, Lucchi M, Weyrer M, Faller L-M, Zangl H. Gesture-based Contactless Control of Mobile Manipulators using Capacitive Sensing. 2020 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Dubrovnik, Croatia: IEEE. 2020; 1–6. https://doi.org/10.1109/I2MTC43012.2020.9128751 .
NZ Technologies Inc. HoverTap MD.
NZ Technologies Inc. TIPSO AirPad.
Yang Y, Gao Y, Liu K, He Z, Cao L. Contactless human–computer interaction system based on three-dimensional holographic display and gesture recognition. Appl Phys B. 2023;129:192. https://doi.org/10.1007/s00340-023-08128-2 .
doi: 10.1007/s00340-023-08128-2
Hui WS, Huang W, Hu J, Tao K, Peng Y. A new precise contactless medical image multimodal interaction system for surgical practice. IEEE Access. 2020;8:121811–20. https://doi.org/10.1109/ACCESS.2019.2946404 .
doi: 10.1109/ACCESS.2019.2946404
Perrakis A, Hohenberger W, Horbach T. Integrated operation systems and voice recognition in minimally invasive surgery: comparison of two systems. Surg Endosc. 2013;27:575–9. https://doi.org/10.1007/s00464-012-2488-9 .
doi: 10.1007/s00464-012-2488-9
Argoty JA, Figueroa P. Design and development of a prototype of an interactive hospital room with Kinect. Proceedings of the XV International Conference on Human Computer Interaction, Puerto de La Cruz Tenerife Spain: ACM. 2014; 1–4. https://doi.org/10.1145/2662253.2662290 .
Ebert LC, Hatch G, Ampanozi G, Thali MJ, Ross S. You can’t touch this: touch-free navigation through radiological images. Surg Innov. 2012;19:301–7. https://doi.org/10.1177/1553350611425508 .
doi: 10.1177/1553350611425508
Schulte A, Suarez-Ibarrola R, Wegen D, Pohlmann P-F, Petersen E, Miernik A. Automatic speech recognition in the operating room—an essential contemporary tool or a redundant gadget? A survey evaluation among physicians in form of a qualitative study. Ann Med Surg. 2020;59:81–5. https://doi.org/10.1016/j.amsu.2020.09.015 .
doi: 10.1016/j.amsu.2020.09.015
Nishihori M, Izumi T, Nagano Y, Sato M, Tsukada T, Kropp AE, et al. Development and clinical evaluation of a contactless operating interface for three-dimensional image-guided navigation for endovascular neurosurgery. Int J CARS. 2021;16:663–71. https://doi.org/10.1007/s11548-021-02330-3 .
doi: 10.1007/s11548-021-02330-3
Lopes D, Relvas F, Paulo S, Rekik Y, Grisoni L, Jorge J. FEETICHE: FEET Input for Contactless Hand gEsture Interaction. Proceedings of the 17th International Conference on Virtual-Reality Continuum and its Applications in Industry, Brisbane QLD Australia: ACM. 2019; 1–10. https://doi.org/10.1145/3359997.3365704 .
Paulo SF, Relvas F, Nicolau H, Rekik Y, Machado V, Botelho J, et al. Touchless interaction with medical images based on 3D hand cursors supported by single-foot input: a case study in dentistry. J Biomed Inform. 2019;100:103316. https://doi.org/10.1016/j.jbi.2019.103316 .
doi: 10.1016/j.jbi.2019.103316
Potter LE, Araullo J, Carter L. The Leap Motion controller: a view on sign language. Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, Adelaide Australia: ACM. 2013; 175–8. https://doi.org/10.1145/2541016.2541072 .
Lei J, Wang S, Zhu D, Wu Y. Non-contact gesture interaction method based on cursor model in immersive medical visualization. J Comput Aided Des Comput Gr. 2019;31:208–17. https://doi.org/10.3724/SP.J.1089.2019.17593 .
doi: 10.3724/SP.J.1089.2019.17593
Wu B-F, Chen B-R, Hsu C-F. Design of a facial landmark detection system using a dynamic optical flow approach. IEEE Access. 2021;9:68737–45. https://doi.org/10.1109/ACCESS.2021.3077479 .
doi: 10.1109/ACCESS.2021.3077479
Siratanita S, Chamnongthai K, Muneyasu M. A method of football-offside detection using multiple cameras for an automatic linesman assistance system. Wirel Pers Commun. 2021;118:1883–905. https://doi.org/10.1007/s11277-019-06635-0 .
doi: 10.1007/s11277-019-06635-0
Freitas A, Santos D, Lima R, Santos CG, Meiguins B. Pactolo bar: an approach to mitigate the Midas touch problem in non-conventional interaction. Sensors. 2023;23:2110. https://doi.org/10.3390/s23042110 .
doi: 10.3390/s23042110
Cronin S, Freeman E, Doherty G. Investigating Clutching Interactions for Touchless Medical Imaging Systems. CHI Conference on Human Factors in Computing Systems, New Orleans LA USA: ACM. 2022; 1–14. https://doi.org/10.1145/3491102.3517512 .
Schreiter J, Mielke T, Schott D, Thormann M, Omari J, Pech M, et al. A multimodal user interface for touchless control of robotic ultrasound. Int J CARS. 2022;18:1429–36. https://doi.org/10.1007/s11548-022-02810-0 .
doi: 10.1007/s11548-022-02810-0
Waugh K, McGill M, Freeman E. Push or Pinch? Exploring Slider Control Gestures for Touchless User Interfaces. Nordic Human-Computer Interaction Conference, Aarhus Denmark: ACM. 2022; 1–10. https://doi.org/10.1145/3546155.3546702 .
Waugh K, McGill M, Freeman E. Proxemic Cursor Interactions for Touchless Widget Control. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, Sydney NSW Australia: ACM. 2023; 1–12. https://doi.org/10.1145/3607822.3614525 .
Chung J, Liu DM. Experimental assessment of a novel touchless interface for intraprocedural imaging review. Cardiovasc Intervent Radiol. 2019;42:1192–8. https://doi.org/10.1007/s00270-019-02207-8 .
doi: 10.1007/s00270-019-02207-8