Using Artificial Intelligence to Measure Facial Expression following Facial Reanimation Surgery.
Journal
Plastic and reconstructive surgery
ISSN: 1529-4242
Abbreviated title: Plast Reconstr Surg
Country: United States
NLM ID: 1306050
Publication information
Publication date: November 2020
History:
entrez: 2 November 2020
pubmed: 3 November 2020
medline: 1 January 2021
Status: ppublish
Abstract
Social interactions are largely dependent on the interpretation of information conveyed through facial expressions. Although facial reanimation seeks restoration of the facial expression of emotion, outcome measures have not addressed this directly. This study evaluates the use of a machine learning technology to directly measure facial expression before and after facial reanimation surgery. Fifteen study subjects with facial palsy were evaluated both before and after undergoing cross-facial nerve grafting and free gracilis muscle transfer. Eight healthy volunteers were assessed for control comparison. Video footage of subjects with their face in repose and with a posed, closed-lip smile was obtained. The video data were then analyzed using the Noldus FaceReader software application to measure the relative proportions of seven cardinal facial expressions detected within each clip. The facial expression recognition application detected a far greater happy signal in postoperative (42 percent) versus preoperative (13 percent) smile videos (p < 0.0001), compared to 53 percent in videos of control faces smiling. This increase in postoperative happy signal was achieved in exchange for a reduction in the sad signal (15 percent to 9 percent; p = 0.092) and the neutral signal (57 percent to 37 percent; p = 0.0012). For video clips of patients in repose, no significant difference in happy emotion was detected between preoperative (3.1 percent) and postoperative (1.4 percent) states (p = 0.5). This study provides the first proof of concept for the use of a machine learning software application to objectively quantify facial expression before and after surgical reanimation. CLINICAL QUESTION/LEVEL OF EVIDENCE: Diagnostic, IV.
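The abstract's outcome metric is the relative proportion of each of seven cardinal expressions detected within a clip. FaceReader's internal per-frame output format is not given here, so the sketch below is only a minimal illustration under the assumption that the classifier emits one dominant expression label per video frame; the function name and the example frame labels are hypothetical, not taken from the study.

```python
from collections import Counter

# The seven cardinal expressions FaceReader reports (per the abstract);
# the exact label strings here are assumptions, not the tool's actual output.
EXPRESSIONS = ["neutral", "happy", "sad", "angry", "surprised", "scared", "disgusted"]

def expression_proportions(frame_labels):
    """Given one dominant-expression label per video frame, return the
    relative proportion of each of the seven expressions across the clip."""
    counts = Counter(frame_labels)
    total = len(frame_labels)
    return {expr: counts.get(expr, 0) / total for expr in EXPRESSIONS}

# Hypothetical per-frame output for a short smile clip (10 frames).
frames = ["neutral"] * 4 + ["happy"] * 5 + ["sad"]
props = expression_proportions(frames)
print(props["happy"])  # 0.5
```

A clip-level "happy signal" of the kind compared pre- and postoperatively in the study would then simply be `props["happy"]` averaged over a subject's clips.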
Identifiers
pubmed: 33136962
doi: 10.1097/PRS.0000000000007251
pii: 00006534-202011000-00043
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Pagination
1147-1150
Comments and corrections
Type: CommentIn
References
Schmidt KL, Cohn JF. Human facial expressions as adaptations: Evolutionary questions in facial expression research. Am J Phys Anthropol. 2001;Suppl 33:3–24.
Ekman P, Friesen W. Facial Action Coding System: A Technique for the Measurement of Facial Movement. Palo Alto, Calif: Consulting Psychologists Press; 1978.
Coulson SE, O'Dwyer NJ, Adams RD, Croxson GR. Expression of emotion and quality of life after facial nerve paralysis. Otol Neurotol. 2004;25:1014–1019.
Ekman P. Psychosocial aspects of facial paralysis. In: May M, ed. The Facial Nerve. New York: Thieme; 1986:781–787.
Twerski A, Twerski B. The emotional impact of facial paralysis. In: May M, ed. The Facial Nerve. New York: Thieme; 1986:788–794.
Valls-Solé J, Montero J. Movement disorders in patients with peripheral facial palsy. Mov Disord. 2003;18:1424–1435.
Bradbury ET, Simons W, Sanders R. Psychological and social factors in reconstructive surgery for hemi-facial palsy. J Plast Reconstr Aesthet Surg. 2006;59:272–278.
Ishii L, Godoy A, Encarnacion CO, Byrne PJ, Boahene KD, Ishii M. Not just another face in the crowd: Society’s perceptions of facial paralysis. Laryngoscope. 2012;122:533–538.
Kang TS, Vrabec JT, Giddings N, Terris DJ. Facial nerve grading systems (1985-2002): Beyond the House-Brackmann scale. Otol Neurotol. 2002;23:767–771.
Fattah AY, Gurusinghe AD, Gavilan J, et al. Sir Charles Bell Society. Facial nerve grading instruments: Systematic review of the literature and suggestion for uniformity. Plast Reconstr Surg. 2015;135:569–579.
van der Schalk J, Hawk ST, Fischer AH, Doosje B. Moving faces, looking places: Validation of the Amsterdam Dynamic Facial Expression Set (ADFES). Emotion. 2011;11:907–920.
Bishop CM. Neural Networks for Pattern Recognition. New York: Oxford University Press; 1995.
Cootes TF, Edwards GJ, Taylor CJ. Active appearance models. IEEE Trans Pattern Anal Mach Intell. 2001;23:681–685.
Gudi A, Tasli HE, den Uyl TM, Maroulis A. Deep learning based FACS action unit occurrence and intensity estimation. Paper presented at: 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG);May 4–8, 2015. Ljubljana, Slovenia.
Viola P, Jones M. Rapid object detection using a boosted cascade of simple features. Paper presented at: Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition: CVPR 2001;December 8–14, 2001. Kauai, Hawaii.
Viola P, Jones MJ. Robust real-time face detection. Int J Comput Vision. 2004;57:137–154.
Apple, Inc. About Face ID advanced technology. Available at: https://support.apple.com/en-us/HT208108. Accessed April 6, 2019.
Homeland Security Today. CBP at JFK airport intercept imposter using biometrics technology. Available at: https://www.hstoday.us/federal-pages/dhs/cbp/cbp-at-jfk-airport-intercept-imposter-using-facial-recognition-biometrics-technology/. Accessed April 6, 2019.
Tian YL, Kanade T, Cohn JF. Recognizing action units for facial expression analysis. IEEE Trans Pattern Anal Mach Intell. 2001;23:97–115.
Tversky A, Kahneman D. Judgment under uncertainty: Heuristics and biases. Science. 1974;185:1124–1131.