Session: 16-01-01: Government Agency Student Poster Competition
Paper Number: 149890
149890 - Computational Analysis of Facial Expression Production & Perception for Autism Candidate Biomarker Discovery
The heterogeneity of facial expression perception and production in autism spectrum disorder (ASD) suggests the potential presence of behavioral biomarkers that may stratify individuals on the spectrum into more internally homogeneous subgroups. Such stratification biomarkers may identify prognostic subgroups with different trajectories of longitudinal symptom development or treatment subgroups for selective enrollment in interventions, e.g., to improve social skills. High-speed internet and widespread access to consumer technology have enabled remote, scalable, affordable, and timely medical care, including measurement of ASD-related facial expression behaviors in familiar environments to complement clinical observation. Computational analysis of video tracking (VT) of facial expression production and eye tracking (ET) of facial expression perception may aid in the discovery of stratification biomarkers for children and young adults diagnosed with ASD. Deep learning techniques such as convolutional neural networks have shown promise for fine-grained facial expression analysis (FEA) of VT data based on the Facial Action Coding System (FACS). FACS is the gold standard for FEA and provides a taxonomy of action unit (AU) labels, each representing one or more constituent muscle movements. However, open challenges remain in FEA across age groups, namely overcoming the domain shift between adult and child facial expressions; in designing FACS-labeled 3D avatar-based stimuli that improve user engagement when eliciting facial expressions; and in evaluating behavioral measurements (production and perception) against ASD candidate biomarker selection criteria (construct validity and group discriminability). Therefore, we propose a novel contrastive deep domain adaptation approach that fuses deep texture features with geometric landmark features for age-invariant child/adult FEA, develop customizable FACS-labeled avatars for improved user engagement, and conduct an online pilot study with 11 autistic children and young adults and 11 age- and gender-matched neurotypical (NT) individuals. Participants complete validated facial expression recognition and mimicry tasks using the FACS-labeled 3D avatar-based stimuli while their facial expression production and perception are captured by webcam-based VT and ET. Domain-adapted deep learning models are used for FEA of the collected VT data. We assess construct validity, i.e., that the tasks measure the intended phenomena, via analysis of variance of the NT group’s responses. For group discriminability (ASD vs. NT), we apply the Boruta feature selection method, which circumvents unrealistic assumptions of normality and independence in the ASD group, to identify measurements that distinguish the two groups. Extensive statistical analyses identify one candidate ET biomarker and 14 additional ET and VT measurements that may be candidates for more comprehensive future studies with increased sample size for validation and clinical translation. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship under Grant Nos. 1753793 and 2139907, and by the Research Computing clusters at Old Dominion University under National Science Foundation Grant No. 1828593.
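To illustrate the group-discriminability screening step described above, the following is a minimal sketch (not the authors' actual pipeline) of Boruta feature selection over ET/VT measurements, assuming the third-party Python `boruta` package (BorutaPy) with scikit-learn. The feature matrix, sample sizes, and labels below are hypothetical placeholders for illustration only.

```python
# Minimal, hypothetical sketch of Boruta-based screening of candidate
# ET/VT measurements for ASD vs. NT group discriminability.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from boruta import BorutaPy

rng = np.random.default_rng(0)

# Hypothetical design matrix: rows = participants, columns = candidate
# measurements (e.g., fixation statistics, AU activation summaries).
# Labels: 1 = ASD, 0 = NT (11 participants per group, as in the pilot study).
X = rng.normal(size=(22, 15))
y = np.array([1] * 11 + [0] * 11)

# Random forests make no normality or independence assumptions about features,
# which is why a wrapper method like Boruta can sidestep those assumptions.
rf = RandomForestClassifier(n_estimators=500, max_depth=5, random_state=0)

# Boruta compares each real feature against shuffled "shadow" copies and keeps
# only features whose importance consistently beats the best shadow feature.
selector = BorutaPy(rf, n_estimators='auto', max_iter=100, random_state=0)
selector.fit(X, y)

confirmed = np.where(selector.support_)[0]        # confirmed discriminative features
tentative = np.where(selector.support_weak_)[0]   # borderline (tentative) features
print("Confirmed feature indices:", confirmed)
print("Tentative feature indices:", tentative)
```

In such a sketch, confirmed features would correspond to measurements retained as candidate stratification biomarkers, while tentative features might be flagged for re-evaluation in larger follow-up studies.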
Presenting Author: Megan Witherow, Old Dominion University
Presenting Author Biography: Megan A. Witherow received the B.S. degree in computer engineering in 2018 and the Ph.D. degree in electrical and computer engineering in 2024 from Old Dominion University, Norfolk, VA, USA. She is a 2020 National Science Foundation Graduate Research Fellow. Her research interests include computer vision, machine and deep learning, human-computer and human-robot interaction, affective computing, and responsible AI.
Authors:
Megan Witherow, Old Dominion University
Norou Diawara, Old Dominion University
Janice Keener, Children's Hospital of The King's Daughters
John Harrington, Children's Hospital of The King's Daughters
Khan Iftekharuddin, Old Dominion University
Paper Type: Government Agency Student Poster Presentation