Session: Government Agency Student Posters
Paper Number: 173479
A Framework for Personalized Human-Robot Collaboration: Integrating Metahuman Guidance and Real-Time Biosensing
Introduction
Collaborative robots (cobots) are transforming modern manufacturing by working safely alongside human operators. Most existing systems, however, lack personalization, adapting little to an individual operator's cognitive load or real-time performance. This limitation reduces their effectiveness in complex assembly tasks that demand sustained attention and learning. We present a framework that integrates realistic synthetic actors (MetaHumans) with physiological monitoring to deliver adaptive, personalized guidance aimed at enhancing task performance, trust, and learning in human-robot collaboration (HRC). By enabling cobots to respond dynamically to human cognitive states, this approach addresses critical gaps in current automation systems.
Methods
We designed a manufacturing-inspired assembly task in which participants use voice commands to instruct a UR3e robotic arm to sort 16 uniquely shaped and colored objects into four quadrants such that no shape or color repeats within any quadrant. To increase cognitive demands, participants must track prior placements from memory and answer periodic recall prompts. A MetaHuman provides adaptive real-time guidance: the system continuously monitors physiological signals, including heart rate variability (HRV) and cortical hemodynamics measured via functional near-infrared spectroscopy (fNIRS), and the MetaHuman intervenes when elevated cognitive workload is detected. This within-subject study compares conditions with and without MetaHuman support under both low and high cognitive load. Planned outcome measures include task accuracy, completion time, recall performance, subjective workload (NASA-TLX), trust, and physiological stress. Statistical analyses will use repeated-measures ANOVA, with non-parametric or mixed-effects models applied if assumptions are not met.
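The placement constraint in this task can be checked programmatically. A minimal sketch follows, assuming 16 objects formed from 4 shapes crossed with 4 colors (the specific shape and color names are illustrative assumptions, not taken from the study):

```python
from itertools import product

# Illustrative shape/color sets; the study's actual objects may differ.
SHAPES = ["cube", "cylinder", "cone", "sphere"]
COLORS = ["red", "green", "blue", "yellow"]

def valid_quadrants(quadrants):
    """Return True if each of the four quadrants holds 4 objects with no
    repeated shape or color, and all 16 unique (shape, color) objects are
    used exactly once across the workspace."""
    seen = set()
    for quad in quadrants:
        shapes = [obj[0] for obj in quad]
        colors = [obj[1] for obj in quad]
        if len(quad) != 4 or len(set(shapes)) != 4 or len(set(colors)) != 4:
            return False
        seen.update(quad)
    return seen == set(product(SHAPES, COLORS))

# One valid arrangement: quadrant i pairs shape j with color (i + j) mod 4,
# a Latin-square construction that satisfies the no-repeat constraint.
arrangement = [
    [(SHAPES[j], COLORS[(i + j) % 4]) for j in range(4)]
    for i in range(4)
]
print(valid_quadrants(arrangement))  # True
```

Because each quadrant must contain all four shapes and all four colors exactly once, valid solutions correspond to Latin-square-style assignments, which is what makes the memory-tracking component of the task cognitively demanding.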
Advancing Science and Engineering
This work makes a significant contribution to both human-robot interaction research and intelligent manufacturing systems by demonstrating how immersive virtual agents coupled with physiological sensing can create closed-loop adaptive collaboration. It pushes the boundaries of current industrial automation by introducing real-time personalization strategies that account for individual mental workload. This framework establishes a methodological foundation for future adaptive cobot systems that optimize human performance and well-being, representing a critical step toward safer and more efficient smart manufacturing environments.
Expected Outcomes
We anticipate that adaptive MetaHuman guidance will lead to improved task accuracy, faster completion times, and reduced subjective workload, especially under high cognitive load conditions. Additionally, we expect increased participant trust in the collaborative system and measurable reductions in physiological stress. Such findings would validate our framework's potential to personalize human-robot collaboration, supporting more effective learning and safer operation in manufacturing contexts.
Conclusion
This work advances personalized HRC by integrating immersive synthetic actors, real-time biosensing, and adaptive intervention strategies. We envision this framework serving as a blueprint for next-generation collaborative systems in manufacturing and extending to broader domains such as healthcare and complex team robotics. Future efforts will involve completing participant trials, refining adaptive algorithms, and scaling to multi-robot or multi-user environments.
Acknowledgment
This work is supported by the National Science Foundation and National Institutes of Health Award #2013651.
Presenting Author: Ramisha Baki, Montana State University
Presenting Author Biography: Ramisha Fariha Baki is a Graduate Research Assistant at Montana State University, where she conducts research at the intersection of Human-Robot Collaboration, Augmented/Extended Reality, and Machine Learning. Her research centers on trust between humans and intelligent systems, with a particular focus on how design, behavior, and contextual awareness in machines affect human perception, decision-making, and collaboration. Her academic contributions reflect a strong focus on intelligent systems, usability analysis, and AI-integrated solutions across diverse domains including education, agriculture, and healthcare. She is currently working on a National Science Foundation (NSF)-funded project that investigates collaborative tasks between humans, robots, and AI-driven virtual agents (e.g., Metahumans), focusing on cognitive decision-making, adaptive timing, and multimodal interaction. The project explores how intelligent systems can interpret and respond to human actions and intentions in real time, especially under uncertainty, to support effective and natural team performance. Her interdisciplinary collaborations include researchers from the Military Institute of Science and Technology (MIST), UC Davis, and other international institutions. Passionate about human-centered AI, Ramisha aims to bridge the gap between emerging technologies and meaningful, trust-driven human experiences.
Authors:
Ramisha Baki, Montana State University
Kyla Anderson, Montana State University
Apostolos Kalatzis, Cleveland State University
Laura Stanley, Montana State University
Paper Type
Government Agency Student Poster Presentation
