Session: 16-01-01: Government Agency Student Poster Competition
Paper Number: 149957
149957 - A Multimodal Information-Based Bi-Directional Emotion Interaction Interface for Friendly and Empathic Collaborative Robots
Human-robot collaboration has become an important topic in advanced manufacturing. Collaborative robots are designed to work closely alongside humans in a shared space, and they have been widely proposed in many manufacturing industries, such as automotive, food, packaging, and pharmaceutical. The main purpose of using these collaborative robots is to assist humans with dull, repetitive, or dangerous tasks and to help improve manufacturing productivity and efficiency. Traditionally, robots in manufacturing operate in isolation and completely take over the tasks assigned to them. Collaborative robots, on the other hand, share their workspace with a human collaborator and work closely with them to complete tasks. Unfortunately, the stiff, mechanical behaviors of many current collaborative robots make their interactions with humans dull and uninteresting, especially over extended periods of time. Such interaction patterns can deter a human's willingness to work alongside collaborative robots, negatively impacting user acceptance and further limiting the wider application of collaborative robots in manufacturing.
To address this issue, inspired by the human-human communication seen every day, we developed a multimodal information-based bidirectional emotion interface (MI-BEI) system that gives collaborative robots social-emotional competence by integrating the MI-BEI system into the human-robot collaboration process. Manufacturing co-assembly tasks were used as test scenarios in our experiments for evaluating the developed system. The MI-BEI system allows the robot to (1) recognize a human's emotions visually and audibly by monitoring facial expressions and vocal tones, and (2) respond to humans with artificial emotion feedback generated by 3D simulation technology, which offers multiple advantages over hardware design (flexible and quick synthesis, customization, and upgrading). Specifically, our work can be summarized in three parts. First, we developed a 3D human interface that monitors a human's facial expressions and vocal tones while providing artificial emotional feedback. Second, we integrated the developed interface so that a collaborative manufacturing robot can express emotions in real time while performing actions during co-assembly tasks, enabling a friendly and empathetic collaboration process. Third, we conducted validation experiments and analysis to evaluate the performance and effectiveness of the enhanced collaborative robot through real-world assembly tasks. The experimental results and analysis demonstrate the current system's advantages and effectiveness, and they guide the future development of collaborative robots toward a friendlier and more empathetic human-robot interaction process.
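The bidirectional loop described above (recognize emotion from two modalities, then choose a feedback expression) can be sketched in a few lines. This is a minimal illustrative sketch only, not the authors' MI-BEI implementation: it assumes hypothetical upstream classifiers that emit per-emotion probability distributions for the face and voice channels, late-fuses them by a weighted average, and maps the dominant emotion to a simulated response.

```python
# Illustrative sketch of a bidirectional emotion loop (not the authors' code).
# Assumes face/voice classifiers already produce per-emotion probabilities.

EMOTIONS = ["happy", "neutral", "frustrated"]  # example label set

def fuse_emotions(face_probs, voice_probs, face_weight=0.6):
    """Late fusion: weighted average of the two per-emotion distributions."""
    fused = {e: face_weight * face_probs[e] + (1 - face_weight) * voice_probs[e]
             for e in EMOTIONS}
    total = sum(fused.values())
    return {e: p / total for e, p in fused.items()}  # renormalize

def pick_feedback(fused_probs):
    """Map the dominant detected emotion to a simulated facial response."""
    dominant = max(fused_probs, key=fused_probs.get)
    responses = {"happy": "smile",
                 "neutral": "attentive_nod",
                 "frustrated": "concerned_slowdown"}
    return dominant, responses[dominant]

# Example inputs, as if produced by the face and vocal-tone classifiers:
face = {"happy": 0.7, "neutral": 0.2, "frustrated": 0.1}
voice = {"happy": 0.4, "neutral": 0.4, "frustrated": 0.2}
dominant, response = pick_feedback(fuse_emotions(face, voice))
print(dominant, response)  # happy smile
```

In a real system the fusion weight and the emotion-to-expression mapping would be learned or tuned, and the response would drive the 3D-simulated face rather than print a label.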
Presenting Author: Jordan Murphy, Montclair State University
Presenting Author Biography: Jordan Murphy is a graduate research assistant at the Multimodal Interaction and Affective Computing lab, School of Computing, Montclair State University. His research focuses on multimodal human-machine interaction, computer graphics and animation, and robotics.
Authors:
Jordan Murphy, Montclair State University
Rui Li, Montclair State University
Paper Type
Government Agency Student Poster Presentation