Design of a Real-Time Human-Robot Collaboration System Using Dynamic Gestures
With the development of industrial automation and artificial intelligence, robotic systems have become an essential part of factory production, and human-robot collaboration (HRC) has emerged as a new trend in the industrial field. It has been shown that, in highly automated industries, an HRC system can increase production efficiency and provide more flexibility in the work environment. Collaborative human-robot workspaces, which combine the flexibility of humans with the productivity of robots, are attracting increasing attention in both manufacturing industries and research communities. Ideally, an HRC system should work much like human-human collaboration on the factory floor. However, developing such HRC applications with the desired productivity is challenging because of the spatial separation and temporal discontinuity between workers and robots. Among the limited communication channels between human workers and industrial robots, gesture communication has been applied effectively; thus, robots need to understand and respond to human gestures correctly to collaborate with human workers seamlessly.
In our previous work, ten dynamic gestures were designed for communication between a human worker and an industrial robot in manufacturing scenarios. The motion locations and paths of the dynamic gestures are converted into static image templates using the Motion History Image (MHI) method, and a dynamic gesture recognition model based on Convolutional Neural Networks (CNN) was developed. Building on that model, this study aims to design and develop a real-time HRC system for actual assembly and manufacturing tasks in industrial applications. First, to interact with human workers, designated robot response operations are matched to the ten dynamic gestures. These response operations include routine procedures of industrial robots (e.g., initialization, emergency stop) as well as a group of operations designed for specific assembly/manufacturing tasks (e.g., handing a specific tool to the worker from a certain location). Then, a real-time dynamic gesture recognition algorithm is developed, in which a human worker's behavior and motion are continuously monitored and captured, and real-time motion history images are generated. A simultaneous processing strategy is adopted in this procedure to reduce the processing time: the generation of the MHIs and their identification by the classification model are accomplished synchronously. When a designated dynamic gesture is detected, the corresponding command is immediately transmitted to the robot to perform a real-time response and interaction. In addition, a Graphical User Interface (GUI) is developed to integrate the proposed HRC system and to visualize the real-time motion history images and the classification results of the gesture identification. Finally, a series of actual collaboration experiments is carried out between a human worker and a six-degree-of-freedom (6-DOF) Comau industrial robotic arm. The results show that the worker's behaviors can be continuously captured and identified, and that when designated gestures appear, the robotic arm performs the corresponding operations to interact with the worker sequentially and seamlessly.
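To make the MHI-based template generation concrete, the following is a minimal sketch assuming an OpenCV/NumPy implementation; the frame-difference threshold, the MHI duration, and the 64x64 template size are illustrative assumptions, not parameters reported in the paper.

```python
# Minimal sketch: generate Motion History Image (MHI) templates from a camera
# stream. Threshold, duration, and template size are assumed values.
import cv2
import numpy as np

MHI_DURATION = 20      # frames a motion trace persists (assumed)
DIFF_THRESHOLD = 32    # frame-difference threshold for the motion silhouette (assumed)

def update_mhi(mhi, prev_gray, gray, duration=MHI_DURATION):
    """Decay the existing motion history and stamp the newest motion silhouette."""
    silhouette = cv2.absdiff(prev_gray, gray) > DIFF_THRESHOLD
    mhi = np.maximum(mhi - 1.0, 0.0)   # fade older motion
    mhi[silhouette] = float(duration)  # newest motion gets the highest intensity
    return mhi

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
assert ok, "camera not available"
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
mhi = np.zeros(prev_gray.shape, dtype=np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mhi = update_mhi(mhi, prev_gray, gray)
    prev_gray = gray
    # Normalize to a grayscale template that a CNN classifier can consume.
    template = cv2.resize((mhi / MHI_DURATION * 255).astype(np.uint8), (64, 64))
    cv2.imshow("Motion History Image", template)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The simultaneous processing strategy can likewise be sketched as a producer-consumer pipeline: one thread keeps generating MHI templates while another classifies them and forwards recognized gestures to the robot. In the sketch below, classify_gesture() and send_robot_command() are hypothetical placeholders for the trained CNN model and the robot interface, not interfaces defined in the paper.

```python
# Minimal sketch of the simultaneous (producer-consumer) processing strategy.
import queue
import threading

template_queue = queue.Queue(maxsize=8)
stop_event = threading.Event()

def capture_worker(get_next_template):
    """Continuously produce MHI templates (e.g., from a loop like the one above)."""
    while not stop_event.is_set():
        template = get_next_template()
        try:
            template_queue.put(template, timeout=0.1)
        except queue.Full:
            pass  # drop a template rather than stall the camera loop

def recognition_worker(classify_gesture, send_robot_command, confidence=0.9):
    """Classify templates as they arrive and trigger the matching robot operation."""
    while not stop_event.is_set():
        try:
            template = template_queue.get(timeout=0.1)
        except queue.Empty:
            continue
        gesture_id, score = classify_gesture(template)  # CNN inference (placeholder)
        if gesture_id is not None and score >= confidence:
            send_robot_command(gesture_id)              # designated robot response (placeholder)

# Typical wiring (placeholder callables, not executed here):
# threading.Thread(target=capture_worker, args=(next_template_fn,), daemon=True).start()
# threading.Thread(target=recognition_worker, args=(cnn_predict_fn, robot_send_fn), daemon=True).start()
```

Because classification runs on its own thread, slow inference never blocks the camera loop; at most, a bounded number of templates are dropped when the queue is full.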
Category: Technical Paper Publication
Session: 02-08-01 Innovative Product and Process Design & Robotics and Automation in Advanced Manufacturing
ASME Paper Number: IMECE2020-23650
Session Start Time: November 19, 2020, 05:35 PM
Presenting Author: Haodong Chen
Presenting Author Bio: Haodong Chen is a Ph.D. student in Mechanical Engineering at Missouri University of Science and Technology, advised by Dr. Ming C. Leu.
Authors:
Haodong Chen, Missouri University of Science and Technology
Ming C. Leu, Missouri University of Science and Technology
Zhaozheng Yin, Stony Brook University
Wenjin Tao, Missouri University of Science and Technology