Session: 05-12-02: Robotics, Rehabilitation - II
Paper Number: 95828
95828 - On the Development and Evaluation of a Framework for Brain-Computer Interface and Vibrotactile Feedback for Human-Robot-Interaction in Virtual Spaces and Robotic Hardware
The goal of Brain-Computer Interface (BCI) research is to identify human intent and use it to control an external device, in this case a robotic arm, thereby advancing the field of human-robot interaction (HRI), particularly in assistive robotics. Our goal is to apply the developed tools to help those (for example, those with upper limb disabilities) who rely on others for seemingly simple and routine daily tasks such as picking up a bottle of water. However, the developed tools also find applications in other domains where interacting with a robot through gestures could be advantageous. In this research, we present the development and testing, in virtual and hardware environments, of a framework to acquire, process, evaluate, and map BCI signals to a specific robot process, and to provide vibrotactile feedback for process verification before and after the process is executed. The BCI used is the non-invasive Emotiv EPOC+ headset, a 14-electrode electroencephalogram (EEG) with additional sensors to detect facial expressions/gestures and head movement. The Emotiv software was trained by recording facial expressions according to the manufacturer's instructions to improve gesture recognition. The proposed framework was developed and tested in Webots, a robot simulation environment, with the simulation programmed in MATLAB. The simulation environment provided a platform to analyze the reproducibility of BCI signals and subsequently map them to a particular robot action. A software architecture was developed to process the Emotiv headset expression/gesture data using Node-RED and pass the information to MATLAB for simulation purposes and to LabVIEW for hardware control and execution. A developed pick-and-place process, mapped to different signals such as facial expressions/gestures and head movement, was successfully refined and demonstrated in Webots. Subsequently, the framework was seamlessly transferred, with minimal effort, to and executed on robotic hardware controlled through LabVIEW and myRIO. The vibrotactile feedback is provided by a custom-developed wearable glove with multiple operational states; it informs the user of the process to be performed, giving the wearer the option to abort it, and also informs the user of the completion of the process. The framework and procedures developed, along with the vibrotactile feedback, were evaluated in Webots using two different robot models and then verified on two different robotic platforms, demonstrating the ease of portability to different robotic platforms. The success of the multiple experiments validates the developed BCI framework and provides a solid foundation for further research in human-robot interaction in a virtual space for proof-of-concept studies, process refinement and improvement, and evaluation of concepts for custom and/or personalized assistive robots before seamlessly transferring to robotic hardware.
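For illustration only, the flow from a recognized gesture to a robot action with the pre/post vibrotactile handshake described above can be sketched in MATLAB roughly as follows. Every function name, gesture label, and parameter in this sketch (readGesture, triggerGlove, waitForAbort, executeAction, the 3-second abort window) is a hypothetical placeholder, not the framework's actual interface:

    % Minimal sketch of gesture-to-action mapping with a vibrotactile
    % handshake. All names and values here are illustrative placeholders.
    actionMap = containers.Map( ...
        {'smile', 'clench', 'lookLeft'}, ...   % example Emotiv gesture labels
        {'pick', 'place', 'home'});            % example mapped robot actions

    gesture = readGesture();                   % stub: label from the Node-RED stream
    if isKey(actionMap, gesture)
        action = actionMap(gesture);
        triggerGlove('pre', action);           % stub: "about to execute" glove state
        if ~waitForAbort(3.0)                  % stub: user abort window (seconds, assumed)
            executeAction(action);             % stub: Webots/MATLAB or LabVIEW/myRIO call
            triggerGlove('post', action);      % stub: "process completed" glove state
        end
    end

    % Stub implementations so the sketch runs standalone.
    function g = readGesture()
    g = 'smile';
    end

    function triggerGlove(phase, action)
    fprintf('glove [%s]: %s\n', phase, action);
    end

    function abortRequested = waitForAbort(seconds)
    pause(seconds);                            % a real system would poll for an abort gesture here
    abortRequested = false;
    end

    function executeAction(action)
    fprintf('executing robot action: %s\n', action);
    end

In the actual framework, per the abstract, the gesture labels arrive from the Emotiv headset through Node-RED, and execution is delegated either to the Webots simulation through MATLAB or to the robotic hardware through LabVIEW and myRIO.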
Presenting Author: Panos S. Shiakolas, University of Texas at Arlington
Presenting Author Biography: P. S. SHIAKOLAS received the Higher National Diploma degree from the Higher Technical Institute, Nicosia, Cyprus, in 1982, the B.S. and M.S. degrees from The University of Texas at Austin, in 1986 and 1988, respectively, and the Ph.D. degree from The University of Texas at Arlington (UTA), in 1992, all in mechanical engineering in the areas of robotics and computer-aided design. From 1993 to 1996, he worked as a Faculty Research Associate at UTA. He joined the UTA faculty as an Assistant Professor in 1996, where he currently serves as a Tenured Associate Professor of Mechanical and Aerospace Engineering. He is also the Director of the MARS Laboratory at UTA. His research interests include the general areas of robotics, manufacturing, microsystems, automation, and controls as they apply to the betterment of society, currently focusing on the medical/biomedical fields. He has a passion for engineering education and has developed educational testbeds; he routinely uses hardware testbeds to demonstrate concepts in his courses in the areas of robotics, automation, and controls.
Authors:
Sudip Hazra, University of Texas at Arlington
Shane Whitaker, University of Texas at Arlington
Panos S. Shiakolas, University of Texas at Arlington
Paper Type: Technical Paper Publication