Design of Omnidirectional Robot Using Hybrid Brain Computer Interface
Current research on Brain-Computer Interface (BCI) controllers has expanded the opportunities for robotic applications within the biomechanical field. With the implementation of real-time BCI controllers, researchers have developed smart prosthetics, semi-autonomous wheelchairs, and collaborative robots for human interaction, allowing patients with neuromuscular disabilities the freedom to interact with the world. These advances have been made possible by the ease of non-invasive procedures for recording and processing electroencephalography (EEG) signals from the human scalp. However, EEG-based BCI controllers are limited in their ability to accurately process real-time signals and convert them into input for a system. This research seeks to develop a BCI controller for a semi-autonomous three-wheeled omnidirectional robot capable of processing accurate real-time commands. Omnidirectional mobile robots are widely employed because they can rotate and translate simultaneously in narrow spaces. This paper expounds the kinematic modeling of the omnidirectional robot and the software architecture of the overall hybrid system, including the motion control algorithm. The main focus is the system design, EEG signal acquisition, recognition and processing technology, and implementation.
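As an illustration of the kinematic modeling summarized above, the following minimal Python sketch maps a desired body-frame velocity to wheel speeds for a three-wheeled omnidirectional base with wheels spaced 120 degrees apart; the wheel geometry, radii, and function names are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Minimal sketch of inverse kinematics for a three-wheeled omnidirectional
# robot; wheel angles, wheel radius, and base radius are illustrative
# assumptions, not parameters from the paper.
WHEEL_ANGLES = np.deg2rad([90.0, 210.0, 330.0])  # wheel axes, 120 deg apart
WHEEL_RADIUS = 0.05   # wheel radius in meters (assumed)
BASE_RADIUS = 0.15    # distance from robot center to each wheel (assumed)

def body_to_wheel_speeds(vx, vy, omega):
    """Convert a body-frame velocity (vx, vy) and yaw rate omega
    into angular speeds for the three omni wheels."""
    speeds = []
    for a in WHEEL_ANGLES:
        # Each wheel contributes motion along its rolling direction,
        # plus a rotation term proportional to the base radius.
        v_wheel = -np.sin(a) * vx + np.cos(a) * vy + BASE_RADIUS * omega
        speeds.append(v_wheel / WHEEL_RADIUS)
    return speeds
```

Because the three wheel rolling directions span the plane, any combination of translation and rotation is achievable, which is what gives the platform its three degrees of freedom.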
For the real-time input signal, EEG scans are recorded using a sixteen-channel electrode cap provided by Easycap with Emotiv Epoc hardware. Signals are recorded and processed in OpenViBE. Preprocessed signals are cleaned in EEGLAB and used to train OpenViBE classifiers to accurately identify the expected signals produced by the user. Once a signal is identified, the controller converts it into one of the input commands {forward, left, right, rotate, stop}, which are written in Python and delivered to the robot system. The robot, which utilizes three omnidirectional wheels, has three degrees of freedom (DoF), allowing it to traverse its environment in any direction and orientation. The system is equipped with an Intel RealSense Depth Camera D435, a Tracking Camera T265, and LidarLite sensors to build a full map of the robot's surroundings. The robot is controlled with the Robot Operating System (ROS), which takes the inputs provided, computes a trajectory, and navigates the robot along it. The sensor system provides feedback that enables semi-autonomous obstacle avoidance and keeps the user focused on driving. Overall, this paper demonstrates the architecture of a hybrid BCI control system for an omnidirectional robot. The developed system integrates the EEG signal to control the motion of the robot, and the experimental results show the system's performance and its effectiveness in processing the user's EEG signals.
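To illustrate how classified commands from the set {forward, left, right, rotate, stop} might be delivered to a ROS-controlled base, here is a minimal sketch; the topic name (cmd_vel), speed values, and node structure are assumptions rather than the paper's actual implementation.

```python
#!/usr/bin/env python
# Minimal sketch of a bridge from classified BCI commands to ROS velocity
# messages; topic name, speeds, and command source are illustrative
# assumptions, not the paper's implementation.
import rospy
from geometry_msgs.msg import Twist

# Each BCI command maps to (vx, vy, yaw rate). An omnidirectional base
# can translate sideways, so "left"/"right" set vy instead of turning.
COMMAND_MAP = {
    "forward": (0.2, 0.0, 0.0),
    "left":    (0.0, 0.2, 0.0),
    "right":   (0.0, -0.2, 0.0),
    "rotate":  (0.0, 0.0, 0.5),
    "stop":    (0.0, 0.0, 0.0),
}

def publish_command(pub, command):
    """Publish the Twist corresponding to a classified BCI command."""
    vx, vy, wz = COMMAND_MAP.get(command, COMMAND_MAP["stop"])
    msg = Twist()
    msg.linear.x = vx
    msg.linear.y = vy
    msg.angular.z = wz
    pub.publish(msg)

if __name__ == "__main__":
    rospy.init_node("bci_command_bridge")
    pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
    rate = rospy.Rate(10)  # resend the latest command at 10 Hz
    while not rospy.is_shutdown():
        publish_command(pub, "forward")  # placeholder for classifier output
        rate.sleep()
```

Mapping "left" and "right" to lateral translation rather than turning is one plausible design choice for an omnidirectional platform, since the base can strafe without changing heading; the rotate command then handles reorientation separately.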
Category
Technical Paper Publication
Description
Session: 05-13-01 Robotics, Rehabilitation
ASME Paper Number: IMECE2020-23935
Session Start Time: November 18, 2020, 12:45 PM
Presenting Author: Bryan Ghoslin
Presenting Author Bio: Mr. Bryan Ghoslin is a Mechanical Engineering graduate research scholar at CSUN (California State University, Northridge) whose work focuses on extracting biological signals and developing a reliable input to drive a mechanical robot. He received his bachelor's degree in Mechanical Engineering from UCR (University of California, Riverside) in 2017.
Authors: Vidya Nandikolla, California State University
Bryan Ghoslin, California State University, Northridge