Session: 08-11-01: Mobile Robots and Unmanned Ground Vehicles I
Paper Number: 165181
Design and Development of an Omnidirectional Mobile Robot for Load Transportation Using a Nonlinear Control and Computer Vision-Based Navigation System
Autonomous mobile robots have become essential in modern logistics, industrial automation, and research applications. The increasing demand for efficient, adaptable, and intelligent transportation systems has driven the development of omnidirectional mobile robots capable of handling loads in structured and semi-structured environments. This paper presents the design and implementation of an omnidirectional mobile robot with a nonlinear control strategy and computer vision-based navigation system to transport loads of up to 20 kg. This research aims to enhance trajectory accuracy, stability under load variations, and adaptive control in real-time navigation scenarios. The study contributes to the advancement of robotic mobility by integrating a nonlinear multi-input multi-output (MIMO) control system that ensures precise movement control while adapting to variations in load distribution and external disturbances.
The mechanical structure of the robot was developed using CAD modeling, followed by finite element analysis (FEA) to validate its structural integrity under operational conditions. The robot’s base features a four-wheeled omnidirectional Mecanum drive, allowing movement in any direction without requiring a steering mechanism. The propulsion system comprises high-torque DC motors with encoder feedback, an ESP32 microcontroller, and a Wi-Fi-based user interface for manual remote operation. The control system integrates a nonlinear MIMO strategy, improving on traditional PID-based approaches by dynamically adjusting motor inputs based on real-time sensor feedback. This ensures smooth trajectory execution, even under changing loads and external perturbations.
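The Mecanum drive described above is conventionally commanded through its inverse kinematics, mapping a desired body-frame velocity to four wheel speeds. The sketch below shows the standard relation for an X-configuration Mecanum platform; the wheel radius and chassis half-dimensions are illustrative assumptions rather than the paper's actual parameters, and the wheel signs depend on the roller orientation convention.

```python
# Sketch of standard Mecanum inverse kinematics (X-configuration).
# R, L, W below are hypothetical geometry values, not the robot's real ones.

def mecanum_wheel_speeds(vx, vy, wz, R=0.05, L=0.20, W=0.15):
    """Map a body-frame twist (vx, vy in m/s, wz in rad/s) to the four
    wheel angular velocities [front-left, front-right, rear-left,
    rear-right] in rad/s, for wheel radius R and chassis half-length L /
    half-width W (metres)."""
    k = L + W  # lever arm combining both chassis half-dimensions
    w_fl = (vx - vy - k * wz) / R
    w_fr = (vx + vy + k * wz) / R
    w_rl = (vx + vy - k * wz) / R
    w_rr = (vx - vy + k * wz) / R
    return [w_fl, w_fr, w_rl, w_rr]
```

For example, a pure forward command (vy = wz = 0) drives all four wheels at the same speed, while a pure lateral command produces the alternating-sign pattern characteristic of Mecanum strafing. The nonlinear MIMO controller in the paper would then regulate the motors around such reference speeds using the encoder feedback.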
A key innovation in this work is the implementation of a computer vision-based navigation system that uses ArUco markers for real-time localization. The vision system processes 3D spatial data, projecting it onto a 2D camera plane to accurately estimate the robot’s position. The navigation algorithm, implemented using OpenCV and Python, computes optimal paths and adjusts trajectories dynamically based on visual input. This approach significantly improves localization accuracy compared to traditional dead-reckoning methods. Additionally, the system is designed to be easily scalable, allowing future integration with LiDAR sensors and machine learning-based predictive control models to enhance obstacle avoidance capabilities.
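The 3D-to-2D projection underlying marker-based localization can be illustrated with a minimal pinhole-camera model. The intrinsic parameters below are hypothetical placeholders; a real pipeline like the one in the paper would use calibrated intrinsics and OpenCV's `cv2.aruco` marker detection and pose estimation rather than this hand-rolled sketch.

```python
# Minimal pinhole-camera projection sketch: how a 3D point (e.g. an ArUco
# marker corner in the camera frame) lands on the 2D image plane.
# fx, fy, cx, cy are assumed (hypothetical) intrinsics, not calibrated values.

def project_point(X, Y, Z, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Project a 3D camera-frame point (metres, Z > 0 in front of the
    camera) to pixel coordinates (u, v) using focal lengths fx, fy and
    principal point (cx, cy)."""
    if Z <= 0:
        raise ValueError("point must lie in front of the camera (Z > 0)")
    u = fx * X / Z + cx
    v = fy * Y / Z + cy
    return u, v
```

Inverting this relation for the four known marker corners is what pose estimation (e.g. a PnP solver) does: given the detected 2D corner pixels and the marker's known 3D geometry, it recovers the camera pose relative to the marker, and hence the robot's position.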
The system was experimentally validated in a structured indoor environment, where the robot successfully followed predefined paths while adapting to external perturbations. Performance metrics analyzed include positioning accuracy, trajectory deviation, response time, and load-handling efficiency. Results indicate that the robot maintains an average positioning error of less than 2 cm, demonstrating its potential for precise and reliable autonomous navigation. A comparative study with existing mobile robotic platforms highlights improvements in trajectory stability, maneuverability, and efficiency in material transportation tasks. The system is well-suited for automated logistics, warehouse operations, and smart manufacturing applications.
This work contributes to the growing field of autonomous mobile robots by integrating adaptive control strategies and real-time computer vision-based localization. The proposed system enhances trajectory precision, improves navigation efficiency, and enables intelligent load transportation in structured environments. Future research directions will explore integrating AI-driven control models and multi-sensor fusion techniques to further optimize motion planning and expand the robot’s capabilities beyond structured environments. The findings presented in this paper provide a solid foundation for future developments in mobile robotic systems, particularly in industrial automation, smart warehouses, and logistics 4.0 applications.
Presenting Author: Edwin Caizalitin-Quinaluisa, Universidad de las Fuerzas Armadas ESPE
Presenting Author Biography: Edwin Caizalitin-Quinaluisa is an Assistant Professor and Head of the Mechatronics Laboratory at Universidad de las Fuerzas Armadas ESPE, Ecuador. He holds a Master's degree in Automation and Robotics and has extensive experience in mechatronics, robotics, and control systems. His research focuses on autonomous mobile robots, nonlinear control strategies, and intelligent vision-based navigation systems. He is a Senior Member of IEEE and serves as the advisor for the ASME Student Section at Universidad de las Fuerzas Armadas ESPE, Latacunga campus. He has participated in several research projects related to robotic systems for industrial and medical applications, contributing to the advancement of intelligent automation technologies.
Authors:
Jahir Vernaza-Bone, Universidad de las Fuerzas Armadas ESPE
David Ushiña-Chacon, Universidad de las Fuerzas Armadas ESPE
Edwin Caizalitin-Quinaluisa, Universidad de las Fuerzas Armadas ESPE
Dario Mendoza-Chipantasi, Universidad de las Fuerzas Armadas ESPE
Paper Type
Technical Paper Publication
