An autonomous robot prototype based on the Arduino and Raspberry Pi platforms, developed in Python, Java and C, and leveraging OpenCV for computer vision.
A software-based, network-capable simulator was developed in Java and libGDX to emulate the robotic platform with a number of simulated sensors (a compass and rotary encoders, for example). The simulator was used to generate realistic data for testing and verifying the core system prototypes before the initial hardware platform was completed.
The current platform has three ultrasonic sensors positioned in an arc around the front of the vehicle, in addition to a magnetometer and rotary encoders that report the rotation count of each wheel, used for speed determination and for corrections when travelling in a straight line.
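A straight-line correction of this kind typically compares the two encoder counts and trims the motor outputs proportionally. The sketch below illustrates the idea; the function name, gain and PWM range are illustrative assumptions, not the project's actual firmware code.

```python
# Hypothetical straight-line correction from wheel encoder feedback.
# If the left wheel has turned further than the right, the robot is
# drifting right, so slow the left wheel and speed up the right.

def straight_line_correction(left_ticks, right_ticks, base_pwm, kp=0.8):
    """Return (left_pwm, right_pwm) adjusted by a proportional term."""
    error = left_ticks - right_ticks      # positive -> drifting right
    adjust = kp * error
    left_pwm = base_pwm - adjust
    right_pwm = base_pwm + adjust
    # Clamp to a typical 8-bit PWM range
    clamp = lambda v: max(0, min(255, v))
    return clamp(left_pwm), clamp(right_pwm)

left, right = straight_line_correction(102, 100, base_pwm=150)
```

With equal tick counts the outputs stay at the base duty cycle; any imbalance nudges the wheels back toward matched speeds.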
The Arduino is directly responsible for interfacing with the robot's sensors and acting on commands received over the serial link from the Raspberry Pi. Sensor data is formatted and periodically dispatched as an event to the Raspberry Pi for processing.
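On the Pi side, consuming such a periodic event might look roughly like the sketch below. The field names and line format ("US" for ultrasonic, "MAG" for magnetometer, "ENC" for encoders) are assumptions for illustration only; the project's actual wire protocol is not specified here. In practice the line would be read from the serial port (for example with pyserial's readline).

```python
# Sketch of Pi-side parsing for a hypothetical newline-terminated
# sensor event such as "US:32,45,30;MAG:271;ENC:1042,1038".

def parse_sensor_event(line):
    """Parse one sensor event line into a dict of integer lists."""
    event = {}
    for field in line.strip().split(";"):
        key, _, values = field.partition(":")
        event[key] = [int(v) for v in values.split(",")]
    return event

event = parse_sensor_event("US:32,45,30;MAG:271;ENC:1042,1038\n")
# event["US"]  -> readings from the three ultrasonic sensors
# event["ENC"] -> tick counts from the two wheel encoders
```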
The robot's firmware is divided into subsystems, with the core system responsible for managing communication between the Raspberry Pi and the Arduino: command buffering, command formatting and error handling, for example. All code outside the core system is dynamically loaded and unloaded through a plugin system, which encourages modularity and allows the robot's behaviour to be altered in real time.
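In Python, dynamic load/unload of this kind can be done with importlib. The sketch below is a minimal illustration under assumed conventions (each plugin is a module exposing a `Plugin` class with `start`/`stop` methods); it is not the project's actual plugin API.

```python
# Minimal plugin load/unload sketch using importlib. The Plugin class
# contract (start/stop methods) is an assumption for illustration.
import importlib.util
import pathlib
import sys
import tempfile

PLUGIN_SRC = '''
class Plugin:
    name = "demo"
    def start(self):
        return "started"
    def stop(self):
        return "stopped"
'''

def load_plugin(path):
    """Load a plugin module from a file path and instantiate its Plugin."""
    spec = importlib.util.spec_from_file_location(path.stem, path)
    module = importlib.util.module_from_spec(spec)
    sys.modules[spec.name] = module
    spec.loader.exec_module(module)
    return module.Plugin()

def unload_plugin(name):
    """Drop the module reference so the plugin can be reloaded later."""
    sys.modules.pop(name, None)

# Demonstrate the round trip with a throwaway plugin file
plugin_file = pathlib.Path(tempfile.mkdtemp()) / "demo_plugin.py"
plugin_file.write_text(PLUGIN_SRC)
plugin = load_plugin(plugin_file)
status = plugin.start()
unload_plugin("demo_plugin")
```

Reloading the file after an edit gives the real-time behaviour change described above: the old module is dropped and a fresh instance takes its place.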
A stand-out feature is the Lane Detection plugin, which uses the Raspberry Pi camera and OpenCV to capture video frames and extract lane data from the test environment. The robot estimates its position within the lane and automatically adjusts its steering to keep the vehicle centred in the lane as it moves forward through the course.
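Once lane-boundary segments have been extracted from a frame (for example with `cv2.Canny` followed by `cv2.HoughLinesP`), the position estimate reduces to comparing the lane centre with the image centre. The sketch below shows that final step only; the function names, sign convention and gain are illustrative assumptions, not the plugin's actual implementation.

```python
# Hypothetical lane-centring sketch. Inputs are the x positions where
# the left and right lane boundaries cross the bottom of the frame, as
# would be derived from detected line segments.

def lane_offset(left_x, right_x, frame_width):
    """Signed pixel offset of the vehicle from the lane centre.

    Positive means the vehicle sits right of centre and should steer left.
    """
    lane_centre = (left_x + right_x) / 2
    return frame_width / 2 - lane_centre

def steering_correction(offset_px, frame_width, max_turn=1.0):
    """Map the pixel offset to a normalised steering value in [-1, 1]."""
    return max(-max_turn, min(max_turn, 2 * offset_px / frame_width))

# 640 px wide frame; lane boundaries found at x = 180 and x = 420
offset = lane_offset(180, 420, 640)   # lane centre 300 vs image centre 320
turn = steering_correction(offset, 640)
```

Applied once per captured frame, this closes the loop: each new frame yields a fresh offset, and the steering nudges the vehicle back toward the lane centre as it moves forward.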