Rebooting the Mio Robot project

I have revived the Mio home robot project after shelving it more than two years ago. When I stopped, the robot could be wirelessly controlled (via ROS teleop_twist_joy) and I had SLAM navigation running in the Gazebo simulator, but I had not implemented it on the robot itself.

I recently came across Philip Schmidt’s “Visual navigation for wheeled autonomous robots” talk and was impressed by how Intel managed to cram a robust SLAM solution into a pair of low-power sensors: the RealSense T265 tracking camera and a D400-family depth camera. Needless to say, I took the plunge and bought the bundled pair.
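
What makes the T265 attractive is that the visual-inertial SLAM runs on the camera itself, so the host only reads out ready-made pose estimates. The sketch below is a minimal, illustrative librealsense2 loop for doing exactly that; it is not Mio's actual code, and details like the fixed frame count and absent error handling are placeholders.

```cpp
// Minimal sketch: read on-device 6DOF pose estimates from a RealSense T265
// using librealsense2. Illustrative only; loop count and output are placeholders.
#include <librealsense2/rs.hpp>
#include <cstdio>

int main()
{
    rs2::pipeline pipe;
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_POSE, RS2_FORMAT_6DOF);  // T265 computes the pose itself
    pipe.start(cfg);

    for (int i = 0; i < 100; ++i)
    {
        rs2::frameset frames = pipe.wait_for_frames();
        rs2::frame f = frames.first_or_default(RS2_STREAM_POSE);
        if (!f) continue;
        rs2_pose pose = f.as<rs2::pose_frame>().get_pose_data();
        std::printf("x=%.3f y=%.3f z=%.3f (tracker confidence %u)\n",
                    pose.translation.x, pose.translation.y, pose.translation.z,
                    pose.tracker_confidence);
    }
    return 0;
}
```

In a ROS setup the realsense2_camera node wraps this same stream and publishes it as odometry, so the host-side workload stays small.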

My goal is to replicate the Intel demo, but on the Arlo robot platform that Mio was built on. I want my robot to roam around the house autonomously while building and continuously updating a map of its environment. This is more sophisticated than the common approach of first building a static map in the form of an occupancy grid, editing the map to remove non-stationary objects, and then using the cleaned-up map as the robot's input for navigation.

While working on this project, I will also make these enhancements:

  1. upgrade to ROS 2
  2. integrate my HB-25 motor controller code into the (Git-forked) ros_arduino_bridge package and open-source it (see the sketch after this list)

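For the second item, the plan hinges on the fact that the Parallax HB-25 is driven with servo-style pulses: roughly 1.0 ms for full reverse, 1.5 ms for stop, and 2.0 ms for full forward. The sketch below shows how such a backend could plug into the functions that ros_arduino_bridge's Arduino firmware expects from a motor driver; the pin assignments and the ±255 command range are my assumptions, not the final implementation.

```cpp
/* Hypothetical HB-25 backend for ros_arduino_bridge's motor_driver interface.
 * Pin numbers and the +/-255 command range are assumptions for illustration.
 * The Parallax HB-25 takes servo-style pulses:
 *   ~1000 us = full reverse, 1500 us = stop, ~2000 us = full forward.
 */
#include <Arduino.h>
#include <Servo.h>

const int LEFT_HB25_PIN  = 9;    // assumed wiring
const int RIGHT_HB25_PIN = 10;   // assumed wiring
const int MAX_CMD        = 255;  // assumed command range from the bridge's PID loop

Servo leftHB25, rightHB25;

// Map a signed command (-255..255) to an HB-25 pulse width in microseconds.
static int cmdToPulse(int cmd) {
  cmd = constrain(cmd, -MAX_CMD, MAX_CMD);
  return map(cmd, -MAX_CMD, MAX_CMD, 1000, 2000);
}

void initMotorController() {
  leftHB25.attach(LEFT_HB25_PIN);
  rightHB25.attach(RIGHT_HB25_PIN);
  leftHB25.writeMicroseconds(1500);   // neutral = motors stopped
  rightHB25.writeMicroseconds(1500);
}

void setMotorSpeed(int i, int spd) {
  if (i == 0) leftHB25.writeMicroseconds(cmdToPulse(spd));
  else        rightHB25.writeMicroseconds(cmdToPulse(spd));
}

void setMotorSpeeds(int leftSpeed, int rightSpeed) {
  setMotorSpeed(0, leftSpeed);
  setMotorSpeed(1, rightSpeed);
}
```

Writing the neutral pulse at startup keeps the wheels still until the bridge's control loop issues its first command; the Servo library's continuous pulse refresh is harmless because the HB-25 simply holds the last command it received.
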
By modernizing the sensors and the ROS software stack, Mio can take advantage of recent technology innovations, and moving to ROS 2 makes the project more future-proof.
