Simultaneous Localization and Mapping (SLAM) for “Mac” in Preparation for Advanced RFID Research
In collaboration with the Auburn University RFID Lab, the Construction Automation, Robotics, and Visualization (CARV) Lab used vision hardware and software to advance the localization and mapping of “Mac” – BSCI’s robotic dog – in preparation for ongoing advanced RFID-based research. This research deployed a Robot Operating System (ROS) based Simultaneous Localization and Mapping (SLAM) algorithm on a Boston Dynamics Spot to enable real-time robot localization. Boston Dynamics robots run on an open-source platform, allowing developers to use the Application Programming Interface (API) and Software Development Kit (SDK) to build functionality onto the base robot. This system is highly effective but can be labor intensive for complex tasks. One such task is real-time localization: having the robot create a map using onboard and/or payload sensors and localize itself within that map, displayed on a graphical user interface. This functionality, although complicated to develop from scratch, is readily available in ROS. However, ROS is not native to the Boston Dynamics Spot and requires a “wrapper.” Over the year, the research team functionally tested three sensing systems: an Intel RealSense D435i camera, a Velodyne VLP-16 LiDAR, and the robot’s onboard cameras. This required two versions of ROS (Noetic and Humble), two separate Spot wrappers (spot_ros and spot_ros2), and two versions of Ubuntu (20.04 and 22.04). The research team successfully deployed ROS-based SLAM.
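To make the mapping half of SLAM concrete, the toy sketch below (not the CARV Lab's implementation, and independent of ROS) shows the core idea: projecting range measurements taken from a known robot pose into an occupancy grid, where the endpoint of each beam marks an obstacle. The function name, grid representation, and parameters are all illustrative assumptions; in the deployed system this bookkeeping is handled by a ROS SLAM package fed by the LiDAR or camera driver through the Spot wrapper.

```python
import math

def mark_scan(grid, pose, ranges, angle_step, max_range):
    """Fuse one simulated 2D range scan into an occupancy grid.

    Illustrative toy only -- real SLAM also estimates the pose itself.
    grid       : row-major list of lists, 0 = free, 1 = occupied
    pose       : (x, y, theta) robot position in cells and heading in radians
    ranges     : measured distance along each beam, in cells
    angle_step : angular spacing between successive beams, in radians
    max_range  : readings at or above this value count as "no obstacle hit"
    """
    x, y, theta = pose
    for i, r in enumerate(ranges):
        if r >= max_range:                  # beam saw nothing -- skip it
            continue
        beam = theta + i * angle_step       # absolute angle of this beam
        cx = int(round(x + r * math.cos(beam)))
        cy = int(round(y + r * math.sin(beam)))
        if 0 <= cy < len(grid) and 0 <= cx < len(grid[0]):
            grid[cy][cx] = 1                # beam endpoint = obstacle cell
    return grid

# Usage: a 5x5 empty map, robot at the center facing +x, three beams
# 90 degrees apart; the third beam reads max range (no obstacle).
grid = [[0] * 5 for _ in range(5)]
mark_scan(grid, pose=(2, 2, 0.0), ranges=[2, 2, 10],
          angle_step=math.pi / 2, max_range=10)
```

After the call, exactly two cells are marked: one two cells ahead of the robot and one two cells to its left, while the saturated beam leaves no mark. A real grid would also carve out the free space along each beam (e.g. with Bresenham ray tracing) rather than only marking endpoints.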