A real-time sparse-mapping SLAM implementation of ORB-SLAM3 on a Raspberry Pi 5, using ROS2 (Humble) with an external USB camera.
This project provides:
- Real-time monocular SLAM on Raspberry Pi 5 (ROS2)
- Dataset-based evaluation (TUM, EuRoC MAV, KITTI)
- Camera calibration tools
- Automated dataset & thermal evaluation scripts
- Scripts for converting datasets to ROS2 bags and running native ORB-SLAM3 examples
SLAM (Simultaneous Localisation and Mapping) enables a robot to construct a three-dimensional map of its environment while localising its own pose (position and orientation) within it. ORB-SLAM3 is one of the most capable SLAM systems available, offering real-time, multi-map, and multi-mode support even on low-powered devices such as the Raspberry Pi. I have implemented ORB-SLAM3 in ROS2 (Humble) on the Pi 5 and built a modular system that supports real-time monocular SLAM, RGB-D mapping, and dataset-based evaluation (TUM, EuRoC, and KITTI) with automated scripts and camera calibration tools. The system publishes 3D sparse maps, integrates with RViz2 for visualization, and supports live or recorded input via ROS2 topics or the native ORB-SLAM3 executables.
This repository accompanies the bachelor's minor project at Manipal University Jaipur.
- Repo (fork/branch used in experiments): https://github.com/eshan-sud/ORB_SLAM3/tree/978d82ef265e184a5f787d71ccdcd53df5adeba6
- Scripts & automation live in ~/ros2_test/scripts/
- Thermal evaluation pipeline: ~/ros2_test/scripts/evaluation/
- Camera calibration
- Real-time Monocular SLAM (ROS2)
- Dataset-based evaluation scripts (TUM, EuRoC, KITTI)
- RViz2 visualization support
- Modular bash automation & helpers
- Hardware: Raspberry Pi 5 Model B Rev 1.0 (with fan + heatsink)
- OS: Debian GNU/Linux 12 (bookworm)
- Modes tested: Monocular (real-time); other modes evaluated offline on datasets
- Swap: experiments used a 4 GB swap file to avoid memory crashes for heavy modes
- Raspberry Pi 5 Model B Rev 1.0
- Raspberry Pi Fan & Heatsink
- Raspberry Pi Power Adaptor (minimum 27 Watts)
- 256 GB Micro SD Card
- External USB Camera
- Raspberry Pi OS (Debian 12, bookworm)
- Python (3.11.2)
- CMake (3.25.1)
- OpenCV (4.6.0)
- OpenCV Contrib
- Vision_opencv
- ROS2 (Humble Hawksbill)
- Eigen3 (3.3.7)
- g2o
- Sophus (v1.1.0)
- DBoW2 (v1.1)
- Pangolin (4.5.0)
- OpenGL (3.1)
- Mesa (23.2.1-1~bpo12+rpt3)
- ORB-SLAM3
- Octomap
- Image Common
- Message Filters
- Rclcpp
- Octomap_ros
- Geometry2
- Common_interfaces
- Steps to set up & execute this project
- Download Raspberry Pi Imager from the official website.
- Connect your microSD card to your laptop/PC.
- Run Raspberry Pi Imager:
- Select the appropriate Raspberry Pi device (e.g., Raspberry Pi 5).
- Select the required operating system (e.g., Raspberry Pi OS (64-bit)).
- Select the appropriate storage (your microSD card).
- Press Ctrl + Shift + X to open Advanced Options:
- Set a hostname (default: raspberrypi).
- Enable SSH and set a username and password (remember these).
- Configure Wi-Fi settings (SSID, password, and country code).
- Set the correct locale & keyboard layout. To check your layout, press Windows + Spacebar or check your system settings.
- Click Save.
- Click Write to write the OS to the microSD card. The Imager will automatically eject the card when done.
- Insert the microSD card into the Raspberry Pi and power it on
- Open Command Prompt (Windows) or Terminal (Mac/Linux).
- Run the following command:
ssh <username>@<hostname>
- Example:
ssh eshan-sud@raspberrypi
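If the bare hostname does not resolve on your network, the mDNS name usually works; this is a general SSH tip rather than something project-specific:
```bash
# Try the mDNS (.local) name if the plain hostname is not found
ssh <username>@<hostname>.local
```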
- Increase the swap space so the Raspberry Pi 5 does not run out of memory during heavy builds and SLAM runs:
sudo nano /etc/dphys-swapfile
- Change CONF_SWAPSIZE to 4096, then restart the swap service:
sudo systemctl restart dphys-swapfile
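If you prefer a non-interactive edit, the same change can be scripted and then verified; a minimal sketch, assuming the stock /etc/dphys-swapfile layout with a CONF_SWAPSIZE key:
```bash
# Non-interactive alternative to editing the file in nano
sudo sed -i 's/^CONF_SWAPSIZE=.*/CONF_SWAPSIZE=4096/' /etc/dphys-swapfile
sudo systemctl restart dphys-swapfile
# Confirm the 4 GB swap is active
swapon --show
free -h
```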
cd ./scripts/
chmod +x ./setup/setup.sh
./setup/setup.sh
- Use this checkerboard for calibration: checkerboard-patterns
- To calibrate your camera and generate an ORB-SLAM3-compatible .yaml file:
./start_camera_calibration.sh
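Before calibrating, it can help to confirm the USB camera is detected and, afterwards, that the generated settings file contains the intrinsics ORB-SLAM3 expects. The commands below are a hedged sketch: v4l-utils is assumed to be installed, and the YAML path shown is the one used later in this README.
```bash
# Confirm the USB camera is detected (requires v4l-utils)
v4l2-ctl --list-devices
# After calibration, the generated YAML should contain the camera intrinsics and distortion coefficients
grep -E 'fx|fy|cx|cy|k1|k2|p1|p2' ~/ros2_test/scripts/common/my_camera.yaml
```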
- TUM RGB-D
cd ~/ros2_test/scripts/tum/
./download_tum_sequences.sh
- EuRoC MAV
cd ~/ros2_test/scripts/euroc/
./download_euroc_sequences.sh
- KITTI Visual Odometry
cd ~/ros2_test/scripts/kitti/
./download_kitti_sequences.sh
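These datasets are large (the KITTI odometry set in particular runs to tens of gigabytes), so check free space on the SD card before downloading; the dataset directory below matches the paths used later in this README:
```bash
# Check free space before downloading; KITTI odometry alone needs tens of GB
df -h ~
# After a download completes, the sequences should appear under the datasets directory, e.g.:
ls ~/ros2_test/datasets/TUM
```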
- Monocular
ros2 bag record /camera/image_raw
ros2 run image_tools showimage --ros-args -r image:=/camera/image_raw
- RGB-D
ros2 bag record /camera/color/image_raw /camera/depth/image_raw
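Before starting a recording, confirm the camera topics are actually publishing; these are standard ROS2 introspection commands using the topic names from the recording examples above:
```bash
# Verify the image topic exists and check its publish rate before recording
ros2 topic list | grep image_raw
ros2 topic hz /camera/image_raw
```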
- Convert any TUM dataset to a ROS2 bag:
cd ~/ros2_test/scripts/tum
python3 convert_tum_to_ros2_bag.py <tum_sequence_folder_path> <output_folder>
- Convert any EuRoC dataset to a ROS2 bag:
cd ~/ros2_test/scripts/euroc
python3 convert_euroc_to_ros2_bag.py <euroc_sequence_folder_path> <output_folder>
- Example Usage:
cd ~/ros2_test/scripts/tum
python3 convert_tum_to_ros2_bag.py \
/home/{your_username}/ros2_test/datasets/TUM/rgbd_dataset_freiburg1_desk \
/home/{your_username}/tum_ros2_bag
ros2 bag info ~/ros2_test/{path_to_rosbag}/{rosbag_name}
ros2 bag play ~/ros2_test/{path_to_rosbag}/{rosbag_name}
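ros2 bag play also accepts standard playback flags that are useful when debugging SLAM, for example slowing playback down or looping a short sequence:
```bash
# Play the bag at half speed and loop it; useful while tuning SLAM parameters
ros2 bag play ~/ros2_test/{path_to_rosbag}/{rosbag_name} --rate 0.5 --loop
```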
- Contains pre-written ROS2-based scripts & native ORB-SLAM3 execution scripts
- These will run only after all libraries, packages, & datasets (where used) have been set up successfully
- Check for updates:
sudo apt update && sudo apt upgrade -y
- Before running anything, execute these exports in each terminal where you want to run the 3D sparse mapping:
export DISPLAY=:0
export LIBGL_ALWAYS_INDIRECT=1
export MESA_GL_VERSION_OVERRIDE=3.3
export MESA_GLSL_VERSION_OVERRIDE=330
export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libGL.so
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:~/ros2_test/src/ORB_SLAM3/lib
export LD_LIBRARY_PATH=~/ros2_test/src/ORB_SLAM3/Thirdparty/DBoW2/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=~/ros2_test/src/ORB_SLAM3/Thirdparty/g2o/lib:$LD_LIBRARY_PATH
source ~/ros2_test/install/local_setup.bash
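A quick sanity check that the exported library paths resolve; the library file name below is the default produced by the ORB-SLAM3 build, so adjust it if your build differs:
```bash
# Any "not found" line means one of the exports above is missing or wrong
ldd ~/ros2_test/src/ORB_SLAM3/lib/libORB_SLAM3.so | grep -i "not found" || echo "All shared libraries resolved"
```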
- Real-time SLAM (Monocular)
cd ~/ros2_test/
./start_real_time_slam.sh
- In this case, first play the rosbag, and then run the ROS2 dataset execution in a separate terminal (see the sketch after this list):
ros2 bag play ~/ros2_test/datasets/{path_to_rosbag}
- TUM - Monocular:
./start_dataset_slam.sh tum_mono recorded
- TUM - RGB-D:
./start_dataset_slam.sh tum_rgbd recorded
- EuRoC - Monocular:
./start_dataset_slam.sh euroc_mono recorded
- Custom ROS2 Bag (RGB-D):
./start_dataset_slam.sh custom_rgbd recorded
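As referenced above, dataset runs use two terminals: one plays the bag, the other runs the wrapper script. A minimal sketch of that workflow, using the TUM monocular mode as the example (the bag path is a placeholder):
```bash
# Terminal 1: play the recorded dataset bag
ros2 bag play ~/ros2_test/datasets/{path_to_rosbag}

# Terminal 2: run the dataset SLAM script on the recorded input
cd ~/ros2_test/
./start_dataset_slam.sh tum_mono recorded
```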
- TUM RGB-D - Monocular
- Usage:
./mono_tum path_to_vocabulary path_to_settings path_to_sequence [use_viewer=true|false]
- Example Usage (With Viewer):
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Monocular/mono_tum \
  ./Vocabulary/ORBvoc.txt \
  ./Examples/Monocular/TUM1.yaml \
  ~/ros2_test/datasets/TUM/rgbd_dataset_freiburg1_desk \
  true
- TUM RGB-D - RGB-D
- Usage:
./rgbd_tum path_to_vocabulary path_to_settings path_to_sequence path_to_association [use_viewer=true|false]
- Example Usage (With Viewer):
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/RGB-D/rgbd_tum \
  ./Vocabulary/ORBvoc.txt \
  ./Examples/RGB-D/TUM1.yaml \
  ~/ros2_test/datasets/TUM/rgbd_dataset_freiburg1_desk \
  ~/ros2_test/datasets/TUM/rgbd_dataset_freiburg1_desk/associate.txt \
  true
- EuRoC MAV - Monocular
- Usage:
./mono_euroc path_to_vocabulary path_to_settings path_to_sequence_folder path_to_times_file [bUseViewer=true|false]
- Example Usage (With Viewer):
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Monocular/mono_euroc \
  ./Vocabulary/ORBvoc.txt \
  ./Examples/Monocular/EuRoC.yaml \
  ~/ros2_test/datasets/EuRoC/MH01 \
  ./Examples/Monocular/EuRoC_TimeStamps/MH01.txt \
  true
- EuRoC MAV - Stereo
- Usage:
./stereo_euroc path_to_vocabulary path_to_settings path_to_sequence_folder_1 path_to_times_file_1 (path_to_image_folder_2 path_to_times_file_2 ... path_to_image_folder_N path_to_times_file_N) (trajectory_file_name) [bUseViewer(true|false)]
- Example Usage (With Viewer):
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Stereo/stereo_euroc \
  ./Vocabulary/ORBvoc.txt \
  ./Examples/Stereo/EuRoC.yaml \
  ~/ros2_test/datasets/EuRoC/MH01 \
  ~/ros2_test/datasets/EuRoC/MH01/mav0/MH01.txt \
  true
- EuRoC MAV - Mono-Inertial
- Usage:
./mono_inertial_euroc path_to_vocabulary path_to_settings path_to_sequence_folder_1 path_to_times_file_1 (path_to_image_folder_2 path_to_times_file_2 ... path_to_image_folder_N path_to_times_file_N) [bUseViewer(true|false)] [output_file]
- Example Usage (With Viewer):
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Monocular-Inertial/mono_inertial_euroc \
  ./Vocabulary/ORBvoc.txt \
  ./Examples/Monocular-Inertial/EuRoC.yaml \
  ~/ros2_test/datasets/EuRoC/MH01 \
  ./Examples/Monocular-Inertial/EuRoC_TimeStamps/MH01.txt \
  true
- EuRoC MAV - Stereo-Inertial
- Usage:
./stereo_inertial_euroc path_to_vocabulary path_to_settings path_to_sequence_folder_1 path_to_times_file_1 (path_to_image_folder_2 path_to_times_file_2 ... path_to_image_folder_N path_to_times_file_N) [bUseViewer(true|false)] [output_file]
- Example Usage (With Viewer):
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Stereo-Inertial/stereo_inertial_euroc \
  ./Vocabulary/ORBvoc.txt \
  ./Examples/Stereo-Inertial/EuRoC.yaml \
  ~/ros2_test/datasets/EuRoC/MH01 \
  ./Examples/Stereo-Inertial/EuRoC_TimeStamps/MH01.txt \
  true
- KITTI Visual Odometry - Monocular
- Usage:
./mono_kitti path_to_vocabulary path_to_settings path_to_sequence [bUseViewer(true|false)]
- Example Usage (With Viewer):
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Monocular/mono_kitti \
  ./Vocabulary/ORBvoc.txt \
  ./Examples/Monocular/KITTI00-02.yaml \
  ~/ros2_test/datasets/KITTI/dataset/sequences/00 \
  true
- KITTI Visual Odometry - Stereo
- Usage:
./stereo_kitti path_to_vocabulary path_to_settings path_to_sequence [bUseViewer(true|false)]
- Example Usage (With Viewer):
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Stereo/stereo_kitti \
  ./Vocabulary/ORBvoc.txt \
  ./Examples/Stereo/KITTI00-02.yaml \
  ~/ros2_test/datasets/KITTI/dataset/sequences/00 \
  true
- Monocular Node
ros2 run ros2_orbslam3_wrapper monocular_node --ros-args -p input_mode:=live
- Camera Publisher with Calibration YAML
ros2 run ros2_orbslam3_wrapper camera_publisher_node \
  /home/{your_username}/ros2_test/scripts/common/my_camera.yaml
- RViz Launcher (from SSH, not VNC)
[!NOTE] RViz may not launch under VNC. For proper OpenGL rendering, use an SSH terminal session instead. Also, in RViz add a display for the map topic and set it to ORB_SLAM3/map.
ros2 run ros2_orbslam3_wrapper rviz_launcher_node
- Monocular Node
ros2 run ros2_orbslam3_wrapper monocular_node --ros-args -p input_mode:=recorded
- RGB-D Node
ros2 run ros2_orbslam3_wrapper rgbd_node --ros-args -p input_mode:=recorded
- Camera Publisher with Calibration YAML
ros2 run ros2_orbslam3_wrapper camera_publisher_node \
  /home/{your_username}/ros2_test/scripts/common/my_camera.yaml
- RViz Launcher (from SSH, not VNC)
[!NOTE] RViz may not launch under VNC. For proper OpenGL rendering, use an SSH terminal session instead. Also, in RViz add a display for the map topic and set it to ORB_SLAM3/map.
ros2 run ros2_orbslam3_wrapper rviz_launcher_node
- Play Rosbag
ros2 bag play ~/ros2_test/{path_to_rosbag}/{rosbag_name}
[!IMPORTANT] You should now see the sparse map being created in the Pangolin viewer and in the RViz viewer as well.
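To confirm data is actually reaching RViz, you can inspect the map topic from another terminal; the topic name below assumes the ORB_SLAM3/map display configured in the notes above:
```bash
# Check that the sparse-map topic exists and is publishing
ros2 topic list | grep -i map
ros2 topic hz /ORB_SLAM3/map
```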
From project root:
cd ~/ros2_test/
./start_dataset_slam.sh tum_rgbd
# supported args:
# tum_rgbd, tum_mono, euroc_mono, kitti_mono, custom_rgbd
Thermal evaluation (pipeline): To run the dedicated thermal profiling shown in the paper:
cd ~/ros2_test/scripts/evaluation
chmod +x evaluate_thermal_conditions.sh
./evaluate_thermal_conditions.sh
evaluate_thermal_conditions.sh performs:
- runs ORB-SLAM3 deterministically over selected sequences,
- logs CPU temperature (vcgencmd measure_temp), vcgencmd get_throttled, CPU frequency and load, RAM usage, and per-process CPU usage at 1 Hz,
- writes CSV logs for each run to ~/ros2_test/results////thermal_log.csv
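For reference, a minimal sketch of a 1 Hz logging loop that records the same kinds of metrics; the column layout and output path here are illustrative, not the exact format written by evaluate_thermal_conditions.sh:
```bash
#!/usr/bin/env bash
# Illustrative 1 Hz thermal/resource logger (not the project's actual script)
LOG="$HOME/thermal_log.csv"   # hypothetical output path
echo "timestamp,temp_c,throttled,cpu_freq_khz,load_1min,ram_used_mb" > "$LOG"
while true; do
  ts=$(date +%s)
  temp=$(vcgencmd measure_temp | grep -oE '[0-9]+\.[0-9]+')      # CPU temperature in C
  thr=$(vcgencmd get_throttled | cut -d= -f2)                    # throttling flags, e.g. 0x0
  freq=$(cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq)
  load=$(cut -d' ' -f1 /proc/loadavg)                            # 1-minute load average
  ram=$(free -m | awk '/^Mem:/ {print $3}')                      # used RAM in MB
  echo "$ts,$temp,$thr,$freq,$load,$ram" >> "$LOG"
  sleep 1
done
```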
- OpenCV4
- opencv-contrib (extra OpenCV modules)
- vision-opencv (ROS2-OpenCV bridge)
- Pangolin
- ORB-SLAM3 [Forked]
- ROS2
- ROS2 ORB-SLAM3 Wrapper [Referenced from]
- rviz
- image-common
- message-filters
- rclcpp
- octomap
- octomap_ros
- geometry2
- common_interfaces