FusionSLAM brings together SuperPoint, SuperGlue, neural depth estimation, and Instant-NGP to improve the accuracy and performance of monocular SLAM, combining mapping, localization, and dense reconstruction in a single-camera setup.
- NVIDIA Driver (Official Download Link)
- CUDA Toolkit (Official Link)
- ZED SDK (Official Guide)
- OpenCV CUDA (GitHub Guide)
- ROS 2 Humble (Official Link)
- Miniconda (Official Link)
- ZED ROS 2 Wrapper (Official GitHub Link)
- RTAB-Map (Official GitHub Link)
- RTAB-Map ROS 2 (Official GitHub Link)
- PyTorch (Official Link)
- Instant-ngp (Official GitHub Link)
- SuperPoint (Official GitHub Link)
- SuperGlue (Official GitHub Link)
- Nlohmann-JSON (Official GitHub Link)
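Before building anything, it is worth confirming that CUDA is visible to both PyTorch and the CUDA-enabled OpenCV build. The snippet below is a minimal sanity check, not part of the repository; it only assumes working `torch` and `cv2` imports.

```python
import torch
import cv2

# PyTorch should report a usable CUDA device and the CUDA version it was built against.
print("PyTorch CUDA available:", torch.cuda.is_available())
print("PyTorch built with CUDA:", torch.version.cuda)
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))

# An OpenCV build compiled with CUDA reports at least one device here; a CPU-only build reports 0.
print("OpenCV CUDA devices:", cv2.cuda.getCudaEnabledDeviceCount())
```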
- Install all non-ROS 2 libraries
- Clone all ROS 2 packages into the workspace
- Clone this repository into the ROS 2 workspace
- Build the workspace
colcon build --symlink-install --cmake-args -DRTABMAP_SYNC_MULTI_RGBD=ON -DRTABMAP_SYNC_USER_DATA=ON -DPYTHON_EXECUTABLE=/usr/bin/python3 -DCMAKE_BUILD_TYPE=Release --parallel-workers $(nproc) --executor sequential
- Run `source ~/.bashrc` or source the ROS 2 workspace
- Run `python trace.py` after changing the path of the SuperPoint weights; this will generate a traced model compatible with your version of PyTorch (a sketch of the tracing step is shown below)
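For reference, the tracing step conceptually boils down to exporting SuperPoint to TorchScript. This is only an illustrative sketch: the class name, weight file, and input size are assumptions based on the SuperPointPretrainedNetwork repository, so defer to the repository's `trace.py` for the actual details.

```python
import torch
from demo_superpoint import SuperPointNet  # model definition from SuperPointPretrainedNetwork (assumed import)

# Load the pretrained weights (placeholder path; point it at your copy of superpoint_v1.pth).
net = SuperPointNet()
net.load_state_dict(torch.load("superpoint_v1.pth", map_location="cpu"))
net.eval()

# Trace with a dummy grayscale image so the exported graph matches your installed PyTorch.
example = torch.rand(1, 1, 480, 640)
traced = torch.jit.trace(net, example)
traced.save("superpoint_v1.pt")  # later passed to the launch file as superpoint_model_path
```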
- Add the libtorch path
export LD_LIBRARY_PATH=../miniconda3/envs/rtabmap/lib/python3.10/site-packages/torch/lib${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}
Ensure the path is correct, otherwise RTAB-Map will not work.
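If you are unsure where the libtorch libraries live in your environment, the small helper below (not part of the repository) prints the `torch/lib` directory to use in the export above.

```python
import os
import torch

# Print the libtorch directory bundled with the installed PyTorch;
# use this value when setting LD_LIBRARY_PATH above.
print(os.path.join(os.path.dirname(torch.__file__), "lib"))
```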
- Run SLAM to generate dataset
ros2 launch ngp_ros2 slam.launch.py rgb_topic:=/zed2i/zed_node/rgb/image_rect_color depth_topic:=/zed2i/zed_node/depth/depth_registered camera_info_topic:=/zed2i/zed_node/rgb/camera_info odom_topic:=/zed2i/zed_node/odom imu_topic:=/zed2i/zed_node/imu/data scan_cloud_topic:=/zed2i/zed_node/point_cloud/cloud_registered superpoint_model_path:=../SuperPointPretrainedNetwork/superpoint_v1.pt pydetector_path:=../rtabmap_superpoint.py pymatcher_path:=../rtabmap_superglue.py detection_rate:=1 image_path:=../images/ transform_path:=../transforms.json
Your dataset should be created in `image_path`, along with `transforms.json` in `transform_path`.
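Before rendering, a quick look at the generated `transforms.json` can catch problems early. The snippet below is only a sanity check and assumes the usual instant-ngp NeRF layout (a `frames` list with `file_path` and `transform_matrix` entries); the file generated here may contain additional fields.

```python
import json

# Inspect the generated transforms.json (placeholder path; use your transform_path).
with open("transforms.json") as f:
    transforms = json.load(f)

frames = transforms.get("frames", [])
print("top-level keys:", sorted(k for k in transforms if k != "frames"))
print("number of frames:", len(frames))
for frame in frames[:3]:
    pose = frame.get("transform_matrix", [])
    rows = len(pose)
    cols = len(pose[0]) if pose else 0
    print(frame.get("file_path"), "-> pose", rows, "x", cols)
```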
- Run instant-ngp on the generated dataset
cd /instant-ngp/build
./instant-ngp ../PATH
Give the path to your dataset, where `image_path` and `transforms.json` are located.
- Ensure the ZED ROS 2 Wrapper is set to run in Neural depth mode and the camera resolution is set to HD1080 for the best renders
- This render uses depth supervision; feel free to change RTAB-Map and instant-ngp parameters to generate better renders
- Use the pose graph from RTAB-Map to include loop closures for better renders
- Add segmentation masks using a semantic segmentation network
- Generate renders using multi-camera SLAM