Event-Optimized Stereo-Inertial-LiDAR SLAM
EOS-SLAM is a real-time hybrid SLAM system that fuses stereo cameras, an IMU, LiDAR, and wheel odometry into a single, efficient architecture. Built around a fixed-size fusion window with adaptive, event-driven triggering, it maintains O(1) computational cost per frame while preserving accuracy and mapping robustness.
- ✅ Real-time stereo + IMU + LiDAR fusion
- ✅ O(1) computational cost per frame
- ✅ Event-driven updates (e.g., pose delta, info gain)
- ✅ Hybrid 2D occupancy + 3D voxel map
- ✅ ROS 2 ready & modular C++ codebase
- ✅ Benchmarking support (KITTI, EuRoC, TUM)
- `fusion_core/`: Sensor processing & fusion logic
- `mapping/`: 2D log-odds map + 3D local voxel maps
- `localization/`: EKF / trigger managers
- `semantic/`: Optional MobileNetV3-based semantic layer
- `tools/`: Benchmark logger + RViz plugins
- `docs/`: System design & benchmarking results
EOS-SLAM is tested on:
- KITTI (Stereo + IMU + GPS)
- EuRoC MAV (Stereo + IMU)
- TUM RGB-D (for stereo fallback testing)
Evaluation is performed using:
- evo for ATE/RPE
- Internal logging system for CPU, FPS, and memory
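For trajectory accuracy, a typical evo invocation looks like the following (the trajectory filenames are placeholders; substitute your own ground-truth and estimate files):

```shell
# Absolute trajectory error (ATE) on a KITTI-format trajectory,
# with SE(3) alignment (-a), verbose stats (-v), and a plot:
evo_ape kitti ground_truth.txt eos_slam_estimate.txt -va --plot

# Relative pose error (RPE) on the same pair:
evo_rpe kitti ground_truth.txt eos_slam_estimate.txt -va
```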
This project is licensed under the MIT License - see the LICENSE file for details.