Assistive Navigation
Assistive Navigation System using RealSense, YOLOv8, and ROS2
The Assistive Navigation System is a robotics project designed to help visually impaired individuals navigate safely through indoor and outdoor environments. The system combines Intel RealSense depth cameras with YOLOv8 object detection and ROS2 (Robot Operating System 2) to create a comprehensive navigation assistant. Using RTAB-Map (Real-Time Appearance-Based Mapping) SLAM (Simultaneous Localization and Mapping), the system builds 3D maps of the environment while tracking the user's position in real time. YOLOv8 identifies obstacles, hazards, and objects of interest, while the RealSense camera provides accurate depth measurements for distance estimation. The system delivers audio feedback through bone conduction headphones, warning users of obstacles and guiding them along safe paths. ROS2 provides a modular architecture, making it easy to integrate additional sensors and functionality.
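To make the distance-estimation idea concrete, here is a minimal sketch of how a depth frame and a YOLO-style bounding box could be combined. The function name, box format, and use of a median are illustrative assumptions, not the project's actual code; the median is chosen because RealSense depth frames typically contain zero-valued "holes" where depth is unknown.

```python
import numpy as np

def estimate_distance(depth_m: np.ndarray, box: tuple) -> float:
    """Estimate object distance (metres) as the median depth inside a
    detection box (x1, y1, x2, y2).

    Illustrative sketch: the median is robust to the zero-valued holes
    and speckle outliers common in RealSense depth frames.
    """
    x1, y1, x2, y2 = box
    region = depth_m[y1:y2, x1:x2]
    valid = region[region > 0]  # RealSense reports 0 where depth is unknown
    return float(np.median(valid)) if valid.size else float("inf")

# Synthetic 480x640 depth frame: background at 5 m, one obstacle at 1.2 m
depth = np.full((480, 640), 5.0)
depth[200:300, 250:350] = 1.2
print(estimate_distance(depth, (250, 200, 350, 300)))  # 1.2
```

In a live pipeline this function would be fed the aligned depth frame from the RealSense camera and the bounding boxes produced by YOLOv8 for each detected object.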
1. Depth Sensing: Intel RealSense camera captures RGB-D data (color + depth information).
2. Object Detection: YOLOv8 processes the camera feed in real time to detect obstacles, hazards, and objects.
3. Distance Estimation: Depth data provides accurate distance measurements to detected objects.
4. SLAM: RTAB-Map builds a 3D map of the environment and tracks the user's position.
5. Path Planning: The ROS2 navigation stack plans safe paths around detected obstacles.
6. Audio Feedback: Bone conduction headphones provide spatial audio warnings and guidance.
7. Localization: The system maintains the user's position within the mapped environment.
8. Obstacle Avoidance: Real-time alerts for immediate hazards in the navigation path.
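The audio-feedback step above can be sketched as a small mapping from a detection's image position and distance to a spoken warning. The function name, the left/ahead/right split into image thirds, and the 2-metre warning threshold are all illustrative assumptions, not details taken from the project.

```python
from typing import Optional

def obstacle_alert(cx: int, width: int, distance_m: float,
                   warn_below_m: float = 2.0) -> Optional[str]:
    """Map a detection to a warning phrase, or None if it is far enough away.

    cx is the horizontal centre of the bounding box in pixels; the image
    is split into thirds to decide whether the obstacle is to the left,
    straight ahead, or to the right. Threshold and phrasing are
    illustrative.
    """
    if distance_m >= warn_below_m:
        return None
    third = width / 3
    if cx < third:
        side = "left"
    elif cx < 2 * third:
        side = "ahead"
    else:
        side = "right"
    return f"obstacle {side}, {distance_m:.1f} metres"

print(obstacle_alert(100, 640, 1.4))  # obstacle left, 1.4 metres
print(obstacle_alert(320, 640, 3.0))  # None (beyond warning range)
```

The returned phrase would then be passed to a text-to-speech engine and played through the bone conduction headphones, so nearby obstacles trigger spoken warnings while distant detections stay silent.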