This project involved designing and building a fully autonomous vehicle capable of navigating indoor environments and generating accurate 3D maps using captured images and modern localization algorithms. The system was designed to combine autonomous mapping with the flexibility of manual control — users can seamlessly switch to remote operation using a mobile app with a virtual joystick.
Core System Components
1. Mobile Platform and Motion Control
The vehicle was built on a lightweight mobile chassis. Its motors are controlled by an ESP8266 microcontroller, which communicates with the Raspberry Pi 4B and receives commands from both the onboard autonomous system and the mobile joystick app.
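The write-up does not specify the command link between the Raspberry Pi and the ESP8266; a common low-cost setup is a simple ASCII protocol over a USB-serial connection. The sketch below (Python, using pyserial) illustrates that idea only; the port, baud rate, and `L:<left>,R:<right>` message format are assumptions, not the project's actual protocol.

```python
# Hypothetical sketch: Raspberry Pi -> ESP8266 drive commands over a serial link.
# Requires pyserial; the port name and message format are assumptions.
import serial

PORT = "/dev/ttyUSB0"   # assumed USB-serial device of the ESP8266
BAUD = 115200

def send_drive_command(link: serial.Serial, left: int, right: int) -> None:
    """Send signed PWM duty values (-255..255) for the left/right motors."""
    left = max(-255, min(255, left))
    right = max(-255, min(255, right))
    link.write(f"L:{left},R:{right}\n".encode("ascii"))

if __name__ == "__main__":
    with serial.Serial(PORT, BAUD, timeout=1) as link:
        send_drive_command(link, 120, 120)   # drive forward
        send_drive_command(link, 0, 0)       # stop
```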
2. Raspberry Pi 4B – the Brain of the System
A Raspberry Pi 4B acts as the central processing unit, managing the onboard camera, ultrasonic sensors, SLAM algorithm, and image transmission to the cloud. It also handles decision-making during autonomous navigation and supervises the overall system.
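To make the supervision role concrete, the following schematic loop shows how the Pi might arbitrate between collision safety, autonomous navigation, and image capture for mapping. Every helper name (`robot.read_ultrasonic`, `uploader.queue`, the 20 cm threshold, the 20 Hz rate) is a hypothetical placeholder, not the project's actual API.

```python
# Illustrative supervision loop on the Pi; all helpers are hypothetical placeholders.
import time

MODE_AUTONOMOUS = "autonomous"
MODE_MANUAL = "manual"

def supervision_loop(robot, uploader):
    mode = MODE_AUTONOMOUS
    while True:
        mode = robot.requested_mode(default=mode)    # manual override from the app
        distances = robot.read_ultrasonic()          # distances in cm, one per sensor
        if min(distances) < 20:                      # assumed safety threshold
            robot.stop()                             # collision avoidance has priority
        elif mode == MODE_AUTONOMOUS:
            robot.follow_planned_path()              # SLAM-based navigation step
            uploader.queue(robot.capture_image())    # images later fed to COLMAP
        # in manual mode the joystick app talks to the ESP8266 directly
        time.sleep(0.05)                             # ~20 Hz control loop
```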
3. Ultrasonic Sensors – Obstacle Detection and Collision Avoidance
To avoid collisions during operation, the vehicle is equipped with ultrasonic sensors. These provide real-time distance measurements so the vehicle can detect nearby obstacles and adjust its route on the fly, in both autonomous and manual driving modes.
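As an example of how such readings could be obtained, the sketch below measures distance with an HC-SR04-style sensor through the Pi's GPIO pins; the sensor model and pin numbers are assumptions for illustration, not a statement of the project's wiring.

```python
# Minimal distance reading for an HC-SR04-style ultrasonic sensor on the Pi.
# Sensor model and BCM pin numbers are assumptions for illustration.
import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24          # assumed BCM pin numbers

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def read_distance_cm(timeout=0.03):
    """Return the measured distance in cm, or None if no echo arrives in time."""
    GPIO.output(TRIG, True)          # 10 microsecond trigger pulse
    time.sleep(0.00001)
    GPIO.output(TRIG, False)

    deadline = time.time() + timeout
    pulse_start = time.time()
    while GPIO.input(ECHO) == 0:     # wait for the echo pulse to start
        pulse_start = time.time()
        if pulse_start > deadline:
            return None
    pulse_end = time.time()
    while GPIO.input(ECHO) == 1:     # measure how long the echo pin stays high
        pulse_end = time.time()
        if pulse_end > deadline:
            return None
    return (pulse_end - pulse_start) * 34300 / 2   # speed of sound ~343 m/s

if __name__ == "__main__":
    d = read_distance_cm()
    print("no echo" if d is None else f"distance: {d:.1f} cm")
```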
4. Camera + 3D Mapping in the Cloud via COLMAP
As the vehicle moves through an environment, it captures a series of images using an onboard camera. These images are automatically uploaded to the cloud, where COLMAP (a Structure-from-Motion + Multi-View Stereo tool) processes them into a detailed 3D reconstruction of the space — all without the need for expensive LiDAR systems.
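The write-up does not detail the cloud-side pipeline, but COLMAP's standard command-line interface makes the sparse reconstruction step straightforward to script. The sketch below assumes the uploaded images land in a known `images` folder and that the `colmap` binary is installed on the cloud host; the directory layout and the choice of the sequential matcher (a reasonable fit for images captured in order along the vehicle's path) are assumptions.

```python
# Sketch of the cloud-side COLMAP sparse reconstruction; paths are assumptions.
import subprocess
from pathlib import Path

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def reconstruct(workspace="reconstruction"):
    ws = Path(workspace)
    db, images, sparse = ws / "database.db", ws / "images", ws / "sparse"
    sparse.mkdir(parents=True, exist_ok=True)

    # 1. Detect features in every uploaded image (creates the database).
    run(["colmap", "feature_extractor",
         "--database_path", str(db), "--image_path", str(images)])
    # 2. Match features between consecutive images along the vehicle's path.
    run(["colmap", "sequential_matcher", "--database_path", str(db)])
    # 3. Incremental Structure-from-Motion: camera poses + sparse 3D point cloud.
    run(["colmap", "mapper",
         "--database_path", str(db), "--image_path", str(images),
         "--output_path", str(sparse)])

if __name__ == "__main__":
    reconstruct()
```

Dense reconstruction (image_undistorter, patch_match_stereo, stereo_fusion) can be chained onto the same workspace if a full point cloud is needed rather than the sparse model.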
5. Manual Control via Mobile App with Joystick
A key feature of the system is the ability to switch to manual mode, allowing the user to remotely control the vehicle using a mobile app with a virtual joystick. Commands are sent via Wi-Fi directly to the ESP8266, enabling precise maneuvering in tight spaces or during testing phases.
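For bench testing without the phone, a small script can stand in for the app. The example below is a hypothetical client: the UDP transport, the ESP8266's address, and the `x,y` payload format are assumptions rather than the app's documented protocol.

```python
# Hypothetical test client mimicking the mobile joystick app over Wi-Fi.
# The UDP port, ESP8266 address, and "x,y" payload format are assumptions.
import socket

ESP_ADDR = ("192.168.4.1", 4210)   # assumed ESP8266 IP and port on the local Wi-Fi

def send_joystick(sock, x: float, y: float) -> None:
    """x, y in [-1, 1]: sideways and forward/backward joystick deflection."""
    sock.sendto(f"{x:.2f},{y:.2f}".encode("ascii"), ESP_ADDR)

if __name__ == "__main__":
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        send_joystick(sock, 0.0, 1.0)   # full forward
        send_joystick(sock, 0.0, 0.0)   # stop
```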
Features and Benefits
- Fully autonomous indoor navigation and 3D mapping
- Real-time obstacle detection and avoidance using ultrasonic sensors
- Remote manual control through a mobile joystick app (manual override)
- Image-based 3D reconstruction using COLMAP in the cloud
- Modular and low-cost architecture, ideal for research and prototyping
- Seamless transition between autonomous and manual modes
Project Outcome
The vehicle successfully completed multiple test runs in indoor environments, showcasing both its autonomous capabilities and the flexibility of manual control. The resulting 3D maps generated by COLMAP were detailed and spatially accurate, demonstrating the effectiveness of a camera-based mapping approach using accessible, low-cost hardware.