Abstract
In many practical scenarios, autonomous vehicles must navigate unfamiliar areas to reach their destinations. This navigation is facilitated by two-dimensional (2D) and three-dimensional (3D) maps. Simultaneous localization and mapping (SLAM) systems enable autonomous vehicles to map their surroundings while in motion. Traditionally, SLAM systems rely on physical sensors such as LiDAR to measure distances. However, these sensors are costly and consume significant power, particularly when mounted on drones. Consequently, the use of monocular cameras for estimating the depth of surrounding objects has attracted considerable interest from both academia and industry. In this study, we integrate a recently developed deep learning monocular depth estimation model into the ORB-SLAM2 system. The integrated system was tested by estimating trajectories and constructing 3D point cloud maps of unknown areas. In addition, preliminary experiments were conducted with a live drone. These experiments demonstrated that the proposed system produces more accurate point cloud maps, reducing trajectory error by 34-54% compared to contemporary approaches.
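The integration summarized above rests on a standard step: converting per-pixel depth predictions into 3D points via the camera intrinsics. As a hedged illustration of that general technique (not the authors' implementation; the intrinsics below are placeholder values), a minimal pinhole back-projection might look like:

```python
import numpy as np

def backproject_depth(depth, fx, fy, cx, cy):
    """Back-project a dense depth map (H x W, in metres) into an
    (H*W, 3) point cloud in the camera frame using the pinhole model.
    fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
    All intrinsic values here are illustrative, not from the paper."""
    h, w = depth.shape
    # Pixel coordinate grids: u runs along columns, v along rows
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Toy example: a flat surface 2 m in front of a 4x4-pixel camera
depth = np.full((4, 4), 2.0)
pts = backproject_depth(depth, fx=100.0, fy=100.0, cx=2.0, cy=2.0)
```

In a full pipeline, these camera-frame points would then be transformed by the SLAM-estimated pose into the world frame before being merged into the global point cloud map.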
| Original language | English |
|---|---|
| Journal | Drone Systems and Applications |
| Volume | 13 |
| DOIs | |
| State | Published - 2025 |
Bibliographical note
Publisher Copyright: © 2025 The Authors.
Keywords
- 3D SLAM
- DJI Tello drone
- air surveillance
- drones
- monocular depth estimation
ASJC Scopus subject areas
- Control and Systems Engineering
- Automotive Engineering
- Aerospace Engineering
- Computer Science Applications
- Control and Optimization
- Electrical and Electronic Engineering