Pix2Pix-Based Monocular Depth Estimation for Drones with Optical Flow on AirSim.

Keywords: AirSim, depth estimation, optical flow

Journal

Sensors (Basel, Switzerland)
ISSN: 1424-8220
Abbreviated title: Sensors (Basel)
Country: Switzerland
NLM ID: 101204366

Publication information

Publication date:
08 Mar 2022
History:
received: 13 Jan 2022
revised: 3 Mar 2022
accepted: 4 Mar 2022
entrez: 26 Mar 2022
pubmed: 27 Mar 2022
medline: 1 Apr 2022
Status: epublish

Abstract

In this work, we propose a method for estimating depth from a monocular camera image in order to avoid collisions during the autonomous flight of a drone. The maximum flight speed of a drone is generally around 22.2 m/s, and long-distance depth information is crucial for autonomous flight: without it, a drone flying at high speed is prone to collisions. However, depth cameras capable of long-range measurement are too heavy to be mounted on a drone. This work applies Pix2Pix, a kind of Conditional Generative Adversarial Network (CGAN), to generate depth images from a monocular camera, and additionally applies optical flow to enhance the accuracy of the depth estimation. We propose a highly accurate depth estimation method that effectively embeds an optical flow map into the monocular image. The models are trained by taking advantage of AirSim, a flight simulator that can capture both monocular and depth images at ranges of over a hundred meters in a virtual environment, so our model generates depth images that provide longer-distance information than images captured by a common depth camera. We evaluate the accuracy and error of the proposed method using test images in AirSim. In addition, the proposed method is used in flight simulations to evaluate its effectiveness for collision avoidance. As a result, the proposed method achieves higher accuracy and lower error than the state of the art, and also results in fewer collisions.
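The record does not spell out how the optical flow map is embedded into the monocular image, so the following Python sketch is only one plausible reading: dense flow between two consecutive frames is computed with OpenCV's Farneback method (an assumption; the paper may use a different flow algorithm), rendered as an HSV-coded map, and concatenated channel-wise with the current RGB frame to form the Pix2Pix generator input.

# Hedged sketch: embed a dense optical-flow map into a monocular frame
# as input to a Pix2Pix-style depth generator. Farneback flow and
# channel concatenation are assumptions, not the paper's stated method.
import cv2
import numpy as np

def flow_map(prev_bgr: np.ndarray, curr_bgr: np.ndarray) -> np.ndarray:
    """Dense flow between consecutive frames, rendered as an HSV-coded
    BGR image (hue = flow direction, value = flow magnitude)."""
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    hsv = np.zeros_like(prev_bgr)
    hsv[..., 0] = ang * 180 / np.pi / 2   # direction -> hue
    hsv[..., 1] = 255                     # full saturation
    hsv[..., 2] = cv2.normalize(mag, None, 0, 255, cv2.NORM_MINMAX)
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

def generator_input(prev_bgr: np.ndarray, curr_bgr: np.ndarray) -> np.ndarray:
    """Stack the current frame and its flow map into a 6-channel array,
    scaled to [-1, 1] as is conventional for Pix2Pix inputs."""
    stacked = np.concatenate([curr_bgr, flow_map(prev_bgr, curr_bgr)], axis=-1)
    return stacked.astype(np.float32) / 127.5 - 1.0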
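On the training side, the abstract notes that AirSim can render paired monocular and depth images at ranges beyond a hundred meters. A minimal capture sketch with the airsim Python client follows; the camera name "0", ImageType.DepthPlanar (DepthPlanner in older releases), and the 100 m clipping range are assumptions that depend on the AirSim version and settings.

# Hedged sketch: grab one paired RGB + ground-truth depth frame from AirSim.
import airsim
import numpy as np

client = airsim.MultirotorClient()
client.confirmConnection()

responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene, False, False),       # RGB, uint8
    airsim.ImageRequest("0", airsim.ImageType.DepthPlanar, True, False),  # depth, float meters
])

# Recent AirSim releases return uncompressed Scene frames as 3-channel BGR.
rgb = np.frombuffer(responses[0].image_data_uint8, dtype=np.uint8)
rgb = rgb.reshape(responses[0].height, responses[0].width, 3)

depth = airsim.list_to_2d_float_array(
    responses[1].image_data_float, responses[1].width, responses[1].height)
depth = np.clip(depth, 0.0, 100.0)  # keep the long-range band of interest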

Identifiers

pubmed: 35336268
pii: s22062097
doi: 10.3390/s22062097
pmc: PMC8948838

Publication types

Journal Article

Languages

eng

Citation subsets

IM

Grants

Agency: Japan Society for the Promotion of Science
ID: 20K23333 and 20J21208

Authors

Tomoyasu Shimada (T)

Graduate School of Science and Engineering, Ritsumeikan University, Kusatsu 525-8577, Japan.

Hiroki Nishikawa (H)

Graduate School of Science and Engineering, Ritsumeikan University, Kusatsu 525-8577, Japan.
Japan Society for the Promotion of Science, Tokyo 102-0083, Japan.

Xiangbo Kong (X)

Graduate School of Science and Engineering, Ritsumeikan University, Kusatsu 525-8577, Japan.

Hiroyuki Tomiyama (H)

Graduate School of Science and Engineering, Ritsumeikan University, Kusatsu 525-8577, Japan.
