BMWi: AI-Drone

Navigation using visual-inertial odometry and low-delay cloud communication

Funding Agency: BMWi
Duration: 3 years, 01.01.2018 - 31.12.2020
Partners: Lufthansa Technik AG, IBM Deutschland GmbH, Helmut-Schmidt-Universität, Hochschule Neu-Ulm
Contact Persons: Martin Oelsch, Mojtaba Karimi

Scope of the project

In recent years, drones have gained importance in a wide variety of applications, both private and professional. They are flown as a hobby, used in agriculture, in media recordings (movies and sports events), in military services (such as surveillance operations), for parcel transport, and recently also for inspection purposes. This popularity, however, has led governments to impose further restrictions on drone use: above a specific take-off weight the pilot needs a license, and flying in non-designated areas or without permission is forbidden.

In this project of the aviation research programme V (LuFo V) of the Federal Ministry for Economic Affairs and Energy, the goal is to develop an autonomous drone for airplane inspection inside a hangar. The drone will be equipped with a variety of sensors, such as LiDAR, stereo cameras, IMUs, a compass and optionally active RGB-D sensors. The challenge in this environment is the large metal structures of the airplane and of the hangar itself, which strongly disturb GPS signals and the compass. In this GPS-denied environment, the drone collects sensor data and sends it to a ground station, where SLAM algorithms map the environment and localize the drone precisely. The drone follows preset inspection points at which it captures images of the airplane's surface. These images are then sent to the ground station, which applies machine learning techniques using IBM Watson to classify potential damage. The results are collected into an overall report documenting the current condition of the airplane.

The LMT develops the software for module communication (ROS-based), sensor data acquisition and encoding, as well as data processing in the ground station, such as SLAM and drone control. To test the algorithms, two platforms are used at the LMT: a small drone based on the "DJI Flame Wheel 550" for autonomous control and stability tests, and a larger drone based on the "DJI Spreading Wings S1000+" that mounts all available sensors for data acquisition tests. During the development phase, the recorded data is analyzed offline to tune parameters and algorithms, improving localization precision and real-time control.
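The ROS-based module communication mentioned above follows a topic-based publish/subscribe pattern: each sensor driver publishes messages on a named topic, and processing modules subscribe to the topics they need. The following is a minimal in-process sketch of that pattern in plain Python; it is an illustration only, not the project's actual ROS code, and the topic name and message fields are invented for the example.

```python
from collections import defaultdict

class MiniBus:
    """Minimal in-process publish/subscribe bus, loosely mimicking
    ROS topic semantics (illustrative stand-in, not actual ROS)."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to be invoked for every message on `topic`."""
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        """Deliver `msg` to every subscriber of `topic`."""
        for cb in self._subs[topic]:
            cb(msg)

bus = MiniBus()
received = []
bus.subscribe("/drone/imu", received.append)   # e.g. a SLAM front-end listening
bus.publish("/drone/imu", {"gyro": (0.0, 0.01, 0.0), "stamp": 0.004})
```

In ROS itself the bus is distributed across processes and machines, which is what allows the on-board sensor drivers and the ground-station SLAM modules to exchange data transparently.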

Another contribution of the LMT to the project is the in-the-loop measurement of communication delay: the time from the capture of an event in the scene by a camera, through processing in the ground station, until the drone reacts to that event. Keeping this delay low is essential for real-time SLAM and data processing; images must be encoded and point clouds compressed efficiently to save communication overhead and preserve real-time capability.
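The two ideas in this paragraph can be sketched together: compress a point cloud before transmission, and measure the end-to-end delay by stamping the data at capture time and comparing after processing. The snippet below is a simplified stand-in using DEFLATE via Python's `zlib`, not the project's actual codec; the point cloud is synthetic.

```python
import struct
import time
import zlib

def encode_cloud(points):
    """Pack (x, y, z) float triples into bytes and deflate them --
    a simple stand-in for efficient point-cloud compression."""
    raw = b"".join(struct.pack("<fff", *p) for p in points)
    return zlib.compress(raw)

def decode_cloud(blob):
    """Inverse of encode_cloud: inflate and unpack the float triples."""
    raw = zlib.decompress(blob)
    return [struct.unpack_from("<fff", raw, i) for i in range(0, len(raw), 12)]

# In-the-loop delay: stamp the frame at capture time and compare after
# the ground station has decoded and processed it.
t_capture = time.monotonic()
blob = encode_cloud([(1.0, 2.0, 3.0)] * 1000)   # "capture + encode" on the drone
cloud = decode_cloud(blob)                       # "decode" at the ground station
delay_s = time.monotonic() - t_capture           # elapsed end-to-end time
```

A highly repetitive cloud like this one compresses far below its raw 12 kB; real LiDAR data compresses less, which is exactly why the encoding must be tuned against the measured delay.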

Data security plays an important role in this project to prevent eavesdropping. Attackers could manipulate sensor data to conceal airplane damage, creating a safety risk for passengers; in other attack scenarios, the inspection drone could be hijacked completely or its control signal jammed. Data encryption and drone-server authentication increase the security in this regard. This part of the project is carried out by the University of Applied Sciences Neu-Ulm.

Besides the real test platforms, we use a Gazebo simulation to test sensor acquisition and path planning. This enables the computation of optimal inspection points for different types of aircraft and hangars.
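As a toy illustration of inspection-point computation, one can place evenly spaced waypoints on a circle around a fuselage cross-section while keeping a safety standoff distance. This is a deliberately simplified, hypothetical sketch; the actual planner must handle full 3D aircraft geometry and obstacle constraints.

```python
import math

def ring_waypoints(cx, cy, fuselage_radius, standoff, n, height):
    """Return n evenly spaced (x, y, z) inspection points on a circle
    around a fuselage cross-section centered at (cx, cy), keeping a
    safety standoff from the hull (simplified illustrative planner)."""
    r = fuselage_radius + standoff
    return [(cx + r * math.cos(2.0 * math.pi * k / n),
             cy + r * math.sin(2.0 * math.pi * k / n),
             height)
            for k in range(n)]

# Eight viewpoints at 1.5 m standoff around a 2 m fuselage, 4 m altitude.
wps = ring_waypoints(0.0, 0.0, fuselage_radius=2.0, standoff=1.5, n=8, height=4.0)
```

Sweeping such rings along the fuselage axis yields a full coverage pattern that the simulated drone can fly in Gazebo before any real flight test.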