MSc thesis project proposal

Sensor-fusion for autonomous drone navigation

Assignment

Prevalent solutions for drone navigation rely heavily on manual intervention to control and guide the drone through its desired tasks. More recently, drones have relied on cooperative sensors, such as Mode-S and ADS-B transponders, for anchored localization of their position. To enable (almost) complete drone autonomy, non-cooperative on-board sensor systems are employed, e.g., LIDAR, RADAR, IMUs, and stereo vision.

Given the variety of sensors on board, the drone must accurately estimate its position, velocity, and orientation at high update rates. How can the data from the on-board IMUs be combined to estimate these parameters? Can an optimal and robust Bayesian filter be designed for this ego-motion estimation problem in 3D space?
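To make the filtering question concrete, the sketch below shows the simplest member of the Bayesian-filter family: a linear Kalman filter fusing an IMU acceleration reading (as a control input) with an external position fix. This is an illustrative 1-D toy, not the 3D estimator the project asks for; the motion model, noise parameters, and synthetic measurements are all assumptions for the example.

```python
import numpy as np

def kalman_step(x, P, a_meas, z_pos, dt, q=0.1, r=0.5):
    """One predict/update cycle of a 1-D Kalman filter.

    State x = [position, velocity]; a_meas is an IMU acceleration
    reading used as control input, z_pos an external position fix.
    All quantities here are illustrative placeholders."""
    F = np.array([[1.0, dt], [0.0, 1.0]])          # constant-velocity model
    B = np.array([0.5 * dt**2, dt])                # acceleration input
    H = np.array([[1.0, 0.0]])                     # only position is observed
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])            # process noise
    R = np.array([[r]])                            # measurement noise

    # Predict: propagate state and covariance through the motion model
    x = F @ x + B * a_meas
    P = F @ P @ F.T + Q

    # Update: weigh the innovation by the Kalman gain
    y = z_pos - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Track an object moving at a constant 1 m/s, sampled at 10 Hz:
x, P = np.zeros(2), np.eye(2)
for k in range(1, 51):
    x, P = kalman_step(x, P, a_meas=0.0, z_pos=np.array([k * 0.1]), dt=0.1)
```

In the thesis, this basic structure would be extended to a full 3D state (position, velocity, orientation) with nonlinear dynamics, which is where extended, unscented, or other robust Bayesian filter variants come into play.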

In addition, reliable perception of the environment is a key aspect of drone safety, which includes solutions for detect-and-avoid (DAA) and beyond-visual-line-of-sight (BVLOS) navigation. To this end, drones should be capable of object detection and classification, processing raw sensor data into semantic classes. Conventional pattern recognition techniques, in combination with AI approaches, e.g., neural networks, can be explored to address this challenge.
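As a toy illustration of mapping raw sensor features to semantic classes with a conventional pattern recognition technique, the sketch below uses a k-nearest-neighbour classifier on synthetic 2-D features. The feature names, class labels, and data are all hypothetical; a real DAA pipeline would use learned features and far richer models.

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training samples (Euclidean distance)."""
    d = np.linalg.norm(train_X - query, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]

# Synthetic 2-D features (say, echo strength and object height) for two
# hypothetical semantic classes: 0 = "bird", 1 = "aircraft".
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([1.0, 1.0], 0.3, (20, 2)),
               rng.normal([4.0, 5.0], 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

label = knn_predict(X, y, np.array([4.1, 4.9]))  # query near class-1 cluster
```

Within the project, such baselines serve as a point of comparison for neural-network classifiers operating on the same sensor data.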

Requirements

We seek a self-motivated student with a strong background in linear algebra and statistical signal processing. Experience with machine learning (e.g., neural networks) is preferred, and previous experience with localization, synchronization, or navigation projects is an added plus. Advanced coding skills in MATLAB and Python (or R) are required, and strong written and verbal communication skills in English are mandatory. The expected project duration is about 9 to 12 months.

Contact

dr. Raj Thilak Rajan

Circuits and Systems Group

Department of Microelectronics

Last modified: 2019-11-15