MSc thesis project proposal
[NLR] Sensor-fusion for autonomous drone navigation
Prevalent solutions for drone navigation rely heavily on manual intervention to control and guide the drone through its tasks. More recently, drones have come to depend on collaborative sensor systems, such as Mode-S and ADS-B transponders, for anchored localization of their position. To enable (almost) complete drone autonomy, non-collaborative on-board sensor systems are employed, e.g., LIDAR, RADAR, IMUs and stereo vision.
Given the variety of sensors on board, the drone must accurately estimate its position, velocity and orientation at high update rates. How can the data from the on-board IMUs be combined to estimate these parameters? Can optimal and robust Bayesian algorithms be designed for this ego-motion estimation problem in 3D space?
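A classical starting point for such Bayesian ego-motion estimation is the Kalman filter. The sketch below is purely illustrative, not part of the proposed work: a 1D constant-velocity filter that propagates the state with IMU acceleration samples and corrects it with position fixes. The sampling rate, noise covariances and simulated trajectory are all assumptions; a real drone filter would be a 3D extended or unscented variant over position, velocity and orientation.

```python
import numpy as np

# Illustrative 1D Kalman filter: IMU acceleration as control input,
# occasional position measurements as corrections.
dt = 0.01                     # assumed 100 Hz IMU rate
F = np.array([[1.0, dt],      # constant-velocity state transition
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],  # how acceleration enters the state
              [dt]])
H = np.array([[1.0, 0.0]])    # we only measure position
Q = 1e-4 * np.eye(2)          # process noise (tuning assumption)
R = np.array([[0.05]])        # measurement noise (tuning assumption)

def predict(x, P, a):
    """Propagate state and covariance with an IMU acceleration sample a."""
    x = F @ x + B * a
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Correct the prediction with a position measurement z."""
    y = z - H @ x                   # innovation
    S = H @ P @ H.T + R             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

x = np.zeros((2, 1))  # initial [position, velocity]
P = np.eye(2)         # initial uncertainty
for k in range(100):  # simulated constant acceleration of 1 m/s^2
    x, P = predict(x, P, 1.0)
    x, P = update(x, P, np.array([[0.5 * ((k + 1) * dt) ** 2]]))
```

After one simulated second the state estimate tracks the true trajectory (position 0.5 m, velocity 1 m/s); the thesis would investigate optimal and robust generalizations of exactly this predict/update cycle.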
In addition, reliable perception of the environment is a key aspect of drone safety, which includes solutions for beyond-visual-line-of-sight (BVLOS) navigation. To this end, drones should be capable of object detection and classification, processing raw sensor data into semantic classes. Conventional pattern recognition techniques, in combination with AI approaches such as neural networks, can be explored to address this challenge.
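To make the "raw data to semantic classes" pipeline concrete, here is a toy sketch using a conventional pattern recognition technique (nearest-centroid classification on hand-crafted features). The sensor model, feature choice and class statistics are invented for illustration; the thesis would explore learned features, e.g., neural networks operating on LIDAR point clouds or camera frames.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(raw):
    """Reduce a raw sensor return to a small feature vector (mean, spread)."""
    return np.array([raw.mean(), raw.std()])

# Synthetic training returns for two hypothetical semantic classes,
# assumed separable by their range statistics.
classes = {
    "building": [rng.normal(50, 2, 128) for _ in range(20)],
    "tree":     [rng.normal(20, 8, 128) for _ in range(20)],
}

# Nearest-centroid classifier: one prototype feature vector per class.
centroids = {c: np.mean([features(r) for r in rs], axis=0)
             for c, rs in classes.items()}

def classify(raw):
    """Assign a raw return to the class with the closest centroid."""
    f = features(raw)
    return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))

print(classify(rng.normal(48, 2, 128)))  # prints "building"
```

Replacing the hand-crafted `features` function with a learned representation is precisely where the neural-network approaches mentioned above would enter.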
Possible internship opportunities exist with NLR, The Netherlands, or as funded positions within the ADACORSA project.
We are looking for a self-motivated student with a strong background in linear algebra and statistical signal processing. Experience with machine learning (e.g., neural networks) is preferred. Previous experience with localization, synchronization and navigation projects, or with drones, is a plus. Advanced coding skills in MATLAB and Python (or R) are required, and strong written and verbal communication skills in English are mandatory. The expected project duration is about 9 to 12 months.
dr. Raj Thilak Rajan
Circuits and Systems Group
Department of Microelectronics
Last modified: 2022-04-19