Thesis details
Overview
 
Unified Range-Multispectral-Inertial Odometry (R-VTIO) for Autonomous Drone Navigation 
 
Subject 
Standard Visual-Inertial Odometry (VIO) pipelines fuse RGB imagery with IMU data but degrade under two distinct failure modes: visual degradation (fog, smoke, low light) and scale or altitude ambiguity over featureless terrain. Existing approaches address these failure modes independently: multispectral VIO adds thermal imagery to cope with visual degradation, while Range-Visual-Inertial Odometry (RVIO) incorporates LiDAR altimeter range constraints to resolve scale. To date, no published system fuses all four modalities (RGB, thermal, range, IMU) in a single tightly coupled estimation framework for UAV navigation. The goal of this master's thesis is to design and implement a unified factor-graph-based odometry system, R-VTIO, that jointly optimises over all four modalities and dynamically reweights them based on per-sensor confidence metrics, validated on the Tarot 990 hexacopter platform.
Kind of work 
The student will implement a unified factor graph (using GTSAM or Ceres) that jointly optimises over RGB and thermal features, IMU pre-integration, and altimeter range factors. This includes designing a dynamic weighting module that adjusts each modality's contribution based on per-sensor confidence metrics (feature count, contrast ratio, altimeter return strength), running a full ablation study comparing RGB-only, RGB+IMU, and full R-VTIO configurations on identical flight logs, and benchmarking against GNSS RTK ground truth alongside VINS-Fusion and ROVIO baselines. Flight data will be collected in at least three degraded scenarios: night flight, fog/smoke simulation, and high-altitude featureless terrain. The work will conclude with the preparation of a research paper.
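The dynamic weighting module described above can be sketched as a mapping from raw per-sensor confidence metrics to factor weights. The sketch below is a minimal illustration, not the thesis design: the saturating curve and the normalisation constants (`nominal`, `floor`) are illustrative assumptions the student would have to tune, and the metric names simply follow the text (feature count, contrast ratio, altimeter return strength).

```python
import math

def sensor_weight(metric: float, nominal: float, floor: float = 0.05) -> float:
    """Map a raw confidence metric to a weight in [floor, 1] via a saturating curve.

    The metric is normalised by a nominal "healthy" value; a small floor keeps
    degraded modalities weakly active instead of switching them off entirely.
    """
    w = 1.0 - math.exp(-metric / nominal)
    return max(floor, min(1.0, w))

def modality_weights(feature_count: int, contrast_ratio: float,
                     return_strength: float) -> dict:
    """Per-modality weights that would scale the information matrices of the
    corresponding factors before optimisation. All constants are hypothetical.
    """
    return {
        "rgb":     sensor_weight(feature_count, nominal=100.0),
        "thermal": sensor_weight(contrast_ratio, nominal=0.5),
        "range":   sensor_weight(return_strength, nominal=1.0),
        # IMU is kept at full weight: it is the proprioceptive backbone.
        "imu":     1.0,
    }
```

With this shape, a fog scenario that drives the RGB feature count toward zero pulls the RGB weight down to the floor while the thermal and range weights remain high, which is exactly the reweighting behaviour the ablation study should expose.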
Framework of the Thesis 
The thesis will start with a literature review on visual-inertial odometry (VINS-Mono, ROVIO), multispectral VIO, range-visual-inertial fusion, factor-graph optimisation (GTSAM), and degradation-aware sensor fusion (MIMOSA, Degradation-Resilient LiDAR-Radar-Inertial Odometry).
Next, the student will define the complete experimental framework: instrumentation of the Tarot 990 platform; synchronisation of the FLIR Blackfly cameras, radar altimeter, and IMU; formal definition of the factor-graph residuals and the confidence-based weighting strategy; and implementation of the unified R-VTIO pipeline with real-time benchmarking on the onboard compute.
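The interplay between residuals and confidence weights can be illustrated on a deliberately minimal one-dimensional toy state (altitude only): each modality contributes a residual whitened by its noise level and scaled by its confidence weight, and the optimum is the information-weighted mean. This is a sketch of the structure only, not the actual R-VTIO residual definitions; all numbers below are invented.

```python
def fuse_altitude(measurements):
    """Fuse scalar altitude measurements.

    measurements: list of (z, sigma, confidence_weight) tuples, where each
    factor contributes residual r = z - x with information w / sigma**2.
    The minimiser of the weighted least-squares cost is the information-
    weighted mean of the measurements.
    """
    num = 0.0
    den = 0.0
    for z, sigma, w in measurements:
        info = w / sigma**2   # confidence scales the factor's information
        num += info * z
        den += info
    return num / den

# IMU-propagated prior, altimeter return, and a visual height estimate
# down-weighted under degradation (hypothetical values):
z_hat = fuse_altitude([
    (10.4, 0.5, 1.0),   # IMU prior: moderate noise, full weight
    (10.0, 0.1, 0.9),   # altimeter: low noise, strong return
    (11.5, 0.3, 0.1),   # visual: heavily down-weighted
])
```

In the full system the same principle applies per factor block in the graph: the confidence weight multiplies the information matrix of each residual before the nonlinear solve, so a degraded modality bends the estimate less without being discarded.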
In the final phase, the student will conduct extensive experimental validation: collecting flight logs across degraded scenarios, performing the full ablation study, benchmarking against state-of-the-art baselines, and providing a detailed quantitative analysis of how each modality contributes to robustness and accuracy. The validation phase concludes with the preparation of a publication.
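For the benchmarking against GNSS RTK ground truth, a standard quantitative metric is the absolute trajectory error (ATE) RMSE. A minimal version is sketched below, assuming the estimated and ground-truth trajectories are already time-aligned and expressed in the same frame (the alignment step is omitted here).

```python
import math

def ate_rmse(estimated, ground_truth):
    """Absolute trajectory error (RMSE) between two aligned trajectories.

    estimated, ground_truth: equal-length sequences of (x, y, z) positions.
    Returns the root-mean-square Euclidean position error.
    """
    assert len(estimated) == len(ground_truth), "trajectories must match in length"
    sq_sum = 0.0
    for (ex, ey, ez), (gx, gy, gz) in zip(estimated, ground_truth):
        sq_sum += (ex - gx) ** 2 + (ey - gy) ** 2 + (ez - gz) ** 2
    return math.sqrt(sq_sum / len(estimated))
```

Reporting ATE RMSE per scenario and per ablation configuration (RGB-only, RGB+IMU, full system) gives the per-modality robustness analysis the validation phase calls for.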
Expected Student Profile 
The ideal candidate has a solid background in robotics and state estimation, with hands-on experience in ROS2 and proficient C++ programming skills. Knowledge of factor-graph optimisation (GTSAM or Ceres) and sensor fusion is required, alongside familiarity with computer vision (feature extraction, multi-view geometry). Experience with multispectral or thermal imaging is a plus. The candidate should be comfortable working with drone hardware and onboard compute platforms, and capable of conducting rigorous experimental validation against ground-truth data.