Machine Learning for Robotics
TUM Department of Informatics
Technical University of Munich


Informatik IX

Professorship for Machine Learning for Robotics

Smart Robotics Lab

Boltzmannstrasse 3
85748 Garching
info@srl.in.tum.de

Follow us on:
SRL  CVG  DVL 


Student Projects

Please find below a list of projects (BSc/MSc theses) currently on offer:

Drone collision avoidance with NMPC in adaptive resolution volumetric maps

Supervisor(s) and Contact

Context

The Smart Robotics Lab (SRL) has developed a volumetric occupancy mapping framework, supereight 2 (see figure), that uses the concept of adaptive resolution. We therefore believe it is well suited for the direct extraction of soft and/or hard constraints that can enter our Nonlinear Model Predictive Control (NMPC) formulation, unifying control and collision avoidance.
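As a rough illustration of the idea, the sketch below forms a soft collision-avoidance penalty over the NMPC horizon from distance queries against a map. The map query, the safety distance, and the weight are illustrative placeholders and not part of supereight 2's actual interface.

```python
# Minimal sketch of a soft collision-avoidance cost over an NMPC horizon.
# The map query `distance_to_nearest_occupied` is a hypothetical stand-in
# for whatever interface the volumetric map exposes; d_safe and the weight
# are illustrative tuning parameters, not values from the project.
import numpy as np

def soft_collision_cost(waypoints, distance_to_nearest_occupied,
                        d_safe=0.5, weight=100.0):
    """Hinge penalty that grows quadratically once a waypoint comes
    closer than d_safe to occupied space."""
    cost = 0.0
    for p in waypoints:
        d = distance_to_nearest_occupied(p)     # metres to closest obstacle
        violation = max(0.0, d_safe - d)
        cost += weight * violation ** 2
    return cost

# Example with a single spherical obstacle standing in for the map query.
obstacle = np.array([1.0, 0.0, 1.0])
dist = lambda p: np.linalg.norm(np.asarray(p) - obstacle)
traj = [np.array([0.1 * k, 0.0, 1.0]) for k in range(20)]
print(soft_collision_cost(traj, dist))
```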

Learning to walk on uneven terrain: elevation maps for bipedal walking

Supervisor(s) and Contact

Context

The Chair of Applied Mechanics (Prof. Rixen) at TUM has a humanoid robot and a bipedal walker (see figure). Ideally, these can perceive the potentially uneven terrain in front of them in order to walk over it safely. Within this project, we would like to explore incorporating locally perceived elevation maps as (additional) inputs to learned gait control policies.
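As an illustration of what such a policy input could look like, the following PyTorch sketch concatenates an encoded robot-centric elevation patch with the proprioceptive state. The network sizes, patch dimensions, and action dimension are assumptions made for the sake of the example, not project specifications.

```python
# Sketch of a gait policy that consumes a local elevation patch in addition
# to the proprioceptive state. Sizes (32x32 patch, 48-D state, 12-D action)
# are illustrative assumptions.
import torch
import torch.nn as nn

class ElevationGaitPolicy(nn.Module):
    def __init__(self, state_dim=48, action_dim=12, patch_size=32):
        super().__init__()
        # Small CNN encoder for the robot-centric elevation patch.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        enc_dim = 32 * (patch_size // 4) ** 2
        self.head = nn.Sequential(
            nn.Linear(state_dim + enc_dim, 256), nn.ReLU(),
            nn.Linear(256, action_dim),
        )

    def forward(self, state, elevation_patch):
        z = self.encoder(elevation_patch)           # (B, enc_dim)
        return self.head(torch.cat([state, z], dim=-1))

policy = ElevationGaitPolicy()
state = torch.randn(4, 48)
patch = torch.randn(4, 1, 32, 32)                   # heights around the robot
print(policy(state, patch).shape)                   # torch.Size([4, 12])
```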

Localize Monocular Camera in Large-scale LiDAR Map

Supervisor(s) and Contact

Context

LiDAR sensors and cameras are widely used in robotic applications, e.g. mixed reality, dense mapping, and autonomous driving. While localizing a camera with respect to a visual feature map is well studied, global monocular camera localization in LiDAR maps remains fairly unexplored. This project aims to narrow the gap between LiDAR point cloud maps and images by learning deep features in a shared embedding space, and to localize the camera accurately in a large-scale LiDAR point cloud map.
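One common way to learn such a shared embedding space is a symmetric contrastive (InfoNCE) loss over matched 2D-3D pairs. The sketch below shows only this loss on dummy descriptors; the actual image and point-cloud encoders are left as assumptions.

```python
# Sketch of the cross-modal idea: embed image features and LiDAR point
# features into a shared space and train them with a symmetric InfoNCE loss
# so that matching 2D-3D pairs score highest. Descriptor dimensions and the
# temperature are illustrative stand-ins.
import torch
import torch.nn.functional as F

def infonce(img_desc, pts_desc, temperature=0.07):
    """img_desc, pts_desc: (N, D) descriptors of N ground-truth 2D-3D matches."""
    img_desc = F.normalize(img_desc, dim=-1)
    pts_desc = F.normalize(pts_desc, dim=-1)
    logits = img_desc @ pts_desc.t() / temperature   # similarity of all pairs
    labels = torch.arange(img_desc.size(0))          # i-th image matches i-th point
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.t(), labels))

# Dummy descriptors from hypothetical image / point-cloud encoders.
img = torch.randn(256, 128, requires_grad=True)
pts = torch.randn(256, 128, requires_grad=True)
loss = infonce(img, pts)
loss.backward()
print(float(loss))
```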

Learned plane based visual-inertial SLAM and AR applications

Supervisor(s) and Contact

Context

Monocular SLAM systems suffer from scale ambiguity, whereas visual-inertial systems can estimate metric 6DoF poses with the aid of an IMU. Since structural planes are informative and essential in AR (augmented reality) applications, it is worthwhile to recover 3D planes for building the layout of the environment. This project aims to develop a monocular visual-inertial SLAM system that leverages deep neural networks to detect and predict 3D planes, and that incorporates these planes into the conventional geometric bundle adjustment.
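For intuition, the residual that detected planes would contribute to the bundle adjustment can be as simple as the signed point-to-plane distance of the landmarks assigned to a plane, as in the sketch below. The plane assignment itself would come from the learned detector and is assumed here; the residual would be minimised jointly with reprojection and IMU terms.

```python
# Sketch of a plane residual for geometric bundle adjustment: for a landmark
# X assigned to a detected plane (n, d) with |n| = 1, the signed
# point-to-plane distance n.X + d should be zero. Everything here is an
# illustrative assumption, not the project's actual formulation.
import numpy as np

def point_to_plane_residuals(points, n, d):
    """points: (N, 3) landmark positions, n: (3,) plane normal, d: offset."""
    n = n / np.linalg.norm(n)
    return points @ n + d              # (N,) signed distances

# Noisy landmarks roughly on the plane z = 1 (n = [0, 0, 1], d = -1).
pts = np.random.randn(50, 3) * 0.01 + np.array([0.0, 0.0, 1.0])
res = point_to_plane_residuals(pts, np.array([0.0, 0.0, 1.0]), -1.0)
print(np.abs(res).mean())              # small residuals for well-fitted planes
```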

Dynamic Object-level SLAM in Neural Radiance Field

Supervisor(s) and Contact

Context

Object-level SLAM, in which each object in the scene is represented in an individual sub-map, has attracted a lot of attention and made tremendous progress recently. The Smart Robotics Lab has developed one of the first dynamic object-level SLAM systems that can simultaneously segment, track, and reconstruct both static and moving objects in the scene. More recently, neural radiance fields have caught the attention of the vision community and have been adopted in object-level mapping frameworks. However, in such work the object and camera poses are assumed to be given, and a tightly-coupled tracking component is lacking, which prevents it from being used in real-world applications.
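A minimal sketch of the object-level sub-map idea is given below: each object keeps its own small implicit field together with a world-to-object pose, so queries for a moving object are made in its local frame. The tiny MLP and the pose handling are illustrative only and do not reflect the lab's actual implementation.

```python
# Sketch of an object-level sub-map: a per-object implicit (NeRF-style)
# field plus a world-to-object pose that a tracking component would refine.
import torch
import torch.nn as nn

class ObjectSubmap(nn.Module):
    def __init__(self):
        super().__init__()
        self.field = nn.Sequential(nn.Linear(3, 64), nn.ReLU(),
                                   nn.Linear(64, 4))    # (density, rgb)
        self.T_wo = nn.Parameter(torch.eye(4))           # object pose in the world frame

    def query(self, points_world):                       # (N, 3) world points
        T_ow = torch.linalg.inv(self.T_wo)
        homo = torch.cat([points_world,
                          torch.ones(points_world.shape[0], 1)], dim=-1)
        points_obj = (homo @ T_ow.t())[:, :3]            # transform into object frame
        return self.field(points_obj)

obj = ObjectSubmap()
print(obj.query(torch.randn(8, 3)).shape)                # torch.Size([8, 4])
```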

Dense Monocular Implicit SLAM

Supervisor(s) and Contact

Context

Recently, neural radiance fields have caught the attention of the vision community and many extensions have been proposed, among which iMAP uses this implicit map representation in a SLAM system. However, it requires depth input to perform tracking and mapping. More recently, DROID-SLAM has introduced a recurrent iterative update scheme that achieves reliable tracking and semi-dense mapping in a monocular camera setting. In this project, we would like to explore a tight integration of NeRF and DROID-SLAM to achieve a dense monocular SLAM system, ideally working even in dynamic environments.
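To illustrate what the implicit map contributes, the sketch below shows the standard volume-rendering step with which a NeRF-style density field produces depth along a camera ray. The density function here is a toy surrogate for the learned MLP, and the sampling parameters are illustrative assumptions.

```python
# Sketch of rendering depth from an implicit density field: sample along the
# ray, convert densities into weights, and take the expected depth.
import torch

def render_depth(density_fn, origin, direction, near=0.1, far=5.0, n_samples=64):
    t = torch.linspace(near, far, n_samples)             # sample depths along the ray
    pts = origin + t[:, None] * direction                 # (S, 3) sample points
    sigma = density_fn(pts)                                # (S,) densities
    delta = t[1] - t[0]
    alpha = 1.0 - torch.exp(-sigma * delta)                # per-sample opacity
    trans = torch.cumprod(
        torch.cat([torch.ones(1), 1.0 - alpha + 1e-10])[:-1], dim=0)
    weights = alpha * trans
    return (weights * t).sum()                             # expected ray depth

# Toy density: a "wall" of high density around z = 2 m.
density = lambda p: torch.exp(-((p[:, 2] - 2.0) ** 2) / 0.01) * 50.0
print(float(render_depth(density, torch.zeros(3), torch.tensor([0.0, 0.0, 1.0]))))
```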
