
EM-Fusion: Dynamic Object-Level SLAM With Probabilistic Data Association

2020-11-23


The majority of approaches for acquiring dense 3D environment maps with RGB-D cameras assume static environments or reject moving objects as outliers. The representation and tracking of moving objects, however, has significant potential for applications in robotics or augmented reality. In this paper, we propose a novel approach to dynamic SLAM with dense object-level representations. We represent rigid objects in local volumetric signed distance function (SDF) maps and formulate multi-object tracking as direct alignment of RGB-D images with these SDF representations. Our main novelty is a probabilistic formulation which naturally leads to strategies for data association and occlusion handling. We analyze our approach in experiments and demonstrate that it compares favorably with state-of-the-art methods in terms of robustness and accuracy.
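The sketch below illustrates the probabilistic data association idea in Python rather than the C++/CUDA of the released code. All names (sdf, pose, sigma, w_background) are illustrative assumptions, not the actual EM-Fusion API: depth measurements are scored against each object's SDF, the likelihoods are normalized into per-point association probabilities (E-step), and these probabilities would then weight the residuals in each object's direct pose alignment (M-step).

# Minimal sketch of EM-style probabilistic data association for object-level SLAM.
# Hypothetical names and interfaces; the released system operates on volumetric
# TSDF grids on the GPU and alternates tracking with map integration.
import numpy as np

def association_weights(points, objects, sigma=0.02, w_background=1e-3):
    """E-step: posterior probability that each 3D point belongs to each object.

    points  : (N, 3) array of back-projected depth measurements (camera frame)
    objects : list of dicts with "pose" (4x4 world-from-object transform) and
              "sdf", a callable returning signed distances in object coordinates
    """
    likelihoods = []
    for obj in objects:
        # Transform measurements into the object's local SDF volume.
        T_obj_cam = np.linalg.inv(obj["pose"])
        p_local = (T_obj_cam[:3, :3] @ points.T).T + T_obj_cam[:3, 3]
        d = obj["sdf"](p_local)                      # signed distance per point
        # Measurement likelihood: points near the zero level set of this
        # object's SDF are likely to originate from it (Gaussian on the SDF).
        likelihoods.append(np.exp(-0.5 * (d / sigma) ** 2))
    lik = np.stack(likelihoods, axis=0)              # (num_objects, N)
    # Uniform background model catches points explained by no object.
    lik = np.vstack([lik, np.full((1, len(points)), w_background)])
    return lik / lik.sum(axis=0, keepdims=True)      # normalize per point

# M-step (not shown): each object's 6-DoF pose is refined by minimizing its
# SDF residuals over the points, weighted by the association probabilities,
# which also provides a soft form of occlusion handling.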

Author(s): Michael Strecke, Joerg Stueckler
Department(s): Embodied Vision
Publication(s): EM-Fusion: Dynamic Object-Level SLAM With Probabilistic Data Association
Authors: Michael Strecke, Joerg Stueckler
Maintainers: Michael Strecke
Release Date: 2020-11-23
License: GNU General Public License version 3 (GPL-3.0)
Repository: https://github.com/EmbodiedVision/emfusion
External Link: https://emfusion.is.tue.mpg.de