Drift Reduced Navigation with Deep Explainable Features

Mohd Omama¹, Sundar Sripada V. S.¹, Sandeep Chinchali², Arun Kumar Singh³, K. Madhava Krishna¹

¹IIIT Hyderabad, India    ²The University of Texas at Austin    ³Institute of Technology, University of Tartu


Modern autonomous vehicles (AVs) often rely on vision, LIDAR, and even radar-based simultaneous localization and mapping (SLAM) frameworks for precise localization and navigation. However, modern SLAM frameworks often lead to unacceptably high levels of drift (i.e., localization error) when AVs observe few visually distinct features or encounter occlusions due to dynamic obstacles. This paper argues that minimizing drift must be a key desideratum in AV motion planning, which requires an AV to make active control decisions to move towards feature-rich regions while also minimizing conventional control cost. To do so, we first introduce a novel data-driven perception module that observes LIDAR point clouds and estimates which features/regions an AV must navigate towards to minimize drift. Then, we introduce an interpretable model predictive controller (MPC) that moves an AV towards such feature-rich regions while avoiding visual occlusions and gracefully trading off drift and control cost. Our experiments on challenging, dynamic scenarios in the state-of-the-art CARLA simulator indicate that our method reduces drift by up to 76.76% compared to benchmark approaches.
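The drift/control trade-off negotiated by such an MPC can be summarized informally as a finite-horizon optimization. The formulation below is an illustrative sketch only; the symbols $x_t$, $u_t$, $f$, $c_{\text{drift}}$, $H$, and the weight $\lambda$ are our own notation for exposition, not the paper's exact objective:

\[
\min_{u_0,\dots,u_{H-1}} \; \sum_{t=0}^{H-1} \Big( \underbrace{\|u_t\|_2^2}_{\text{control cost}} \; + \; \lambda \, \underbrace{c_{\text{drift}}(x_t)}_{\text{learned drift penalty}} \Big)
\quad \text{s.t.} \quad x_{t+1} = f(x_t, u_t),
\]

where $x_t$ is the AV state, $u_t$ the control input over a horizon of $H$ steps, $f$ the vehicle dynamics, $c_{\text{drift}}$ a perception-derived penalty that would be low in feature-rich, occlusion-free regions, and $\lambda$ a weight governing how aggressively the planner trades control effort for localization accuracy.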