BirdSLAM: Monocular Multibody SLAM in Bird’s-Eye View

Swapnil Daga1    Gokul B. Nair1    Anirudha Ramesh1    Rahul Sajnani1    Junaid Ahmed Ansari2    K. Madhava Krishna1   

1 Robotics Research Center, International Institute of Information Technology, Hyderabad    2 Embedded Systems and Robotics, TCS Innovation Labs, Kolkata, India   

In this paper, we present BirdSLAM, a novel simultaneous localization and mapping (SLAM) system for the challenging scenario of autonomous driving platforms equipped with only a monocular camera. BirdSLAM tackles challenges faced by other monocular SLAM systems (such as scale ambiguity in monocular reconstruction, dynamic-object localization, and uncertainty in feature representation) by using an orthographic (bird's-eye) view as the configuration space in which localization and mapping are performed. By assuming knowledge of only the ego-camera's height above the ground, BirdSLAM leverages single-view metrology cues to accurately localize the ego-vehicle and all other traffic participants in bird's-eye view. We demonstrate that our system outperforms prior work that uses strictly more information, and highlight the relevance of each design decision via an ablation analysis.
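The single-view metrology cue mentioned above can be illustrated with a minimal sketch: for a pinhole camera at a known height above a flat ground plane (assuming zero pitch and roll, with the camera y-axis pointing down), any pixel lying on the ground back-projects to a metric bird's-eye position. The function name `pixel_to_bev`, the intrinsics matrix, and the numbers below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def pixel_to_bev(u, v, K, cam_height):
    """Back-project a ground-contact pixel (u, v) to bird's-eye (x, z).

    Illustrative sketch: assumes a pinhole camera with zero pitch/roll,
    y-axis pointing down, and the ground plane at y = cam_height.
    """
    # Ray through the pixel in the camera frame (up to scale).
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    if ray[1] <= 0:  # at or above the horizon: no ground intersection
        raise ValueError("pixel does not intersect the ground plane")
    # Known camera height fixes the metric scale of the ray.
    scale = cam_height / ray[1]
    point = scale * ray
    return point[0], point[2]  # lateral offset x, forward distance z

# Hypothetical intrinsics and a 1.65 m camera height (roughly KITTI-like).
K = np.array([[721.5, 0.0, 609.5],
              [0.0, 721.5, 172.8],
              [0.0, 0.0, 1.0]])
x, z = pixel_to_bev(609.5, 300.0, K, 1.65)
```

A pixel below the horizon thus yields a metric (x, z) position on the ground, which is the kind of cue that lets a monocular system place vehicles in a bird's-eye-view map without resolving full 3-D structure.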