CO433H – Advanced Estimation in Robotics

On successful completion of the module, students should be able to:

  • explain state-of-the-art SLAM systems,
  • explain the common approaches to multi-sensor fusion and SLAM challenges (including data association, initialisation, and loop-closure),
  • write down the mathematics of multi-sensor estimators for states defined on a manifold, as well as for sparse and dense map representations,
  • implement basic filtering as well as optimisation-based estimators, in Python for prototyping and in C++ for real-time performance.
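As a flavour of the "basic filtering" outcome above, here is a minimal sketch of a one-dimensional Kalman filter in Python. This is purely illustrative and not course material: the random-walk model, the noise values `Q` and `R`, and all variable names are assumptions chosen for the example.

```python
import numpy as np

def kalman_step(x, P, z, Q=1e-3, R=0.1):
    """One predict/update cycle for a scalar random-walk state.

    x, P : current state estimate and its variance
    z    : new noisy measurement of the state
    Q, R : assumed process and measurement noise variances
    """
    # Predict: a random-walk model keeps the mean and inflates the variance.
    x_pred = x
    P_pred = P + Q
    # Update: fuse the measurement with the Kalman gain K.
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

# Usage: estimate a constant value from noisy measurements.
rng = np.random.default_rng(0)
true_value = 2.0
x, P = 0.0, 1.0  # initial guess and (large) initial variance
for _ in range(200):
    z = true_value + 0.3 * rng.standard_normal()
    x, P = kalman_step(x, P, z)
# After 200 measurements, x should be close to true_value
# and P should have shrunk towards its small steady-state value.
```

The same predict/update pattern generalises to the vector-valued EKF variants used later in the course, where the scalar divisions become matrix inversions.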

This course addresses real-time state estimation and mapping, which remains one of the main challenges in mobile robotics – and has further applications in the booming fields of Augmented and Virtual Reality.

We build on the knowledge acquired in Andrew Davison's Robotics course 333. The estimation challenges covered there are extended towards 6D motion tracking, with the camera as the core sensor. We furthermore discuss fusion with complementary sensors such as Inertial Measurement Units (IMUs), which have recently become very popular.
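To give an idea of the IMU kinematics involved in 6D motion tracking, the sketch below propagates an orientation quaternion with a gyroscope rate measurement. It is a simplified illustration, not the course's implementation: the `[w, x, y, z]` convention, the function names, and the constant-rate assumption over each time step are all choices made for this example.

```python
import numpy as np

def quat_multiply(q, r):
    """Hamilton product of two quaternions in [w, x, y, z] order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def propagate(q, omega, dt):
    """Integrate a body angular rate omega (rad/s) over dt, then renormalise.

    Assumes omega is constant over the step, so the increment is an exact
    axis-angle rotation rather than a first-order approximation.
    """
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return q
    axis = omega / np.linalg.norm(omega)
    dq = np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))
    q_new = quat_multiply(q, dq)
    return q_new / np.linalg.norm(q_new)  # guard against numerical drift

# Usage: rotate about the body z-axis at 90 deg/s for one second.
q = np.array([1.0, 0.0, 0.0, 0.0])  # identity orientation
for _ in range(100):
    q = propagate(q, np.array([0.0, 0.0, np.pi / 2.0]), 0.01)
# q now represents (approximately) a 90-degree rotation about z,
# i.e. close to [cos(45 deg), 0, 0, sin(45 deg)].
```

The renormalisation step reflects why orientation states live on a manifold (the unit quaternions) rather than in a plain vector space – a theme the course develops formally.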

The objective of this course is to provide the understanding, mathematical tools, and practical experience that allow students to implement their own multi-sensor Simultaneous Localisation And Mapping (SLAM) algorithms for deployment on a broad range of mobile robots, e.g. a multicopter Unmanned Aerial System (UAS).

Some more information can be found in the Syllabus.

Schedule and Slides

What Date Topic
Lecture 1 15/10/15 Introduction and Fundamentals
Practical 1 22/10/15 Kinematics and Camera Library
Lecture 2 29/10/15 Formulations of Estimation Problems
Practical 2 05/11/15 Different Filter Implementations
Lecture 3 12/11/15 Temporal Models
Practical 3 19/11/15 IMU Kinematics Module
Lecture 4 26/11/15 Vision-Based Localisation and Mapping
Practical 4 03/12/15 Visual-Inertial EKF Localisation