70003 - Advanced Robotics

Course Aims

This course addresses topics of advanced robotics, focusing on real-time state estimation and mapping, with applications to drones and Augmented and Virtual Reality.

We build on the knowledge acquired in the Robotics course 333, extending the challenges covered there towards 6D motion estimation and control, with the camera as the core sensor. We furthermore discuss fusion with complementary sensors such as Inertial Measurement Units (IMUs), which have become very popular in recent years.

The objective of this course is to provide the understanding, mathematical tools, and practical experience that allow students to implement their own multi-sensor Simultaneous Localisation And Mapping (SLAM) algorithms for deployment on a broad range of mobile robots. One such robot is a multicopter Unmanned Aerial System (UAS), which we will work with in the practicals.

The practicals lead up to the “Amazin’ Challenge”, held in the last session: students work on a multicopter UAS that operates autonomously with on-board vision-based state estimation and control, so that a simple delivery task can be accomplished reliably, accurately, and quickly.

More information can be found in the course syllabus.

Learning Outcomes

On successful completion of the module, students should be able to:

  • explain the software components of a typical mobile robot, as well as their interactions with hardware (sensors, motors),
  • explain state-of-the-art SLAM systems,
  • explain the common approaches to multi-sensor fusion and SLAM challenges (including data association, initialisation, and loop closure),
  • write down the maths for multi-sensor estimators with states defined on a manifold, as well as for sparse and dense map representations,
  • explain different modern feedback-control approaches, including model-based ones, and write down the corresponding maths,
  • implement basic filtering- and optimisation-based estimators, as well as feedback controllers, that run in real time.
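To give a flavour of the filtering-based estimators mentioned in the last outcome, the following is a minimal one-dimensional Kalman filter sketch. It is a hypothetical illustration, not course code: the course treats the full Extended Kalman Filter on multi-dimensional, manifold-valued states, whereas here the state is a single scalar with a constant-position motion model.

```python
# Minimal 1-D Kalman filter sketch (hypothetical example, not course code).
# State: a scalar position x with variance p; constant-position motion model
# with process-noise variance q and measurement-noise variance r.

def kf_predict(x, p, q):
    """Predict step: the state is unchanged, uncertainty grows by q."""
    return x, p + q

def kf_update(x, p, z, r):
    """Update step: fuse a noisy measurement z with variance r."""
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)      # corrected state estimate
    p = (1.0 - k) * p        # reduced uncertainty
    return x, p

# Track a stationary target at position 5.0 from noisy measurements.
x, p = 0.0, 1.0
for z in [4.8, 5.2, 5.1, 4.9]:
    x, p = kf_predict(x, p, q=0.01)
    x, p = kf_update(x, p, z, r=0.25)
```

After the four measurements, the estimate has moved close to 5.0 and its variance has shrunk well below the initial prior, which is exactly the fuse-and-contract behaviour the practicals build on in higher dimensions.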

Course Syllabus

  1. Introduction, Problem Formulation and Examples
  2. Representations and Sensors
  3. Kinematics and Temporal Models
  4. The Extended Kalman Filter in Practice
  5. Feedback Control
  6. Nonlinear Least Squares
  7. The Extended Kalman Filter Revisited
  8. Vision-Based Simultaneous Localisation and Mapping
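As a taster for topic 6, here is a minimal Gauss-Newton sketch for a scalar nonlinear least-squares problem. The model and data are hypothetical; real SLAM back-ends solve large, sparse versions of the same linearise-and-step update.

```python
import math

# Minimal Gauss-Newton sketch (hypothetical example): fit the parameter a
# in the model y = exp(a * t) to data by nonlinear least squares.

def gauss_newton(ts, ys, a0, iters=10):
    a = a0
    for _ in range(iters):
        jtj, jtr = 0.0, 0.0
        for t, y in zip(ts, ys):
            r = y - math.exp(a * t)      # residual
            j = -t * math.exp(a * t)     # d(residual)/d(a)
            jtj += j * j                 # J^T J (a scalar here)
            jtr += j * r                 # J^T r
        a -= jtr / jtj                   # Gauss-Newton step
    return a

# Noise-free data generated with a = 0.5; the solver should recover it.
ts = [0.5, 1.0, 1.5, 2.0]
ys = [math.exp(0.5 * t) for t in ts]
a_hat = gauss_newton(ts, ys, a0=0.2)
```

Each iteration linearises the residuals at the current estimate and solves the resulting linear least-squares problem; near the solution the iterates converge quadratically, so ten iterations recover a = 0.5 to machine precision here.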

Course Material

(Authentication required)

Lecture Slides

What       Date      Topic
Lecture 1  18/01/20  Intro, Problem Formulation & Examples
Lecture 2  25/01/20  Representations and Sensors
Lecture 3  01/02/20  Kinematics and Temporal Models
Lecture 4  08/02/20  The Extended Kalman Filter
Lecture 5  15/02/20  Feedback Control
Lecture 6  22/02/20  Nonlinear Least Squares
Lecture 7  01/03/20  Vision-Based SLAM
Lecture 8  08/03/20  Revision tasks | solutions

Coursework Task Sheets

What         Date      Topic
Practical 1  21/01/21  Get the drone off the ground
Practical 2  28/01/21  Vision-only pose estimation
Practical 3  11/02/21  Visual-inertial state estimation
Practical 4  25/02/21  Feedback Control
Practical 5  04/03/21  The Amazin’ Challenge (preparation)
Challenge    11/03/21  The Amazin’ Challenge