Short video interview and on-board visual-inertial SLAM demo in “I, Science”: https://www.youtube.com/watch?v=M5kuSfVaReU
Posts By: Stefan Leutenegger
For a collaborative EPSRC-funded project on Aerial Manufacturing, we are looking for a Research Associate in the field of SLAM for aerial robotics. We are tackling the challenge of using swarms of Unmanned Aerial Systems (UAS) to build structures, carry out repairs, etc. The core research to be carried out will be centred on real-time on-board…
We are pleased to announce the open-source release of OKVIS: Open Keyframe-based Visual-Inertial SLAM under the terms of the BSD 3-clause license. OKVIS tracks the motion of a sensor assembly consisting of an Inertial Measurement Unit (IMU) plus N cameras (tested: monocular, stereo and four-camera setups) and reconstructs the scene sparsely. This is the authors’ implementation…
At ICCV 2015, Andy and I organised a workshop on “The Future of Real-Time SLAM: Sensors, Processors, Representations, and Algorithms”. Details and slides can be found here.
BRISK Version 2 with shorter descriptors, higher speed and compatibility with OpenCV version 3 is available: see software!
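BRISK descriptors are binary strings, so candidate features are matched by Hamming distance (number of differing bits) rather than Euclidean distance. Below is a minimal pure-Python sketch of that matching step, using toy 16-bit descriptors for illustration; it is not the BRISK library itself, and real BRISK descriptors are 512 bits long.

```python
# Brute-force nearest-neighbour matching of binary descriptors by
# Hamming distance, as used for BRISK-style features.

def hamming(a: bytes, b: bytes) -> int:
    """Number of differing bits between two equal-length descriptors."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def match(query: list, train: list, max_dist: int = 80) -> list:
    """For each query descriptor, find its nearest train descriptor and
    keep the match only if the distance is at most max_dist."""
    matches = []
    for i, q in enumerate(query):
        j, d = min(((j, hamming(q, t)) for j, t in enumerate(train)),
                   key=lambda p: p[1])
        if d <= max_dist:
            matches.append((i, j, d))
    return matches

# Toy 16-bit "descriptors" (two bytes each):
query = [bytes([0b10110010, 0b01100101])]
train = [bytes([0b10110010, 0b01100100]),   # 1 bit away from the query
         bytes([0b01001101, 0b10011010])]   # 16 bits away

print(match(query, train))  # → [(0, 0, 1)]
```

In practice one would use an optimised matcher instead, e.g. OpenCV’s `cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)`, which applies the same Hamming metric to binary descriptors.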
Tristan Laidlow, Andrea Nicastro and Jan Czarnowski have joined the lab as PhD students — welcome! See Dyson Robotics Lab – People.
We are currently looking for post-docs in the broader context of robot vision. Please find details here.
Please check out our new Dyson Robotics Lab webpage!
We are excited to announce our newest work (RSS’15) by Thomas Whelan, myself, Renato Salas-Moreno, Ben Glocker and Andrew Davison: we perform RGB-D SLAM with both local and large-scale loop closures, aligning and deforming a dense surfel map in real time to continuously improve reconstruction consistency. video | paper
Andreas Forster, a Master’s student from ETH Zurich, is visiting for 3 months as an intern to work on visual-inertial state estimation and mapping software for mobile robots.