[News Archive] April 2003: new applications of real-time single camera localisation and SLAM

On the left: real-time localisation with a known map of features, using a single camera with a fish-eye lens (joint work with Nobuyuki Kita and Francois Berenger at AIST Japan). This lens has a field of view of around 150 degrees and a spherical projection curve such that image radius is proportional to incoming ray angle. Our camera localisation method adapts easily to this case with a new measurement model. Camera position estimation actually works better with this lens than with a normal perspective lens, since the same set of features remains visible over larger motions. Operation is at 30Hz, with all processing on a 2GHz laptop (this work was demoed at ICCV 2003).

On the right: real-time SLAM for a wearable active vision robot built by Walterio Mayol and David Murray. The robot has a miniature IEEE1394 camera with a perspective lens. Output from real-time visual SLAM is used to localise the robot and control its fixation point automatically: the robot's camera can be directed to fixate on any of the feature points in its map as the wearer moves around freely. The wearable results were presented at ISMAR 2003 and ISRR 2003.
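The fish-eye measurement model described above can be sketched as follows. This is an illustrative equidistant ("f-theta") projection, where image radius is proportional to the angle between the incoming ray and the optical axis; the intrinsic values (`f`, `cx`, `cy`) are placeholder assumptions, not the calibration used in the original work.

```python
import math

def project_equidistant(p, f=280.0, cx=320.0, cy=240.0):
    """Equidistant fish-eye model: image radius r = f * theta, where
    theta is the angle between the incoming ray and the optical axis.
    p = (X, Y, Z) is a point in camera coordinates."""
    X, Y, Z = p
    rho = math.hypot(X, Y)
    theta = math.atan2(rho, Z)      # angle off the optical axis
    if rho == 0.0:                  # a point on the axis projects to the centre
        return (cx, cy)
    r = f * theta                   # radius grows linearly with ray angle
    return (cx + r * X / rho, cy + r * Y / rho)

def project_perspective(p, f=280.0, cx=320.0, cy=240.0):
    """Standard pinhole model for comparison: r = f * tan(theta),
    which diverges as theta approaches 90 degrees, so a 150-degree
    field of view cannot fit on a planar perspective image."""
    X, Y, Z = p
    return (cx + f * X / Z, cy + f * Y / Z)
```

A ray 70 degrees off axis (well inside a 150-degree field of view) lands at the finite radius f * 70 * pi / 180 under the equidistant model, whereas the perspective model would place it at f * tan(70 degrees), roughly 2.75f, typically off the sensor; this is why the same features stay visible over larger camera motions.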


Real-Time Localisation and Mapping with Wearable Active Vision (PDF format),
Andrew J. Davison, Walterio Mayol and David W. Murray, ISMAR 2003
