DTAM: Dense Tracking and Mapping in Real-Time

Richard Newcombe, Steven Lovegrove, and Andrew Davison
Published at ICCV 2011; winner of the Best Demo Award.
Paper (PDF) | Video (YouTube)

Abstract

DTAM is a system for real-time camera tracking and reconstruction which relies not on feature extraction but dense, every pixel methods. As a single hand-held RGB camera flies over a static scene, we estimate detailed textured depth maps at selected keyframes to produce a surface patchwork with millions of vertices. We use the hundreds of images available in a video stream to improve the quality of a simple photometric data term, and minimise a global spatially regularised energy functional in a novel non-convex optimisation framework. Interleaved, we track the camera's 6DOF motion precisely by frame-rate whole image alignment against the entire dense model. Our algorithms are highly parallelisable throughout and DTAM achieves real-time performance using current commodity GPU hardware. We demonstrate that a dense model permits superior tracking performance under rapid motion compared to a state of the art method using features; and also show the additional usefulness of the dense model for real-time scene interaction in a physics-enhanced augmented reality application.
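
The mapping side of the abstract can be written down compactly. Paraphrasing the paper's formulation (symbols as used there; treat this as a sketch rather than a definitive transcription), each keyframe's inverse depth map ξ is recovered by minimising a spatially regularised energy of the form

    E(\xi) = \int_{\Omega} \Big\{\, g(\mathbf{u})\,\lVert \nabla \xi(\mathbf{u}) \rVert_{\epsilon}
             + \lambda\, C\big(\mathbf{u}, \xi(\mathbf{u})\big) \,\Big\}\, \mathrm{d}\mathbf{u}

where C(u, d) is the photometric data term, an average of per-pixel reprojection errors accumulated over the many overlapping video frames; the Huber norm on the inverse depth gradient regularises the solution; and the per-pixel weight g(u), a decreasing function of the reference image gradient, relaxes the smoothness penalty across image edges so that depth discontinuities can survive.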

This is the first single passive-camera system to demonstrate a complete dense 6DOF tracking and dense mapping pipeline for non-parametric scene reconstruction.
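
To make the "whole image alignment" side of the pipeline concrete, the sketch below scores a candidate live-camera pose against a textured model keyframe by dense photometric error. Everything here (the function name, grayscale float images, nearest-neighbour sampling, numpy in place of the real GPU implementation) is an illustrative assumption, not the paper's code:

    import numpy as np

    def alignment_cost(I_live, I_ref, D_ref, K, K_inv, R, t):
        """Mean squared photometric error of the live image against a model
        keyframe (image I_ref, per-pixel depth D_ref), with the candidate
        pose (R, t) mapping keyframe coordinates to live-camera coordinates.
        Illustrative sketch only: grayscale float images, nearest-neighbour
        sampling, no occlusion handling."""
        H, W = I_ref.shape
        us, vs = np.meshgrid(np.arange(W), np.arange(H))
        pix = np.stack([us.ravel(), vs.ravel(), np.ones(H * W)])  # homogeneous pixels
        X = (K_inv @ pix) * D_ref.ravel()      # back-project keyframe pixels to 3D
        p = K @ (R @ X + t[:, None])           # project into the live camera
        u = np.rint(p[0] / p[2]).astype(int)   # nearest-neighbour sample coords
        v = np.rint(p[1] / p[2]).astype(int)
        ok = (p[2] > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
        r = I_live[v[ok], u[ok]] - I_ref.ravel()[ok]   # per-pixel residuals
        return np.mean(r ** 2) if r.size else np.inf

Tracking then amounts to minimising this cost over the six pose parameters at frame rate, e.g. with a coarse-to-fine Gauss-Newton scheme over an SE(3) perturbation; the numpy version is only meant to show the shape of the objective, not its real-time GPU realisation.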

by Steven Lovegrove