
KinectFusion: real-time dense surface mapping and tracking

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

KinectFusion: real-time dense surface mapping and tracking. / Newcombe, Richard A.; Izadi, Shahram; Hilliges, Otmar et al.
Mixed and Augmented Reality (ISMAR), 2011 10th IEEE International Symposium on. Washington, DC, USA: IEEE Computer Society, 2011. p. 127-136.


Harvard

Newcombe, RA, Izadi, S, Hilliges, O, Molyneaux, D, Kim, D, Davison, AJ, Kohli, P, Shotton, J, Hodges, S & Fitzgibbon, A 2011, KinectFusion: real-time dense surface mapping and tracking. in Mixed and Augmented Reality (ISMAR), 2011 10th IEEE International Symposium on. IEEE Computer Society, Washington, DC, USA, pp. 127-136. https://doi.org/10.1109/ISMAR.2011.6092378

APA

Newcombe, R. A., Izadi, S., Hilliges, O., Molyneaux, D., Kim, D., Davison, A. J., Kohli, P., Shotton, J., Hodges, S., & Fitzgibbon, A. (2011). KinectFusion: real-time dense surface mapping and tracking. In Mixed and Augmented Reality (ISMAR), 2011 10th IEEE International Symposium on (pp. 127-136). IEEE Computer Society. https://doi.org/10.1109/ISMAR.2011.6092378

Vancouver

Newcombe RA, Izadi S, Hilliges O, Molyneaux D, Kim D, Davison AJ et al. KinectFusion: real-time dense surface mapping and tracking. In Mixed and Augmented Reality (ISMAR), 2011 10th IEEE International Symposium on. Washington, DC, USA: IEEE Computer Society. 2011. p. 127-136 doi: 10.1109/ISMAR.2011.6092378

Author

Newcombe, Richard A. ; Izadi, Shahram ; Hilliges, Otmar et al. / KinectFusion : real-time dense surface mapping and tracking. Mixed and Augmented Reality (ISMAR), 2011 10th IEEE International Symposium on. Washington, DC, USA : IEEE Computer Society, 2011. pp. 127-136

BibTeX

@inproceedings{07d188978f68459c9e66dae7db3f7ef4,
title = "KinectFusion: real-time dense surface mapping and tracking",
abstract = "We present a system for accurate real-time mapping of complex and arbitrary indoor scenes in variable lighting conditions, using only a moving low-cost depth camera and commodity graphics hardware. We fuse all of the depth data streamed from a Kinect sensor into a single global implicit surface model of the observed scene in real-time. The current sensor pose is simultaneously obtained by tracking the live depth frame relative to the global model using a coarse-to-fine iterative closest point (ICP) algorithm, which uses all of the observed depth data available. We demonstrate the advantages of tracking against the growing full surface model compared with frame-to-frame tracking, obtaining tracking and mapping results in constant time within room sized scenes with limited drift and high accuracy. We also show both qualitative and quantitative results relating to various aspects of our tracking and mapping system. Modelling of natural scenes, in real-time with only commodity sensor and GPU hardware, promises an exciting step forward in augmented reality (AR), in particular, it allows dense surfaces to be reconstructed in real-time, with a level of detail and robustness beyond any solution yet presented using passive computer vision.",
keywords = "AR, Dense Reconstruction, Depth Cameras, GPU, Real-Time, SLAM, Tracking, Volumetric Representation",
author = "Newcombe, {Richard A.} and Shahram Izadi and Otmar Hilliges and David Molyneaux and David Kim and Davison, {Andrew J.} and Pushmeet Kohli and Jamie Shotton and Steve Hodges and Andrew Fitzgibbon",
year = "2011",
doi = "10.1109/ISMAR.2011.6092378",
language = "English",
isbn = "9781457721830",
pages = "127--136",
booktitle = "Mixed and Augmented Reality (ISMAR), 2011 10th IEEE International Symposium on",
publisher = "IEEE Computer Society",
address = "Washington, DC, USA",
}
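The abstract above describes the two core components of KinectFusion: volumetric fusion of depth frames into a single global implicit surface model (a truncated signed distance function, or TSDF, over a voxel grid) and camera tracking via iterative closest point (ICP). As a reading aid, here is a minimal CPU sketch of the weighted-average TSDF fusion rule in Python/NumPy; the function name, parameter choices (truncation distance mu, per-frame weight of 1, weight cap), and the dense per-voxel loop are illustrative assumptions, while the real system performs this update in parallel per voxel on the GPU.

import numpy as np

def fuse_depth_frame(tsdf, weights, depth, K, T_cam_from_world,
                     voxel_origin, voxel_size, mu=0.03, w_max=64.0):
    # Integrate one depth frame into a TSDF volume by a weighted running
    # average (illustrative sketch, not the paper's GPU implementation).
    nx, ny, nz = tsdf.shape
    ii, jj, kk = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz),
                             indexing="ij")
    # World coordinates of every voxel centre.
    pts = voxel_origin + voxel_size * np.stack([ii, jj, kk], axis=-1)

    # Transform voxel centres into the camera frame and project them.
    R, t = T_cam_from_world[:3, :3], T_cam_from_world[:3, 3]
    cam = pts @ R.T + t
    z = cam[..., 2]
    z_safe = np.where(z > 1e-6, z, 1.0)  # avoid divide-by-zero warnings
    u = np.round(K[0, 0] * cam[..., 0] / z_safe + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * cam[..., 1] / z_safe + K[1, 2]).astype(int)

    h, w = depth.shape
    valid = (z > 1e-6) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    d = np.where(valid, depth[v.clip(0, h - 1), u.clip(0, w - 1)], 0.0)
    valid &= d > 0

    # Truncated signed distance along the viewing ray.
    sdf = np.clip(d - z, -mu, mu)
    update = valid & (sdf > -mu)  # ignore voxels far behind the surface

    # Weighted running average of the TSDF, with the weight capped.
    w_new = 1.0
    tsdf[update] = ((weights[update] * tsdf[update] + w_new * sdf[update])
                    / (weights[update] + w_new))
    weights[update] = np.minimum(weights[update] + w_new, w_max)

The reconstructed surface is the zero crossing of the fused TSDF; the paper extracts it by raycasting the volume, which also supplies the model-side points and normals used for tracking.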

RIS

TY - GEN
T1 - KinectFusion
T2 - real-time dense surface mapping and tracking
AU - Newcombe, Richard A.
AU - Izadi, Shahram
AU - Hilliges, Otmar
AU - Molyneaux, David
AU - Kim, David
AU - Davison, Andrew J.
AU - Kohli, Pushmeet
AU - Shotton, Jamie
AU - Hodges, Steve
AU - Fitzgibbon, Andrew
PY - 2011
Y1 - 2011
N2 - We present a system for accurate real-time mapping of complex and arbitrary indoor scenes in variable lighting conditions, using only a moving low-cost depth camera and commodity graphics hardware. We fuse all of the depth data streamed from a Kinect sensor into a single global implicit surface model of the observed scene in real-time. The current sensor pose is simultaneously obtained by tracking the live depth frame relative to the global model using a coarse-to-fine iterative closest point (ICP) algorithm, which uses all of the observed depth data available. We demonstrate the advantages of tracking against the growing full surface model compared with frame-to-frame tracking, obtaining tracking and mapping results in constant time within room sized scenes with limited drift and high accuracy. We also show both qualitative and quantitative results relating to various aspects of our tracking and mapping system. Modelling of natural scenes, in real-time with only commodity sensor and GPU hardware, promises an exciting step forward in augmented reality (AR), in particular, it allows dense surfaces to be reconstructed in real-time, with a level of detail and robustness beyond any solution yet presented using passive computer vision.
AB - We present a system for accurate real-time mapping of complex and arbitrary indoor scenes in variable lighting conditions, using only a moving low-cost depth camera and commodity graphics hardware. We fuse all of the depth data streamed from a Kinect sensor into a single global implicit surface model of the observed scene in real-time. The current sensor pose is simultaneously obtained by tracking the live depth frame relative to the global model using a coarse-to-fine iterative closest point (ICP) algorithm, which uses all of the observed depth data available. We demonstrate the advantages of tracking against the growing full surface model compared with frame-to-frame tracking, obtaining tracking and mapping results in constant time within room sized scenes with limited drift and high accuracy. We also show both qualitative and quantitative results relating to various aspects of our tracking and mapping system. Modelling of natural scenes, in real-time with only commodity sensor and GPU hardware, promises an exciting step forward in augmented reality (AR), in particular, it allows dense surfaces to be reconstructed in real-time, with a level of detail and robustness beyond any solution yet presented using passive computer vision.
KW - AR
KW - Dense Reconstruction
KW - Depth Cameras
KW - GPU
KW - Real-Time
KW - SLAM
KW - Tracking
KW - Volumetric Representation
U2 - 10.1109/ISMAR.2011.6092378
DO - 10.1109/ISMAR.2011.6092378
M3 - Conference contribution/Paper
SN - 9781457721830
SP - 127
EP - 136
BT - Mixed and Augmented Reality (ISMAR), 2011 10th IEEE International Symposium on
PB - IEEE Computer Society
CY - Washington, DC, USA
ER -
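The sensor pose itself is obtained by aligning each live depth frame to the growing surface model with a coarse-to-fine ICP algorithm, using a point-to-plane error metric and projective data association. Assuming correspondences have already been associated, one linearized Gauss-Newton step under a small-angle approximation can be sketched as below; the function name and the dense lstsq solve are illustrative assumptions, whereas the paper builds the corresponding 6x6 normal-equation system with a parallel reduction on the GPU.

import numpy as np

def icp_point_to_plane_step(src_pts, dst_pts, dst_normals):
    # One linearized point-to-plane ICP update (small-angle approximation):
    # solve for x = [rx, ry, rz, tx, ty, tz] minimizing
    #   sum_i ((R p_i + t - q_i) . n_i)^2   with R ~ I + skew(r).
    # Inputs are (N, 3) arrays of associated source points, destination
    # points, and destination normals (hypothetical helper, not the
    # paper's GPU pipeline).
    A = np.hstack([np.cross(src_pts, dst_normals), dst_normals])  # (N, 6)
    b = -np.einsum("ij,ij->i", src_pts - dst_pts, dst_normals)    # (N,)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x  # incremental rotation r = x[:3], translation t = x[3:]

In the coarse-to-fine scheme this step would be iterated over a depth-image pyramid, re-associating points projectively and composing the incremental transform after each solve.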