
Interactive Environment-Aware Handheld Projectors for Pervasive Computing Spaces

Research output: Contribution in Book/Report/Proceedings, Paper

Published

Publication date: 2012
Host publication: Pervasive Computing: 10th International Conference, Pervasive 2012, Newcastle, UK, June 18-22, 2012. Proceedings
Editors: Judy Kay, Paul Lukowicz, Hideyuki Tokuda, Patrick Olivier, Antonio Krüger
Place of publication: Berlin
Publisher: Springer
Pages: 197-215
Number of pages: 19
ISBN (Print): 978-3-642-31204-5
Original language: English

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer
Volume: 7319
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Abstract

This paper presents two novel handheld projector systems for indoor pervasive computing spaces. These projection-based devices are “aware” of their environment in ways not demonstrated previously. They offer both spatial awareness, where the system infers the location and orientation of the device in 3D space, and geometry awareness, where the system constructs the 3D structure of the world around it, which can encompass the user as well as other physical objects, such as furniture and walls. Previous work in this area has predominantly focused on infrastructure-based spatial-aware handheld projection and interaction. Our prototypes offer greater levels of environment awareness, but achieve this using two opposing approaches: the first relies on infrastructure-based sensing, while the second is infrastructure-less. We highlight a series of interactions, including direct touch as well as in-air gestures, which leverage the shadow of the user for interaction. We describe the technical challenges in realizing these novel systems, and compare them directly by quantifying their location tracking and input sensing capabilities.