
Virtual lifeline: Multimodal sensor data fusion for robust navigation in unknown environments

Research output: Contribution to Journal/Magazine › Journal article › peer-review

  • Widyawan
  • Gerald Pirkl
  • Daniele Munaretto
  • Carl Fischer
  • Chunlei An
  • Paul Lukowicz
  • Martin Klepal
  • Andreas Timm-Giel
  • Joerg Widmer
  • Dirk Pesch
  • Hans Gellersen
Journal publication date: 06/2012
Journal: Pervasive and Mobile Computing
Issue number: 3
Volume: 8
Number of pages: 14
Pages (from-to): 388-401
Publication status: Published
Original language: English

Abstract

We present a novel, multimodal indoor navigation technique that combines pedestrian dead reckoning (PDR) with relative position information from wireless sensor nodes. It is motivated by emergency response scenarios where no fixed or pre-deployed global positioning infrastructure is available and where typical motion patterns defeat standard PDR systems. We use RF and ultrasound beacons to periodically re-align the PDR system and reduce the impact of incremental error accumulation. Unlike previous work on multimodal positioning, we allow the beacons to be dynamically deployed (dropped by the user) at previously unknown locations. A key contribution of this paper is to show that, even though the beacon locations are not known in terms of absolute coordinates, they significantly improve the performance of the system. This effect is especially relevant when a user re-traces (parts of) a previously travelled path, or lingers and moves around in an irregular pattern at a single location for an extended period of time. Both situations are common and relevant in emergency response scenarios. We describe the system architecture and the fusion algorithms, and provide an in-depth evaluation in a large-scale, realistic experiment.
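The paper's actual fusion algorithms are described in the full text; as a rough, hypothetical illustration of the re-alignment idea only, the Python sketch below simulates a walker whose PDR estimate drifts, drops a virtual beacon whose "location" is simply the PDR estimate at drop time, and later uses noisy range measurements to that beacon to pull the estimate back toward consistency with the earlier track. All names, noise constants, the coverage radius, and the scalar-gain range update are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(42)

STEP_LEN = 0.7        # nominal stride length (m) -- illustrative value
LEN_NOISE = 0.05      # stride-length noise std dev (m)
HEADING_NOISE = 0.03  # heading noise std dev (rad)
RANGE_NOISE = 0.10    # ultrasound ranging noise std dev (m)
US_RANGE = 5.0        # assumed ultrasound coverage radius (m)

def pdr_step(est, truth, heading):
    """One PDR step: the estimate integrates a noisy stride and heading,
    so its error grows step by step; the ground truth does not."""
    noisy_h = heading + rng.normal(0, HEADING_NOISE)
    est = est + (STEP_LEN + rng.normal(0, LEN_NOISE)) * np.array(
        [np.cos(noisy_h), np.sin(noisy_h)])
    truth = truth + STEP_LEN * np.array([np.cos(heading), np.sin(heading)])
    return est, truth

def range_realign(est, beacon_est, measured_range, gain=0.5):
    """Scalar-gain innovation update: slide the PDR estimate along the
    line of sight to the stored beacon estimate until the estimated
    range agrees with the measured ultrasound range."""
    offset = est - beacon_est
    dist = np.linalg.norm(offset)
    if dist < 1e-9:
        return est
    return est + gain * (measured_range - dist) * offset / dist

# Walk east, drop a beacon mid-leg, loop back past it, and return.
est = truth = np.zeros(2)
path = [0.0] * 20 + [np.pi / 2] * 5 + [np.pi] * 20 + [-np.pi / 2] * 5
beacon_truth = beacon_est = None

for i, heading in enumerate(path):
    est, truth = pdr_step(est, truth, heading)
    if i == 10:                      # drop a beacon: its "location" is just
        beacon_truth = truth.copy()  # the current PDR estimate -- no absolute
        beacon_est = est.copy()      # coordinates are ever known
    elif beacon_est is not None:
        true_range = np.linalg.norm(truth - beacon_truth)
        if true_range < US_RANGE:    # beacon is within ultrasound range
            z = true_range + rng.normal(0, RANGE_NOISE)
            est = range_realign(est, beacon_est, z)

print(f"final PDR error: {np.linalg.norm(est - truth):.2f} m")
```

Because the stored beacon position lives in the PDR frame, such an update cannot remove error already present at drop time; what it can do is bound the drift accumulated afterwards whenever the user re-traces that part of the path, which mirrors the effect the abstract highlights for beacons at unknown locations.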