
iPfad: an iPad App for the real-time recording and encoding of direct observations of wayfinding behaviour

Research output: Contribution to conference (without ISBN/ISSN) › Conference paper › peer-review

Published
Publication date: 26/06/2012
Original language: English
Event: International Association of People-Environment Studies
Duration: 26/06/2012 - 26/06/2012

Conference

Conference: International Association of People-Environment Studies
Abbreviated title: IAPS 22
Period: 26/06/12 - 26/06/12

Abstract

This paper describes an iPad App, known as iPfad ('Pfad' means 'path' in German), created for the real-time recording of wayfinding behaviour in buildings and outdoor environments. The paper begins by describing antecedents to the iPfad App, covering computer- and hand-based methods formerly employed by researchers of navigation in complex environments. In particular, it describes the real-time data-logging tool WayTracer (Kuhnmuench & Strube, 2009), upon which iPfad was based, highlighting the similarities and differences between the two approaches.

The main section of the paper describes the primary features of the iPfad App. These consist of the 'Home' page (where new participant records are entered and where the experiment-recording phase is initialised) and the 'Map' page (for behaviour recording/encoding). The Map page screen is further divided into two sections: the upper 'map' section and the lower 'events' section. The map section displays the current floor level (for a multi-level building) and is a 'drawable' part of the screen, allowing the experimenter to trace the path of a participant onto the screen while observing their progress through an environment. (For GPS-enabled iPads in outdoor environments with good GPS coverage, this path is created automatically; however, since this is not applicable to most interior settings, the hand-drawn trace option is available.) The coordinates of the participant's location are recorded in real time. The lower half of the screen consists of a series of buttons to log actions.

The buttons are classified as changes in floor level (the displayed map is updated accordingly), path events (starting a new task, pausing, backtracking, arriving at a false destination, becoming lost/giving up the task), the use of external aids (signage, maps, external views to the outside or equivalent invariant views, asking for help) and other log/action events (saving a compass direction in a pointing task, or recording the location of a significant remark if simultaneously recording an audio transcript). Every time an event is logged, a coloured 'dot' is created on the traced path; it is time-stamped and its location noted in the log file. The text-based log files, annotated maps and any associated audio files are saved for subsequent retrieval.

The third section of the paper describes a case study in which the iPfad App was used. This took place in the Seattle Public Library and consisted of four wayfinding tasks undertaken by 28 participants and observed using iPfad. Post-experiment, the experimenters were asked to gauge the usability of the iPfad App by filling in a usability questionnaire and selecting descriptive words from a version of Microsoft's Product Reaction Cards (Benedek and Miner, 2002). The final section discusses the usability of the iPfad App based on the case study's feedback and considers implications for the automation of data gathering of human behaviour in the future.
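The event-logging scheme described in the abstract (each button press producing a time-stamped event, tagged with its location on the traced path and appended to a text-based log file) might be sketched as follows. This is a minimal illustration only: the class name, field names and JSON-lines serialisation are assumptions for the sketch, not the App's actual data format.

```python
import json
import time

class WayfindingLog:
    """Hypothetical sketch of iPfad-style event logging: each logged event
    (floor change, path event, external-aid use, etc.) is time-stamped,
    tagged with its map coordinates, and collected for a text-based log."""

    def __init__(self, participant_id):
        self.participant_id = participant_id
        self.records = []  # in the App these would be written to a log file

    def log_event(self, event_type, x, y, floor, timestamp=None):
        """Record one event with its time and location on the traced path."""
        record = {
            "participant": self.participant_id,
            "event": event_type,  # e.g. "start_task", "backtrack", "use_signage"
            "x": x, "y": y,       # coordinates on the hand-drawn trace
            "floor": floor,       # current floor level of a multi-level building
            "t": timestamp if timestamp is not None else time.time(),
        }
        self.records.append(record)
        return record

    def dump(self):
        """Serialise the log as JSON lines (an assumed format, for illustration)."""
        return "\n".join(json.dumps(r) for r in self.records)

# Usage: one participant starting a task, then consulting signage.
log = WayfindingLog("P01")
log.log_event("start_task", x=12.5, y=40.0, floor=3, timestamp=0.0)
log.log_event("use_signage", x=18.2, y=42.7, floor=3, timestamp=14.6)
print(len(log.records))  # 2
```

Keeping the log text-based, as the paper describes, means the records can be retrieved and analysed later alongside the annotated maps and audio files.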