
Supporting Real-Time Contextual Inquiry through Sensor Data

Research output: Contribution to conference (without ISBN/ISSN) › Conference paper › peer-review

Published
Publication date: 9/11/2019
Original language: English
Event: EPIC 2019 - Rhode Island School of Design, Providence, United States
Duration: 9/11/2019 – 12/11/2019
https://www.epicpeople.org/epic2019-video/

Conference

Conference: EPIC 2019
Country: United States
City: Providence
Period: 9/11/19 – 12/11/19

Abstract

A key challenge in carrying out product design research is obtaining rich contextual information about use in the wild. We present a method that algorithmically mediates between participants, researchers, and objects in order to enable real-time collaborative sensemaking. It facilitates contextual inquiry, revealing behaviours and motivations that frame product use in the wild. In particular, we are interested in developing a practice of use-driven design, where products become research tools that generate design insights grounded in user experiences. The value of this method was explored through the deployment of a collection of Bluetooth speakers that capture live data about their movement and operation and stream it to remote but co-present researchers. Researchers monitored a visualisation of the real-time data to build up a picture of how the speakers were being used, responding to moments of activity within the data, initiating text conversations, and prompting participants to capture photos and video. Based on the findings of this exploratory study, we discuss the value of this method, how it compares to contemporary research practices, and the potential of machine learning to scale it up for use within industrial contexts. As greater agency is given to both objects and algorithms, we explore ways to empower ethnographers and participants to actively collaborate within remote real-time research.