
Supporting Real-Time Contextual Inquiry through Sensor Data

Research output: Contribution to conference - Without ISBN/ISSN › Conference paper › peer-review

Published

Standard

Supporting Real-Time Contextual Inquiry through Sensor Data. / Gorkovenko, Katerina; Murray-Rust, Dave; Burnett, Dan et al.
2019. Paper presented at EPIC 2019, Providence, Rhode Island, United States.


Harvard

Gorkovenko, K, Murray-Rust, D, Burnett, D, Thorp, J & Richards, D 2019, 'Supporting Real-Time Contextual Inquiry through Sensor Data', Paper presented at EPIC 2019, Providence, United States, 9/11/19 - 12/11/19. <https://www.epicpeople.org/supporting-real-time-contextual-inquiry-sensor-data/>

APA

Gorkovenko, K., Murray-Rust, D., Burnett, D., Thorp, J., & Richards, D. (2019). Supporting Real-Time Contextual Inquiry through Sensor Data. Paper presented at EPIC 2019, Providence, Rhode Island, United States. https://www.epicpeople.org/supporting-real-time-contextual-inquiry-sensor-data/

Vancouver

Gorkovenko K, Murray-Rust D, Burnett D, Thorp J, Richards D. Supporting Real-Time Contextual Inquiry through Sensor Data. 2019. Paper presented at EPIC 2019, Providence, Rhode Island, United States.

Author

Gorkovenko, Katerina; Murray-Rust, Dave; Burnett, Dan et al. / Supporting Real-Time Contextual Inquiry through Sensor Data. Paper presented at EPIC 2019, Providence, Rhode Island, United States.

Bibtex

@conference{fd106016ff29428fa57d57356447f109,
title = "Supporting Real-Time Contextual Inquiry through Sensor Data",
abstract = "A key challenge in carrying out product design research is obtaining rich contextual information about use in the wild. We present a method that algorithmically mediates between participants, researchers, and objects in order to enable real-time collaborative sensemaking. It facilitates contextual inquiry, revealing behaviours and motivations that frame product use in the wild. In particular, we are interested in developing a practice of use driven design, where products become research tools that generate design insights grounded in user experiences. The value of this method was explored through the deployment of a collection of Bluetooth speakers that capture and stream live data to remote but co-present researchers about their movement and operation. Researchers monitored a visualisation of the real-time data to build up a picture of how the speakers were being used, responding to moments of activity within the data, initiating text conversations and prompting participants to capture photos and video. Based on the findings of this explorative study, we discuss the value of this method, how it compares to contemporary research practices, and the potential of machine learning to scale it up for use within industrial contexts. As greater agency is given to both objects and algorithms, we explore ways to empower ethnographers and participants to actively collaborate within remote real-time research.",
keywords = "digital sensors, experience sampling method, human-computer interaction, Internet of Things, machine learning, remote research, sensor data",
author = "Katerina Gorkovenko and Dave Murray-Rust and Dan Burnett and James Thorp and Daniel Richards",
year = "2019",
month = nov,
day = "9",
language = "English",
note = "EPIC 2019 ; Conference date: 09-11-2019 Through 12-11-2019",
url = "https://www.epicpeople.org/epic2019-video/",
}

RIS

TY - CONF

T1 - Supporting Real-Time Contextual Inquiry through Sensor Data

AU - Gorkovenko, Katerina

AU - Murray-Rust, Dave

AU - Burnett, Dan

AU - Thorp, James

AU - Richards, Daniel

PY - 2019/11/9

Y1 - 2019/11/9

N2 - A key challenge in carrying out product design research is obtaining rich contextual information about use in the wild. We present a method that algorithmically mediates between participants, researchers, and objects in order to enable real-time collaborative sensemaking. It facilitates contextual inquiry, revealing behaviours and motivations that frame product use in the wild. In particular, we are interested in developing a practice of use driven design, where products become research tools that generate design insights grounded in user experiences. The value of this method was explored through the deployment of a collection of Bluetooth speakers that capture and stream live data to remote but co-present researchers about their movement and operation. Researchers monitored a visualisation of the real-time data to build up a picture of how the speakers were being used, responding to moments of activity within the data, initiating text conversations and prompting participants to capture photos and video. Based on the findings of this explorative study, we discuss the value of this method, how it compares to contemporary research practices, and the potential of machine learning to scale it up for use within industrial contexts. As greater agency is given to both objects and algorithms, we explore ways to empower ethnographers and participants to actively collaborate within remote real-time research.

AB - A key challenge in carrying out product design research is obtaining rich contextual information about use in the wild. We present a method that algorithmically mediates between participants, researchers, and objects in order to enable real-time collaborative sensemaking. It facilitates contextual inquiry, revealing behaviours and motivations that frame product use in the wild. In particular, we are interested in developing a practice of use driven design, where products become research tools that generate design insights grounded in user experiences. The value of this method was explored through the deployment of a collection of Bluetooth speakers that capture and stream live data to remote but co-present researchers about their movement and operation. Researchers monitored a visualisation of the real-time data to build up a picture of how the speakers were being used, responding to moments of activity within the data, initiating text conversations and prompting participants to capture photos and video. Based on the findings of this explorative study, we discuss the value of this method, how it compares to contemporary research practices, and the potential of machine learning to scale it up for use within industrial contexts. As greater agency is given to both objects and algorithms, we explore ways to empower ethnographers and participants to actively collaborate within remote real-time research.

KW - digital sensors

KW - experience sampling method

KW - human-computer interaction

KW - Internet of Things

KW - machine learning

KW - remote research

KW - sensor data

M3 - Conference paper

T2 - EPIC 2019

Y2 - 9 November 2019 through 12 November 2019

ER -