
Electronic data

  • 2018_chi-sig-eval

    Rights statement: © ACM, 2018. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, 2018 http://doi.acm.org/10.1145/3170427.3185371

    Accepted author manuscript, 155 KB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI: https://doi.org/10.1145/3170427.3185371


Evaluating HCI research beyond usability

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Evaluating HCI research beyond usability. / Remy, Christian; Bates, Oliver; Mankoff, Jennifer et al.
CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems: Engage with CHI. Association for Computing Machinery (ACM), 2018. SIG13 (Conference on Human Factors in Computing Systems - Proceedings; Vol. 2018-April).

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Harvard

Remy, C, Bates, O, Mankoff, J & Friday, A 2018, Evaluating HCI research beyond usability. in CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems: Engage with CHI., SIG13, Conference on Human Factors in Computing Systems - Proceedings, vol. 2018-April, Association for Computing Machinery (ACM), 2018 CHI Conference on Human Factors in Computing Systems, CHI EA 2018, Montreal, Canada, 21/04/18. https://doi.org/10.1145/3170427.3185371

APA

Remy, C., Bates, O., Mankoff, J., & Friday, A. (2018). Evaluating HCI research beyond usability. In CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems: Engage with CHI Article SIG13 (Conference on Human Factors in Computing Systems - Proceedings; Vol. 2018-April). Association for Computing Machinery (ACM). https://doi.org/10.1145/3170427.3185371

Vancouver

Remy C, Bates O, Mankoff J, Friday A. Evaluating HCI research beyond usability. In CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems: Engage with CHI. Association for Computing Machinery (ACM). 2018. SIG13. (Conference on Human Factors in Computing Systems - Proceedings). doi: 10.1145/3170427.3185371

Author

Remy, Christian ; Bates, Oliver ; Mankoff, Jennifer et al. / Evaluating HCI research beyond usability. CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems: Engage with CHI. Association for Computing Machinery (ACM), 2018. (Conference on Human Factors in Computing Systems - Proceedings).

Bibtex

@inproceedings{5311aaa3ac9f4ac89ee287e3c28e57cf,
title = "Evaluating HCI research beyond usability",
abstract = "Evaluating research artefacts is an important step to showcase the validity of a chosen approach. The CHI community has developed and agreed upon a large variety of evaluation methods for HCI research; however, sometimes those methods are not applicable or not sufficient. This is especially the case when the contribution lies within the context of the application area, such as for research in sustainable HCI, HCI for development, or design fiction and futures studies. In this SIG, we invite the CHI community to share their insights from projects that encountered problems in evaluating research and aim to discuss solutions for this difficult topic. We invite researchers from all areas of HCI research who are interested to engage in a debate of issues in the process of validating research artefacts.",
keywords = "Design Fiction, Evaluation, Futures Studies, HCI4D, Research Methods, Sustainable HCI, Validation",
author = "Christian Remy and Oliver Bates and Jennifer Mankoff and Adrian Friday",
note = "{\textcopyright} ACM, 2018. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, 2018 http://doi.acm.org/10.1145/3170427.3185371; 2018 CHI Conference on Human Factors in Computing Systems, CHI EA 2018 ; Conference date: 21-04-2018 Through 26-04-2018",
year = "2018",
month = apr,
day = "20",
doi = "10.1145/3170427.3185371",
language = "English",
series = "Conference on Human Factors in Computing Systems - Proceedings",
publisher = "Association for Computing Machinery (ACM)",
booktitle = "CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems",

}

RIS

TY - GEN

T1 - Evaluating HCI research beyond usability

AU - Remy, Christian

AU - Bates, Oliver

AU - Mankoff, Jennifer

AU - Friday, Adrian

N1 - © ACM, 2018. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, 2018 http://doi.acm.org/10.1145/3170427.3185371

PY - 2018/4/20

Y1 - 2018/4/20

N2 - Evaluating research artefacts is an important step to showcase the validity of a chosen approach. The CHI community has developed and agreed upon a large variety of evaluation methods for HCI research; however, sometimes those methods are not applicable or not sufficient. This is especially the case when the contribution lies within the context of the application area, such as for research in sustainable HCI, HCI for development, or design fiction and futures studies. In this SIG, we invite the CHI community to share their insights from projects that encountered problems in evaluating research and aim to discuss solutions for this difficult topic. We invite researchers from all areas of HCI research who are interested to engage in a debate of issues in the process of validating research artefacts.

AB - Evaluating research artefacts is an important step to showcase the validity of a chosen approach. The CHI community has developed and agreed upon a large variety of evaluation methods for HCI research; however, sometimes those methods are not applicable or not sufficient. This is especially the case when the contribution lies within the context of the application area, such as for research in sustainable HCI, HCI for development, or design fiction and futures studies. In this SIG, we invite the CHI community to share their insights from projects that encountered problems in evaluating research and aim to discuss solutions for this difficult topic. We invite researchers from all areas of HCI research who are interested to engage in a debate of issues in the process of validating research artefacts.

KW - Design Fiction

KW - Evaluation

KW - Futures Studies

KW - HCI4D

KW - Research Methods

KW - Sustainable HCI

KW - Validation

U2 - 10.1145/3170427.3185371

DO - 10.1145/3170427.3185371

M3 - Conference contribution/Paper

AN - SCOPUS:85052025182

T3 - Conference on Human Factors in Computing Systems - Proceedings

BT - CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems

PB - Association for Computing Machinery (ACM)

T2 - 2018 CHI Conference on Human Factors in Computing Systems, CHI EA 2018

Y2 - 21 April 2018 through 26 April 2018

ER -