Electronic data

  • Remy2018eval

    Accepted author manuscript, 667 KB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License


Evaluation beyond Usability: Validating Sustainable HCI Research

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Evaluation beyond Usability: Validating Sustainable HCI Research. / Remy, Christian; Bates, Oliver Emile Glaves; Dix, Alan et al.
CHI '18 Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. New York: ACM, 2018. 216.

Harvard

Remy, C, Bates, OEG, Dix, A, Thomas, V, Hazas, MD, Friday, AJ & Huang, E 2018, Evaluation beyond Usability: Validating Sustainable HCI Research. in CHI '18 Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems., 216, ACM, New York. https://doi.org/10.1145/3173574.3173790

APA

Remy, C., Bates, O. E. G., Dix, A., Thomas, V., Hazas, M. D., Friday, A. J., & Huang, E. (2018). Evaluation beyond Usability: Validating Sustainable HCI Research. In CHI '18 Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Article 216). ACM. https://doi.org/10.1145/3173574.3173790

Vancouver

Remy C, Bates OEG, Dix A, Thomas V, Hazas MD, Friday AJ et al. Evaluation beyond Usability: Validating Sustainable HCI Research. In CHI '18 Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. New York: ACM. 2018. 216. doi: 10.1145/3173574.3173790

Author

Remy, Christian ; Bates, Oliver Emile Glaves ; Dix, Alan et al. / Evaluation beyond Usability : Validating Sustainable HCI Research. CHI '18 Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. New York : ACM, 2018.

Bibtex

@inproceedings{8f75989625f74ad5a072bd0b1296f357,
title = "Evaluation beyond Usability: Validating Sustainable HCI Research",
abstract = "The evaluation of research artefacts is an important step to validate research contributions. Sub-disciplines of HCI often pursue primary goals other than usability, such as Sustainable HCI (SHCI), HCI for development, or health and wellbeing. For such disciplines, established evaluation methods are not always appropriate or sufficient, and new conventions for identifying, discussing, and justifying suitable evaluation methods need to be established. In this paper, we revisit the purpose and goals of evaluation in HCI and SHCI, and elicit five key elements that can provide guidance to identifying evaluation methods for SHCI research. Our essay is meant as a starting point for discussing current and improving future evaluation practice in SHCI; we also believe it holds value for other subdisciplines in HCI that encounter similar challenges while evaluating their research.",
author = "Christian Remy and Bates, {Oliver Emile Glaves} and Alan Dix and Vanessa Thomas and Hazas, {Michael David} and Friday, {Adrian John} and Elaine Huang",
year = "2018",
month = apr,
day = "21",
doi = "10.1145/3173574.3173790",
language = "English",
isbn = "9781450356206",
booktitle = "CHI '18 Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems",
publisher = "ACM",
address = "New York",
}

RIS

TY - GEN

T1 - Evaluation beyond Usability

T2 - Validating Sustainable HCI Research

AU - Remy, Christian

AU - Bates, Oliver Emile Glaves

AU - Dix, Alan

AU - Thomas, Vanessa

AU - Hazas, Michael David

AU - Friday, Adrian John

AU - Huang, Elaine

PY - 2018/4/21

Y1 - 2018/4/21

N2 - The evaluation of research artefacts is an important step to validate research contributions. Sub-disciplines of HCI often pursue primary goals other than usability, such as Sustainable HCI (SHCI), HCI for development, or health and wellbeing. For such disciplines, established evaluation methods are not always appropriate or sufficient, and new conventions for identifying, discussing, and justifying suitable evaluation methods need to be established. In this paper, we revisit the purpose and goals of evaluation in HCI and SHCI, and elicit five key elements that can provide guidance to identifying evaluation methods for SHCI research. Our essay is meant as a starting point for discussing current and improving future evaluation practice in SHCI; we also believe it holds value for other subdisciplines in HCI that encounter similar challenges while evaluating their research.

AB - The evaluation of research artefacts is an important step to validate research contributions. Sub-disciplines of HCI often pursue primary goals other than usability, such as Sustainable HCI (SHCI), HCI for development, or health and wellbeing. For such disciplines, established evaluation methods are not always appropriate or sufficient, and new conventions for identifying, discussing, and justifying suitable evaluation methods need to be established. In this paper, we revisit the purpose and goals of evaluation in HCI and SHCI, and elicit five key elements that can provide guidance to identifying evaluation methods for SHCI research. Our essay is meant as a starting point for discussing current and improving future evaluation practice in SHCI; we also believe it holds value for other subdisciplines in HCI that encounter similar challenges while evaluating their research.

U2 - 10.1145/3173574.3173790

DO - 10.1145/3173574.3173790

M3 - Conference contribution/Paper

SN - 9781450356206

BT - CHI '18 Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems

PB - ACM

CY - New York

ER -