Electronic data

  • Brunfaut_Harding_Batty_(2018) Author accepted manuscript

    Rights statement: This is the author’s version of a work that was accepted for publication in Assessing Writing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Assessing Writing, 36, 3-18, 2018. DOI: 10.1016/j.asw.2018.02.003

    Accepted author manuscript, 1.08 MB, PDF document

    Available under license: CC BY-NC-ND: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License

Links

Text available via DOI: 10.1016/j.asw.2018.02.003

Going online: The effect of mode of delivery on performances and perceptions on an English L2 writing test suite

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Going online: The effect of mode of delivery on performances and perceptions on an English L2 writing test suite. / Brunfaut, Tineke; Harding, Luke; Batty, Aaron.
In: Assessing Writing, Vol. 36, 04.2018, p. 3-18.

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Vancouver

Brunfaut T, Harding L, Batty A. Going online: The effect of mode of delivery on performances and perceptions on an English L2 writing test suite. Assessing Writing. 2018 Apr;36:3-18. Epub 2018 Feb 26. doi: 10.1016/j.asw.2018.02.003

Bibtex

@article{63adc954d7e54404a6753a9746c8db6a,
title = "Going online: The effect of mode of delivery on performances and perceptions on an English L2 writing test suite",
abstract = "In response to changing stakeholder needs, large-scale language test providers have increasingly considered the feasibility of delivering paper-based examinations online. Evidence is required, however, to determine whether online delivery of writing tests results in changes to writing performance reflected in differential test scores across delivery modes, and whether test-takers hold favourable perceptions of online delivery. The current study aimed to determine the effect of delivery mode on the two writing tasks (reading-into-writing and extended writing) within the Trinity College London Integrated Skills in English (ISE) test suite across three proficiency levels (CEFR B1-C1). 283 test-takers (107 at ISE I/B1, 109 at ISE II/B2, and 67 at ISE III/C1) completed both writing tasks in paper-based and online mode. Test-takers also completed a questionnaire to gauge perceptions of the impact, usability and fairness of the delivery modes. Many-facet Rasch measurement (MFRM) analysis of scores revealed that delivery mode had no discernible effect, apart from the reading-into-writing task at ISE I, where the paper-based mode was slightly easier. Test-takers generally held more positive perceptions of the online delivery mode, although technical problems were reported. Findings are discussed with reference to the need for further research into interactions between delivery mode, task and level.",
keywords = "Paper-based testing of writing, Computer-based testing of writing, Online testing of writing, Mode of delivery, Perceptions, Second language writing assessment",
author = "Tineke Brunfaut and Luke Harding and Aaron Batty",
year = "2018",
month = apr,
doi = "10.1016/j.asw.2018.02.003",
language = "English",
volume = "36",
pages = "3--18",
journal = "Assessing Writing",
issn = "1075-2935",
publisher = "Elsevier Ltd",

}

RIS

TY - JOUR

T1 - Going online

T2 - The effect of mode of delivery on performances and perceptions on an English L2 writing test suite

AU - Brunfaut, Tineke

AU - Harding, Luke

AU - Batty, Aaron

PY - 2018/4

Y1 - 2018/4

N2 - In response to changing stakeholder needs, large-scale language test providers have increasingly considered the feasibility of delivering paper-based examinations online. Evidence is required, however, to determine whether online delivery of writing tests results in changes to writing performance reflected in differential test scores across delivery modes, and whether test-takers hold favourable perceptions of online delivery. The current study aimed to determine the effect of delivery mode on the two writing tasks (reading-into-writing and extended writing) within the Trinity College London Integrated Skills in English (ISE) test suite across three proficiency levels (CEFR B1-C1). 283 test-takers (107 at ISE I/B1, 109 at ISE II/B2, and 67 at ISE III/C1) completed both writing tasks in paper-based and online mode. Test-takers also completed a questionnaire to gauge perceptions of the impact, usability and fairness of the delivery modes. Many-facet Rasch measurement (MFRM) analysis of scores revealed that delivery mode had no discernible effect, apart from the reading-into-writing task at ISE I, where the paper-based mode was slightly easier. Test-takers generally held more positive perceptions of the online delivery mode, although technical problems were reported. Findings are discussed with reference to the need for further research into interactions between delivery mode, task and level.

AB - In response to changing stakeholder needs, large-scale language test providers have increasingly considered the feasibility of delivering paper-based examinations online. Evidence is required, however, to determine whether online delivery of writing tests results in changes to writing performance reflected in differential test scores across delivery modes, and whether test-takers hold favourable perceptions of online delivery. The current study aimed to determine the effect of delivery mode on the two writing tasks (reading-into-writing and extended writing) within the Trinity College London Integrated Skills in English (ISE) test suite across three proficiency levels (CEFR B1-C1). 283 test-takers (107 at ISE I/B1, 109 at ISE II/B2, and 67 at ISE III/C1) completed both writing tasks in paper-based and online mode. Test-takers also completed a questionnaire to gauge perceptions of the impact, usability and fairness of the delivery modes. Many-facet Rasch measurement (MFRM) analysis of scores revealed that delivery mode had no discernible effect, apart from the reading-into-writing task at ISE I, where the paper-based mode was slightly easier. Test-takers generally held more positive perceptions of the online delivery mode, although technical problems were reported. Findings are discussed with reference to the need for further research into interactions between delivery mode, task and level.

KW - Paper-based testing of writing

KW - Computer-based testing of writing

KW - Online testing of writing

KW - Mode of delivery

KW - Perceptions

KW - Second language writing assessment

U2 - 10.1016/j.asw.2018.02.003

DO - 10.1016/j.asw.2018.02.003

M3 - Journal article

VL - 36

SP - 3

EP - 18

JO - Assessing Writing

JF - Assessing Writing

SN - 1075-2935

ER -
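
Note on the analysis named in the abstract: the writing scores were analysed with many-facet Rasch measurement (MFRM). As a rough sketch only, assuming that test-taker, task, rater and delivery mode were modelled as facets (the exact facet structure used in the study is not given on this page), such a model can be written in LaTeX as

    \log \frac{P_{ntjmk}}{P_{ntjm(k-1)}} = B_n - D_t - C_j - M_m - F_k

where P_{ntjmk} is the probability that test-taker n is awarded category k rather than k-1 on task t by rater j in delivery mode m, B_n is the test-taker's ability, D_t the task's difficulty, C_j the rater's severity, M_m the relative difficulty of the delivery mode, and F_k the threshold between adjacent score categories.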