
Electronic data

  • Brunfaut_Harding_Batty_(2018) Author accepted manuscript

    Rights statement: This is the author’s version of a work that was accepted for publication in Assessing Writing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Assessing Writing, ???, ??, 2018 DOI:

    Accepted author manuscript, 1 MB, PDF-document

    Embargo ends: 26/02/19

    Available under license: CC BY-NC-ND: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License

Links

Text available via DOI:


Going online: The effect of mode of delivery on performances and perceptions on an English L2 writing test suite

Research output: Contribution to journal › Journal article

Journal publication date: 04/2018
Journal: Assessing Writing
Volume: 36
Pages (from-to): 3-18
State: Published
Early online date: 26/02/18
Original language: English

Abstract

In response to changing stakeholder needs, large-scale language test providers have increasingly considered the feasibility of delivering paper-based examinations online. Evidence is required, however, to determine whether online delivery of writing tests results in changes to writing performance reflected in differential test scores across delivery modes, and whether test-takers hold favourable perceptions of online delivery. The current study aimed to determine the effect of delivery mode on the two writing tasks (reading-into-writing and extended writing) within the Trinity College London Integrated Skills in English (ISE) test suite across three proficiency levels (CEFR B1-C1). 283 test-takers (107 at ISE I/B1, 109 at ISE II/B2, and 67 at ISE III/C1) completed both writing tasks in paper-based and online mode. Test-takers also completed a questionnaire to gauge perceptions of the impact, usability and fairness of the delivery modes. Many-facet Rasch measurement (MFRM) analysis of scores revealed that delivery mode had no discernible effect, apart from the reading-into-writing task at ISE I, where the paper-based mode was slightly easier. Test-takers generally held more positive perceptions of the online delivery mode, although technical problems were reported. Findings are discussed with reference to the need for further research into interactions between delivery mode, task and level.