Specification and evaluation of an assessment engine for educational games: Empowering educators with an assessment editor and a learning analytics dashboard. Chaudy, Y. & Connolly, T. Entertainment Computing, 27:209-224, Elsevier B.V., August 2018.
Assessment is a crucial aspect of any teaching and learning process. Educational games offer promising advantages for assessment: personalised feedback to students and an automated assessment process. However, while many teachers agree that educational games increase motivation, learning and retention, few of them are ready to fully trust such games as an assessment tool. We believe there are two main reasons for this lack of trust: educators are not given sufficient information about the gameplays, and many educational games are distributed as black boxes, unmodifiable by teachers. This paper presents an assessment engine designed to separate a game and its assessment. It allows teachers to modify a game's assessment after distribution and visualise gameplay data via a learning analytics dashboard. The engine was evaluated quantitatively by 31 educators. Findings were overall very positive: both the assessment editor and the learning analytics dashboard were rated useful and easy to use. The evaluation also indicates that, having access to EngAGe, educators would be more likely to trust a game's assessment. This paper concludes that EngAGe can be used by educators effectively to modify educational games' assessment and visualise gameplay data, and that it contributes to increasing their trust in educational games as an assessment tool.
@article{chaudy2018specification,
 title = {Specification and evaluation of an assessment engine for educational games: Empowering educators with an assessment editor and a learning analytics dashboard},
 year = {2018},
 keywords = {Assessment,Assessment editor,Assessment engine,Educational games,Learning analytics},
 pages = {209-224},
 volume = {27},
 month = {8},
 publisher = {Elsevier B.V.},
 day = {1},
 abstract = {Assessment is a crucial aspect of any teaching and learning process. Educational games offer promising advantages for assessment: personalised feedback to students and an automated assessment process. However, while many teachers agree that educational games increase motivation, learning and retention, few of them are ready to fully trust such games as an assessment tool. We believe there are two main reasons for this lack of trust: educators are not given sufficient information about the gameplays, and many educational games are distributed as black boxes, unmodifiable by teachers. This paper presents an assessment engine designed to separate a game and its assessment. It allows teachers to modify a game's assessment after distribution and visualise gameplay data via a learning analytics dashboard. The engine was evaluated quantitatively by 31 educators. Findings were overall very positive: both the assessment editor and the learning analytics dashboard were rated useful and easy to use. The evaluation also indicates that, having access to EngAGe, educators would be more likely to trust a game's assessment. This paper concludes that EngAGe can be used by educators effectively to modify educational games' assessment and visualise gameplay data, and that it contributes to increasing their trust in educational games as an assessment tool.},
 author = {Chaudy, Yaëlle and Connolly, Thomas},
 journal = {Entertainment Computing}
}
