Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., & Whitelock, D. Computers in Human Behavior, 76:703–714, Elsevier Ltd, November 2017.
Many researchers who study the impact of computer-based assessment (CBA) focus on the affordances or complexities of CBA approaches in comparison to traditional assessment methods. This study examines how CBA approaches were configured within and between modules, and the impact of assessment design on students' engagement, satisfaction, and pass rates. The analysis was conducted using a combination of longitudinal visualisations, correlational analysis, and fixed-effect models on 74 undergraduate modules and their 72,377 students. Our findings indicate that educators designed very different assessment strategies, which significantly influenced student engagement as measured by time spent in the virtual learning environment (VLE). Weekly analyses indicated that assessment activities were balanced with other learning activities, which suggests that educators tended to aim for a consistent workload when designing assessment strategies. Since most of the assessments were computer-based, students spent more time on the VLE during assessment weeks. By controlling for heterogeneity within and between modules, learning design could explain up to 69% of the variability in students' time spent on the VLE. Furthermore, assessment activities were significantly related to pass rates, but no clear relation with satisfaction was found. Our findings highlight the importance of CBA and learning design to how students learn online.
@article{nguyen2017examining,
 title = {Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates},
 year = {2017},
 keywords = {Academic retention,Computer-based assessment,Learner satisfaction,Learning analytics,Learning design,Virtual learning environment},
 pages = {703-714},
 volume = {76},
 month = {11},
 publisher = {Elsevier Ltd},
 abstract = {Many researchers who study the impact of computer-based assessment (CBA) focus on the affordances or complexities of CBA approaches in comparison to traditional assessment methods. This study examines how CBA approaches were configured within and between modules, and the impact of assessment design on students' engagement, satisfaction, and pass rates. The analysis was conducted using a combination of longitudinal visualisations, correlational analysis, and fixed-effect models on 74 undergraduate modules and their 72,377 students. Our findings indicate that educators designed very different assessment strategies, which significantly influenced student engagement as measured by time spent in the virtual learning environment (VLE). Weekly analyses indicated that assessment activities were balanced with other learning activities, which suggests that educators tended to aim for a consistent workload when designing assessment strategies. Since most of the assessments were computer-based, students spent more time on the VLE during assessment weeks. By controlling for heterogeneity within and between modules, learning design could explain up to 69% of the variability in students' time spent on the VLE. Furthermore, assessment activities were significantly related to pass rates, but no clear relation with satisfaction was found. Our findings highlight the importance of CBA and learning design to how students learn online.},
 author = {Nguyen, Quan and Rienties, Bart and Toetenel, Lisette and Ferguson, Rebecca and Whitelock, Denise},
 journal = {Computers in Human Behavior}
}