Principles for assessing adaptive online courses

Weiyu Chen, Carlee Joe-Wong, Christopher G. Brinton, Liang Zheng, Da Cao

Research output: Contribution to conference › Paper

Abstract

Adaptive online courses are designed to automatically customize material for different users, typically based on data captured during the course. Assessing the quality of these adaptive courses, however, can be difficult. Traditional assessment methods for (machine) learning algorithms, such as comparison against a ground truth, are often unavailable due to education’s unique goal of affecting both internal user knowledge, which cannot be directly measured, and external, measurable performance. Traditional metrics for education like quiz scores, on the other hand, do not necessarily capture the adaptive course’s ability to present the right material to different users. In this work, we present a mathematical framework for developing scalable, efficiently computable metrics for these courses that instructors can use to gauge the efficacy of the adaptation and of their course content. Our metric framework takes as input a set of quantities describing user activities in the course, and balances definitions of user consistency and overall efficacy as inferred from the quantity distributions. We support the metric definitions by comparing the results of a comprehensive statistical analysis with a sample metric evaluation on a dataset of roughly 5,000 users from an online chess platform. In doing so, we find that our metrics yield important insights about the course that are embedded in the larger statistical analysis, as well as additional insights into student drop-off rates.
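The abstract describes metrics that balance overall efficacy against cross-user consistency, computed from distributions of user-activity quantities. A minimal toy sketch of that idea is below; the function name, the specific efficacy/consistency definitions, and the weighting scheme are all illustrative assumptions, not the paper's actual formulation.

```python
import statistics

def course_metric(user_scores, alpha=0.5):
    """Toy metric balancing overall efficacy (mean outcome in [0, 1])
    against user consistency (low spread across users).
    alpha and both terms are illustrative, not the paper's definitions."""
    efficacy = statistics.mean(user_scores)      # higher mean outcome is better
    spread = statistics.pstdev(user_scores)      # lower spread across users is better
    consistency = 1.0 / (1.0 + spread)           # map spread into (0, 1]
    return alpha * efficacy + (1 - alpha) * consistency

# A course where all users do uniformly well scores higher than one with
# the same mean outcome but wide variation between users.
uniform = [0.8, 0.8, 0.8, 0.8]
varied = [1.0, 0.6, 1.0, 0.6]
print(course_metric(uniform) > course_metric(varied))
```

The two example cohorts have identical mean scores, so a mean-only metric (like an average quiz score) cannot distinguish them; the consistency term is what separates a course that adapts well for everyone from one that only works for some users.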

Original language: English (US)
State: Published - Jan 1 2018
Externally published: Yes
Event: 11th International Conference on Educational Data Mining, EDM 2018 - Buffalo, United States
Duration: Jul 15 2018 – Jul 18 2018

Conference

Conference: 11th International Conference on Educational Data Mining, EDM 2018
Country: United States
City: Buffalo
Period: 7/15/18 – 7/18/18

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Information Systems


Cite this

Chen, W., Joe-Wong, C., Brinton, C. G., Zheng, L., & Cao, D. (2018). Principles for assessing adaptive online courses. Paper presented at the 11th International Conference on Educational Data Mining, EDM 2018, Buffalo, United States.