Bianca Clavio Christensen, Brian Bemman, Hendrik Knoche, Rikke Gade
pp. 44 – 60 (https://doi.org/10.55612/s-5002-039-002)
Abstract
Technical education programs often exhibit poor student performance and, consequently, high rates of attrition. Providing students with early feedback on their learning progress can support their self-study activities or their decision-making regarding a change in educational direction. In this paper, we present a set of instruments designed to identify at-risk undergraduate students at a Problem-based Learning (PBL) university, using an introductory programming course as a case study. Collectively, these instruments form the basis of a proposed learning ecosystem designed to identify struggling students by predicting their final exam grades in this course. We implemented this ecosystem and analyzed how well the collected data predicted final exam scores. Best-subset and lasso regressions yielded several significant predictors. In addition to predictors known from the literature on exam performance and drop-out, such as midterm exam results and student retention factors, data from self-assessment quizzes, peer-reviewing activities, and interactive online exercises helped predict exam performance and identify struggling students.
Keywords: Academic performance, Student retention, Learning Management System, Learning Tools Interoperability, Problem-Based Learning, Flipped learning
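As a rough illustration of the kind of prediction pipeline the abstract describes (not the authors' actual code or data), the following minimal Python sketch fits a cross-validated lasso regression to course-activity features; the feature names and the CSV layout are hypothetical.

    # Minimal sketch: predicting final exam scores from course-activity data
    # with a lasso regression. All column names below are illustrative only.
    import pandas as pd
    from sklearn.linear_model import LassoCV
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    # Hypothetical per-student export (quiz scores, peer reviews,
    # interactive exercise completion, midterm result, retention survey).
    data = pd.read_csv("course_activity.csv")
    features = ["self_assessment_quiz", "peer_reviews_given",
                "interactive_exercises", "midterm_score", "retention_survey"]
    X, y = data[features], data["final_exam_score"]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    # The cross-validated lasso shrinks weak coefficients to zero, leaving a
    # sparse set of predictors, comparable in spirit to best-subset selection.
    model = LassoCV(cv=5, random_state=0).fit(X_train, y_train)

    print("Selected predictors:",
          [f for f, c in zip(features, model.coef_) if c != 0])
    print("Held-out R^2:", r2_score(y_test, model.predict(X_test)))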