An adaptive assessment: Online summary with automated feedback as a self-assessment tool in MOOCs environments
Abstract
Online assessment is an important component of online learning today. An online summary assessment is an example of an open-ended question, offering the advantage of probing students' understanding of the learning materials. However, grading students' summaries is challenging because evaluating writing assignments is time-consuming, particularly when the course is delivered on a Massive Open Online Course (MOOC) platform, where enrollment can be massive. The purpose of this research is therefore to develop a feature that analyzes students' summaries and provides feedback. The feedback varies from student to student because it depends on the result of the summary assessment. The algorithm employed in the online summary with automated feedback feature was cosine similarity, a text-similarity measure from natural language processing (NLP). To measure the effectiveness, usability, and student satisfaction of this feature, 100 students participated in the study. The results indicated an increase in student learning outcomes, and student responses regarding the use of, and satisfaction with, the feature were positive.
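The abstract names cosine similarity as the scoring algorithm but does not describe the feature pipeline. The following is a minimal sketch, assuming a simple bag-of-words term-frequency representation and an illustrative feedback threshold; the function names (`cosine_similarity`, `feedback`) and the threshold value are hypothetical, not taken from the paper.

```python
import math
import re
from collections import Counter


def tokenize(text):
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())


def cosine_similarity(text_a, text_b):
    """Cosine similarity between bag-of-words term-frequency vectors."""
    va, vb = Counter(tokenize(text_a)), Counter(tokenize(text_b))
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    norm_a = math.sqrt(sum(c * c for c in va.values()))
    norm_b = math.sqrt(sum(c * c for c in vb.values()))
    if norm_a == 0.0 or norm_b == 0.0:
        return 0.0
    return dot / (norm_a * norm_b)


def feedback(student_summary, reference_summary, threshold=0.5):
    """Map the similarity score to a feedback message (threshold is illustrative)."""
    score = cosine_similarity(student_summary, reference_summary)
    if score >= threshold:
        return score, "Good summary: it covers the key points of the material."
    return score, "Your summary misses key points; please reread the material."
```

In a real deployment the reference summary would come from the instructor, and the text would typically be weighted with TF-IDF rather than raw term frequency, but the thresholded-similarity structure of the automated feedback would remain the same.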
This work is licensed under a Creative Commons Attribution 4.0 License.
Laboratory for Knowledge Management & E-Learning, The University of Hong Kong