Original scientific article

PYTHON-DRIVEN ADAPTIVE TESTING ALGORITHMS FOR PERSONALIZED ASSESSMENT IN E-LEARNING PLATFORMS

By
Ankita Sappa

Wichita State University, Wichita, United States

Abstract

Recent advances in e-learning systems have highlighted the need for more tailored and effective methods of assessment. E-learning has become increasingly common in society; however, it brings its own unique challenges. This study explores the development of an adaptive testing framework implemented in Python, using algorithms driven by the learner's real-time performance data to continually adjust the difficulty and order of the questions presented. The system merges Item Response Theory (IRT) with machine learning to estimate the learner's mastery level and modify the assessment sequence in real time. Across a broad range of assessments and learner profiles, the framework yielded improved accuracy, less time spent on evaluation, and greater user satisfaction. Key performance parameters such as response time, range of participation, and prediction precision were assessed with actual data in a simulated e-learning setting. This research is particularly important in its demonstration of how responsive testing frameworks in Python can enhance digital assessment through adaptation and support customized learning experiences at all levels. The work also provides an open-source model to be built upon within the educational technology ecosystem while creating pathways for the innovative design of future intelligent tutoring systems.
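The adaptive loop the abstract describes, estimating mastery from responses and picking the next question accordingly, can be sketched in Python under a standard two-parameter logistic (2PL) IRT model. This is an illustrative sketch only: the function names, the grid-based maximum-likelihood estimator, and the item parameters below are assumptions for demonstration, not the paper's actual implementation.

```python
import math

def p_correct(theta, a, b):
    """2PL IRT: probability of a correct response given ability theta,
    item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of an item at ability theta (2PL model)."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def estimate_theta(responses, items, grid=None):
    """Maximum-likelihood ability estimate over a coarse theta grid.
    responses: list of (item_index, correct) pairs;
    items: list of (a, b) parameter tuples."""
    if grid is None:
        grid = [g / 10.0 for g in range(-40, 41)]  # theta in [-4, 4]
    best_theta, best_ll = 0.0, float("-inf")
    for theta in grid:
        ll = 0.0
        for idx, correct in responses:
            a, b = items[idx]
            p = p_correct(theta, a, b)
            ll += math.log(p if correct else 1.0 - p)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta

def next_item(theta, items, administered):
    """Select the unadministered item with maximum information at theta,
    i.e. the question most diagnostic for the current mastery estimate."""
    candidates = [i for i in range(len(items)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *items[i]))
```

In use, the loop alternates between `estimate_theta` after each response and `next_item` to choose the following question; a production system would typically replace the grid search with a numerical optimizer or a Bayesian (EAP) estimate, which is where the machine-learning components described above would plug in.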


Citation

This is an open access article distributed under the Creative Commons Attribution Non-Commercial License (CC BY-NC), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


The statements, opinions, and data contained in the journal are solely those of the individual authors and contributors and not of the publisher and the editor(s). We remain neutral with regard to jurisdictional claims in published maps and institutional affiliations.