Prediction of Students' Academic Performance in the Programming Fundamentals Course Using Long Short-Term Memory Neural Networks
Authors
Vives, Luis
Cabezas, Ivan
Vives, Juan Carlos
Reyes, Nilton German
Aquino, Janet
Condor, Jose Bautista
Altamirano, S. Francisco Segura
Issue Date
2024-01-01
Keywords
Educational data mining
generative adversarial networks
long-short term memory
synthetic minority over-sampling technique
Journal
IEEE Access
DOI
https://doi.org/10.1109/ACCESS.2024.3350169
Abstract
In recent years, there has been growing interest among universities in knowing their students' academic performance in advance, enabling them to establish timely strategies to prevent dropout and failure. One of the biggest challenges in predicting student performance arises in the 'Programming Fundamentals' course of Computer Science, Software Engineering, and Information Systems Engineering programs in Peruvian universities, owing to high student dropout rates. The objective of this research was to explore the efficiency of Long Short-Term Memory (LSTM) networks in the field of Educational Data Mining (EDM) to predict students' academic performance during the seventh, eighth, twelfth, and sixteenth weeks of the academic semester, which allowed us to identify students at risk of failing the course. This research compares several predictive models: Deep Neural Network (DNN), Decision Tree (DT), Random Forest (RF), Logistic Regression (LR), Support Vector Classifier (SVM), and K-Nearest Neighbor (KNN). A major challenge machine learning algorithms face is class imbalance in a dataset, which leads to over-fitting to the available data and, consequently, low accuracy. We use Generative Adversarial Networks (GAN) and the Synthetic Minority Over-sampling Technique (SMOTE) to balance the data needed in our proposal. Experimental results based on accuracy, precision, recall, and F1-score verify that our model achieves superior classification, with 98.3% accuracy in week 8 using LSTM-GAN, followed by DNN-GAN with 98.1% accuracy.
Type
info:eu-repo/semantics/article
Rights
info:eu-repo/semantics/openAccess
Language
eng
EISSN
2169-3536
Collections
The following license files are associated with this item:
- Creative Commons

