Towards the Selection of Distance Metrics for k-NN Classifier in Students' Performance Prediction Modeling


Bibliographic Details
Main Authors: Khan I., Mohamed Zabil M.H., Ahmad A.R., Jabeur N.
Format: Conference Paper
Published: Institute of Electrical and Electronics Engineers Inc. 2024
Institution: Universiti Tenaga Nasional
Description
Summary: This paper investigates the impact of changing distance metrics on the performance of the k-NN classifier. The study examines how model performance varies with the choice of distance metric and the value of k in the context of students' performance prediction models. The research utilizes datasets specifically designed for students' performance prediction modeling. Starting with a 1-NN model, the experiments increment the value of k by 2 until the size of the dataset is reached. The experiments are repeated with distance metrics derived from the Minkowski distance, including Euclidean, Manhattan, and Chebyshev. The findings indicate that no single distance metric is dominant across all datasets. However, the Euclidean and Manhattan distance metrics emerge as effective, while Chebyshev exhibits lower performance. The research concludes that the Euclidean and Manhattan distance metrics are appropriate for students' performance prediction modeling using the k-NN classifier. © 2023 IEEE.
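
The Minkowski distance d_p(x, y) = (sum_i |x_i - y_i|^p)^(1/p) gives the Manhattan distance at p = 1, the Euclidean distance at p = 2, and the Chebyshev distance in the limit p -> infinity. The sketch below is a minimal, hypothetical reconstruction of the sweep described in the abstract, assuming scikit-learn's KNeighborsClassifier; the Iris dataset and the 70/30 split are placeholders, since the students' performance datasets used in the paper are not reproduced here.

from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Placeholder data: stands in for a students' performance dataset.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Minkowski-family metrics compared in the paper.
metrics = {
    "euclidean": dict(metric="minkowski", p=2),  # p = 2
    "manhattan": dict(metric="minkowski", p=1),  # p = 1
    "chebyshev": dict(metric="chebyshev"),       # limit p -> infinity
}

for name, params in metrics.items():
    # Start with 1-NN and increment k by 2; k cannot exceed the training-set size.
    for k in range(1, len(X_train) + 1, 2):
        clf = KNeighborsClassifier(n_neighbors=k, **params).fit(X_train, y_train)
        acc = accuracy_score(y_test, clf.predict(X_test))
        print(f"{name:>9s}  k={k:3d}  accuracy={acc:.3f}")

Note that when evaluating with a held-out test set, the upper bound on k is the number of training samples rather than the full dataset size; the loop above reflects that constraint.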