Preprint (Review) · Version 4 · Preserved in Portico · This version is not peer-reviewed

Hyperparameter Optimization and Combined Data Sampling Techniques in Machine Learning for Customer Churn Prediction: A Comparative Analysis

Version 1 : Received: 21 August 2023 / Approved: 21 August 2023 / Online: 21 August 2023 (13:12:21 CEST)
Version 2 : Received: 24 September 2023 / Approved: 25 September 2023 / Online: 26 September 2023 (05:17:49 CEST)
Version 3 : Received: 7 November 2023 / Approved: 8 November 2023 / Online: 8 November 2023 (10:22:33 CET)
Version 4 : Received: 16 November 2023 / Approved: 17 November 2023 / Online: 17 November 2023 (14:15:58 CET)

A peer-reviewed article of this Preprint also exists.

Imani, M.; Arabnia, H.R. Hyperparameter Optimization and Combined Data Sampling Techniques in Machine Learning for Customer Churn Prediction: A Comparative Analysis. Technologies 2023, 11, 167.

Abstract

This paper explores the application of various machine learning techniques to predicting customer churn in the telecommunications sector. We used a publicly accessible dataset and implemented several models, including Artificial Neural Networks, Decision Trees, Support Vector Machines, Random Forests, Logistic Regression, and gradient boosting techniques (XGBoost, LightGBM, and CatBoost). To mitigate the challenges posed by imbalanced data, we adopted three data sampling strategies: SMOTE, SMOTE combined with Tomek Links, and SMOTE combined with Edited Nearest Neighbors. We also applied hyperparameter tuning to enhance model performance, and evaluated all models with standard metrics: Precision, Recall, F1-Score, and the area under the Receiver Operating Characteristic curve (ROC AUC). On the F1-Score metric, CatBoost outperforms the other models, reaching 93% after Optuna hyperparameter optimization. On ROC AUC, XGBoost and CatBoost both achieve 91%: XGBoost after combining SMOTE with Tomek Links, and CatBoost after Optuna hyperparameter optimization.
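For readers who want a concrete starting point, the sketch below shows one way the two main ingredients of this study might be wired together in Python: resampling with SMOTE plus Tomek Links via imbalanced-learn, then tuning a CatBoost classifier with Optuna. This is a minimal illustration, not the authors' exact pipeline; the synthetic dataset, search ranges, and trial budget are assumptions made purely for demonstration.

```python
# Illustrative sketch (not the paper's exact pipeline): combined resampling
# with SMOTE + Tomek Links, then Optuna hyperparameter search for CatBoost.
import optuna
from catboost import CatBoostClassifier
from imblearn.combine import SMOTETomek
from sklearn.datasets import make_classification
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Synthetic imbalanced data standing in for the public telecom churn dataset.
X, y = make_classification(
    n_samples=5000, n_features=20, weights=[0.8, 0.2], random_state=42
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, test_size=0.2, random_state=42
)

# Resample only the training split: SMOTE oversamples the minority (churn)
# class, then Tomek Links removes ambiguous majority/minority pairs.
X_res, y_res = SMOTETomek(random_state=42).fit_resample(X_train, y_train)

def objective(trial):
    # Hypothetical search space; the paper does not report its exact ranges.
    params = {
        "depth": trial.suggest_int("depth", 4, 10),
        "learning_rate": trial.suggest_float("learning_rate", 1e-2, 0.3, log=True),
        "iterations": trial.suggest_int("iterations", 200, 1000),
    }
    model = CatBoostClassifier(**params, verbose=0, random_seed=42)
    model.fit(X_res, y_res)
    return f1_score(y_test, model.predict(X_test))

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=50)
print("Best F1:", study.best_value, "params:", study.best_params)
```

Note that resampling is applied after the train/test split, so the held-out set keeps its original class distribution; swapping SMOTETomek for imbalanced-learn's SMOTEENN would give the SMOTE + Edited Nearest Neighbors variant the abstract also evaluates.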

Keywords

machine learning; churn prediction; imbalanced data; combined data sampling techniques; hyperparameter optimization

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning

Comments (1)

Comment 1
Received: 17 November 2023
Commenter: Mehdi Imani
Commenter's Conflict of Interests: Author
Comment: Some changes we have implemented:

Abstract Revision: The abstract has been thoroughly rewritten to improve its readability and flow. This revision aims to provide a clearer and more concise overview of the study, making it more accessible to readers.

Clarification in Introduction: We have carefully rephrased several sentences in the introduction that were previously unclear. These modifications are intended to present the research context and objectives more coherently, ensuring that readers can easily grasp the study's premise.

Addition of Table 1: To further aid in readability and comprehension, we have included Table 1 in the manuscript. This table is designed to succinctly present important acronyms.

