BAYESIAN OPTIMIZATION FOR TUNING HYPERPARAMETERS OF MACHINE LEARNING MODELS: A PERFORMANCE ANALYSIS IN XGBOOST
DOI:
https://doi.org/10.31891/csit-2025-1-16

Keywords:
XGBoost, Bayesian optimization, hyperparameter tuning, machine learning, tree-structured Parzen estimator

Abstract
The performance of machine learning models depends on the selection and tuning of hyperparameters. As a widely used gradient boosting method, XGBoost relies on well-chosen hyperparameter configurations to balance model complexity, prevent overfitting, and improve generalization. Traditional approaches such as grid search and random search are computationally costly and ineffective, especially in high-dimensional hyperparameter spaces. Recent advances in automated hyperparameter tuning, in particular Bayesian optimization with the tree-structured Parzen estimator, have shown promise in improving the accuracy and efficiency of model optimization. The aim of this paper is to analyze how effective Bayesian optimization is at tuning XGBoost hyperparameters for a real classification problem. Comparing Bayesian optimization with traditional search methods makes it possible to assess its effect on model accuracy, convergence speed, and computational efficiency. As a case study, this research uses a dataset of consumer spending behavior. The classification task was to distinguish between two transaction categories: hotels, restaurants, and cafés versus the retail sector. Model performance was evaluated in terms of loss function minimization, convergence stability, and classification accuracy. The paper shows that Bayesian optimization improves XGBoost hyperparameter tuning, raising classification performance while lowering computational cost. The results offer empirical evidence that Bayesian optimization outperforms traditional techniques in terms of accuracy, stability, and scalability.
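To make the tree-structured Parzen estimator approach concrete, the following is a minimal one-dimensional sketch of the idea, not the authors' actual experimental setup (which tunes XGBoost; the toy quadratic objective and all parameter values here are illustrative assumptions). TPE splits observed trials into a "good" quantile and a "bad" remainder, fits a Parzen (kernel) density to each, and proposes the candidate that maximizes the ratio l(x)/g(x):

```python
import math
import random

def parzen_density(x, samples, bandwidth=0.1):
    """Parzen (kernel) density estimate with Gaussian kernels."""
    if not samples:
        return 1.0  # uniform fallback before any observations exist
    return sum(
        math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
        / (bandwidth * math.sqrt(2 * math.pi))
        for s in samples
    ) / len(samples)

def tpe_suggest(history, gamma=0.25, n_candidates=24, bounds=(0.0, 1.0)):
    """Propose the candidate maximizing l(x)/g(x) over random draws."""
    history = sorted(history, key=lambda t: t[1])       # ascending loss
    n_good = max(1, int(gamma * len(history)))
    good = [x for x, _ in history[:n_good]]             # low-loss trials -> l(x)
    bad = [x for x, _ in history[n_good:]]              # the rest       -> g(x)
    candidates = [random.uniform(*bounds) for _ in range(n_candidates)]
    return max(
        candidates,
        key=lambda x: parzen_density(x, good) / (parzen_density(x, bad) + 1e-12),
    )

def optimize(objective, n_trials=40, n_startup=8, bounds=(0.0, 1.0)):
    """Random startup trials, then TPE-guided suggestions."""
    history = []
    for i in range(n_trials):
        x = random.uniform(*bounds) if i < n_startup else tpe_suggest(history, bounds=bounds)
        history.append((x, objective(x)))
    return min(history, key=lambda t: t[1])

random.seed(0)
# Toy objective with minimum at x = 0.7, a stand-in for a cross-validation loss
best_x, best_loss = optimize(lambda x: (x - 0.7) ** 2)
```

In practice one would optimize several XGBoost hyperparameters (e.g. learning rate, tree depth, subsampling ratios) jointly with a library implementation of TPE rather than this single-variable sketch; the mechanism of splitting trials by loss quantile and sampling from the density ratio is the same.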
License
Copyright (c) 2025 Mykola ZLOBIN, Volodymyr BAZYLEVYCH

This work is licensed under a Creative Commons Attribution 4.0 International License.