Streamlining the Path from Data to Deployment: Intelligent Methods for Hyperparameter Tuning in Machine Learning
DOI: https://doi.org/10.56892/bima.v8i1.603

Keywords: hyperparameter optimization, machine learning, Bayesian optimization, particle swarm optimization, genetic algorithm, grid search

Abstract
This study addresses the essential role of hyperparameter optimization in complex machine learning models, particularly for image classification tasks. With manual tuning impractical as model complexity grows, the research evaluates eight automated optimization methods: grid search, random search, Gaussian process Bayesian optimization (BO), Tree-structured Parzen Estimator (TPE) BO, Hyperband, a BO/Hyperband hybrid, genetic algorithms, and particle swarm optimization. Assessments cover diverse model architectures and performance metrics, considering accuracy, mean squared error, and optimization time. Grid search is exhaustive but time-prohibitive; random search is sensitive to seed values; Gaussian process BO excels in low-dimensional spaces, while TPE BO is more efficient in higher dimensions. Hyperband prioritizes time efficiency, genetic algorithms pose parallelization challenges, and particle swarm optimization combines high accuracy with efficiency. Distinct advantages emerge depending on model architecture and search-space complexity, underscoring the need to match the optimizer to the specific machine learning application. The comprehensive benchmarks offer practical guidance, and future work should extend the evaluation to emerging model classes, particularly deep neural networks.
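To make the grid search vs. random search contrast concrete, the following is a minimal sketch (not the paper's code) using scikit-learn on a toy classification task; the model, search space, and budget are illustrative assumptions, not taken from the study:

```python
# Minimal sketch contrasting two of the eight optimizers benchmarked in the
# study: exhaustive grid search vs. seed-dependent random search.
# All names below (model, param_grid, budget) are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=200, random_state=0)

param_grid = {"n_estimators": [10, 50], "max_depth": [3, 5]}

# Grid search: tries all 4 combinations -- exhaustive, hence
# time-prohibitive as the search space grows.
grid = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=3)
grid.fit(X, y)

# Random search: samples a fixed budget of combinations; the result
# depends on random_state, matching the seed sensitivity noted above.
rand = RandomizedSearchCV(RandomForestClassifier(random_state=0), param_grid,
                          n_iter=3, cv=3, random_state=0)
rand.fit(X, y)

print("grid best:", grid.best_params_)
print("random best:", rand.best_params_)
```

With a larger search space, `n_iter` caps the cost of random search at a fixed number of trials, whereas the grid grows multiplicatively with each added hyperparameter.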