Streamlining the Path from Data to Deployment: Intelligent Methods for Hyperparameter Tuning in Machine Learning

Authors

  • Bakare K.A Department of Computer Science and Information Technology, Faculty of Computing and Artificial Intelligence, Federal University Dutsin-ma, Dutsin-ma, Katsina State, Nigeria
  • Abubakar S.I Department of Computer Science and Information Technology, Faculty of Computing and Artificial Intelligence, Federal University Dutsin-ma, Dutsin-ma, Katsina State, Nigeria
  • Naveen A.Y Department of Computer Science and Information Technology, Faculty of Computing and Artificial Intelligence, Federal University Dutsin-ma, Dutsin-ma, Katsina State, Nigeria
  • Abdullahi A.J Department of Computer Science and Information Technology, Faculty of Computing and Artificial Intelligence, Federal University Dutsin-ma, Dutsin-ma, Katsina State, Nigeria
  • Gaku M.S Department of Computer Science and Information Technology, Faculty of Computing and Artificial Intelligence, Federal University Dutsin-ma, Dutsin-ma, Katsina State, Nigeria
  • Abdulganiyu I Department of Computer Science and Information Technology, Faculty of Computing and Artificial Intelligence, Federal University Dutsin-ma, Dutsin-ma, Katsina State, Nigeria
  • Asmau U. Department of Computer Science and Information Technology, Faculty of Computing and Artificial Intelligence, Federal University Dutsin-ma, Dutsin-ma, Katsina State, Nigeria
  • Ahmad S. Department of Computer Science and Information Technology, Faculty of Computing and Artificial Intelligence, Federal University Dutsin-ma, Dutsin-ma, Katsina State, Nigeria

DOI:

https://doi.org/10.56892/bima.v8i1.603

Keywords:

Hyperparameter optimization, machine learning, Bayesian optimization, particle swarm optimization, genetic algorithm, grid search.

Abstract

This study addresses the essential role of hyperparameter optimization in complex machine learning models, particularly for image classification tasks. With manual tuning impractical as model complexity grows, the research evaluates eight automated optimization methods: grid search, random search, Gaussian process Bayesian optimization (BO), Tree-structured Parzen Estimator (TPE) BO, Hyperband, a BO/Hyperband hybrid, genetic algorithms, and particle swarm optimization. Assessments span diverse model architectures and performance metrics, considering accuracy, mean squared error, and optimization time. Grid search proves exhaustive but time-prohibitive; random search is sensitive to seed values; Gaussian process BO excels in low-dimensional spaces; and TPE BO is efficient in higher dimensions. Hyperband prioritizes time efficiency, genetic algorithms pose parallelization challenges, and particle swarm optimization achieves the best balance of accuracy and efficiency. Distinct advantages emerge depending on model architecture and search-space complexity, highlighting the need for optimizers tailored to specific machine learning applications. The comprehensive benchmarks provide practical guidance, and future work is recommended to extend the evaluation to emerging model classes, particularly deep neural networks.
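To make the contrast between the first two methods concrete, here is a minimal pure-Python sketch (not the authors' code) of grid search versus random search on a hypothetical two-parameter objective. The objective function, parameter names (`lr`, `depth`), and search ranges are illustrative stand-ins for a real model's validation score; the grid's cost grows multiplicatively with each parameter, while random search's result varies with the seed, as the abstract notes.

```python
import itertools
import random

def objective(lr, depth):
    # Toy stand-in for a validation score; peaks at lr=0.1, depth=6.
    # In practice this would train a model and return its accuracy.
    return 1.0 - (lr - 0.1) ** 2 - 0.01 * (depth - 6) ** 2

def grid_search(lrs, depths):
    # Exhaustive: evaluates every combination, len(lrs) * len(depths) trials.
    return max(itertools.product(lrs, depths), key=lambda p: objective(*p))

def random_search(n_trials, seed=0):
    # Samples configurations at random; the best found depends on the seed,
    # which is the seed sensitivity discussed in the study.
    rng = random.Random(seed)
    trials = [(rng.uniform(0.001, 0.5), rng.randint(2, 12))
              for _ in range(n_trials)]
    return max(trials, key=lambda p: objective(*p))

best_grid = grid_search([0.001, 0.01, 0.1, 0.3], [2, 4, 6, 8])  # 16 trials
best_rand = random_search(16)                                   # 16 trials
```

With the same trial budget, grid search here lands exactly on the grid point nearest the optimum, while random search may land closer or farther depending on its draws, illustrating the time/coverage trade-off the study measures.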


Published

2024-03-31

How to Cite

Bakare, K. A., Abubakar, S. I., Naveen, A. Y., Abdullahi, A. J., Gaku, M. S., Abdulganiyu, I., Asmau, U., & Ahmad, S. (2024). Streamlining the Path from Data to Deployment: Intelligent Methods for Hyperparameter Tuning in Machine Learning. Bima Journal of Science and Technology (2536-6041), 8(1A), 192-210. https://doi.org/10.56892/bima.v8i1.603