Bayesian Optimization
A sequential optimization strategy that finds good hyperparameters by building a probabilistic surrogate model of the objective function and using that model to select the most promising configuration to evaluate next.
Why It Matters
Bayesian optimization finds good hyperparameters in far fewer evaluations than grid or random search, saving significant compute cost and time.
Example
Finding optimal learning rate, dropout rate, and layer count for a model in 30 evaluations instead of the 1,000+ needed for grid search.
Think of it like...
Like a smart treasure hunter who uses clues from previous digs to decide where to search next, rather than digging randomly — each attempt informs the next.
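The loop described above can be sketched end to end. This is a minimal illustration, not a production implementation: the toy 1-D objective, the RBF kernel length-scale, the noise level, and the grid bounds are all assumptions chosen for readability. Real tools such as Optuna or scikit-optimize handle multiple dimensions, kernel fitting, and acquisition optimization for you.

```python
import math
import numpy as np

# Toy 1-D objective standing in for "validation loss vs. log10(learning rate)".
# The function and its minimum are illustrative assumptions.
def objective(x):
    return (x + 3.0) ** 2 + 0.5 * math.sin(5 * x)

def rbf_kernel(a, b, length=0.7):
    """Squared-exponential kernel between two 1-D point sets."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-4):
    """Gaussian-process posterior mean and std at the query points."""
    K_inv = np.linalg.inv(rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train)))
    K_s = rbf_kernel(x_query, x_train)
    mu = K_s @ K_inv @ np.asarray(y_train, float)
    var = 1.0 - np.sum((K_s @ K_inv) * K_s, axis=1)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best_y):
    """EI acquisition (minimization): expected margin by which a point beats best_y."""
    z = (best_y - mu) / sigma
    cdf = 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2)) for v in z]))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best_y - mu) * cdf + sigma * pdf

# Sequential loop: evaluate, refit the surrogate, pick the max-EI point next.
grid = np.linspace(-5.0, 0.0, 200)
xs = [-5.0, 0.0]                 # two initial (e.g. random) evaluations
ys = [objective(x) for x in xs]
for _ in range(10):
    mu, sigma = gp_posterior(xs, ys, grid)
    x_next = float(grid[np.argmax(expected_improvement(mu, sigma, min(ys)))])
    xs.append(x_next)
    ys.append(objective(x_next))

best = xs[int(np.argmin(ys))]    # best configuration found after 12 total evaluations
```

The two ingredients are exactly those in the definition: the Gaussian process is the probabilistic model, and expected improvement scores how promising each candidate is, trading off low predicted loss (exploitation) against high uncertainty (exploration).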
Related Terms
Hyperparameter Tuning
The process of systematically searching for the best combination of hyperparameters for a model. Since hyperparameters are set before training, finding optimal values requires experimentation.
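For contrast with the sequential approach above, the simplest form of systematic search enumerates a fixed grid. A minimal sketch, with hypothetical hyperparameter names and values, shows why the number of evaluations explodes:

```python
from itertools import product

# Hypothetical search space; names and values are illustrative.
grid = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "dropout": [0.1, 0.3, 0.5],
    "num_layers": [2, 4, 6],
}

# Grid search evaluates every combination: 3 * 3 * 3 = 27 runs here,
# and the count multiplies with each added hyperparameter or value.
configs = [dict(zip(grid, values)) for values in product(*grid.values())]
print(len(configs))  # 27
```

Each configuration would then be trained and scored on a validation set, which is exactly the per-evaluation cost Bayesian optimization tries to spend more selectively.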
AutoML
Automated Machine Learning — tools and techniques that automate the end-to-end process of applying machine learning, including feature engineering, model selection, and hyperparameter tuning.