Overview of Hyperparameter Tuning
A hyperparameter is a parameter whose value is set before the learning process begins, rather than being learned from the data. Two common methods for optimizing hyperparameters are:
Grid Search - A popular approach to hyperparameter optimization. It works by exhaustively evaluating every combination in a designated subset of the hyperparameter space.
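As a minimal sketch, grid search can be run with scikit-learn's GridSearchCV (the library, model, and parameter grid here are illustrative assumptions, not prescribed by the text):

```python
# Grid search sketch: exhaustively evaluate every combination in a
# designated hyperparameter subset, scored with cross-validation.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Designated subset: 3 values of C x 2 kernels = 6 combinations,
# each fitted and scored 5 times (cv=5).
param_grid = {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}

search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)
```

Note that the number of fits grows multiplicatively with each added hyperparameter, which is exactly the cost random search tries to avoid.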
Random Search - Random search differs from grid search primarily in that it samples the specified subset of hyperparameters randomly rather than exhaustively. Its main benefit is reduced computation time.
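The same tuning problem can be sketched with random search via scikit-learn's RandomizedSearchCV (again an assumed setup: the distributions, model, and n_iter budget are illustrative):

```python
# Random search sketch: instead of enumerating every combination,
# sample a fixed number of random configurations (n_iter) from the
# specified distributions, trading exhaustiveness for speed.
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Continuous distribution for C; kernel is sampled uniformly from the list.
param_distributions = {"C": loguniform(1e-2, 1e2), "kernel": ["linear", "rbf"]}

search = RandomizedSearchCV(
    SVC(), param_distributions, n_iter=10, cv=5, random_state=0
)
search.fit(X, y)
print(search.best_params_)
```

With n_iter fixed at 10, the cost stays constant even if more hyperparameters or wider ranges are added, which is where the processing-time savings over grid search come from.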