Optuna: How to Automate Hyperparameter Tuning
Tuning the hyperparameters of an ML model takes a lot of time and effort. To simplify this task, you can use dedicated frameworks, one of which is Optuna. Launched in 2019, this platform has the following advantages:
• compatibility with PyTorch, Chainer, TensorFlow, Keras, MXNet, Scikit-Learn, XGBoost, LightGBM, and other ML frameworks;
• search spaces written in plain, readable Python, using ordinary conditional expressions and loops (see the first sketch below);
• support for continuous hyperparameters, e.g. tuning alpha and lambda regularization to any floating-point value within a given range;
• Bayesian sampling algorithms combined with pruning, which drops clearly losing regions of the hyperparameter space from the analysis early to speed up optimization (see the second sketch below);
• parallelization of the hyperparameter search across several threads or processes without changing the code;
• runs faster than analogous tools (RandomSearch, GridSearch, hyperopt, scikit-optimize);
• detailed documentation.
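
The "plain Python" point means the search space is built inside the objective function itself, so conditionals and loops shape it on the fly. Here is a minimal sketch using scikit-learn; the dataset, models, and parameter ranges are illustrative assumptions, not anything prescribed by the post:

import optuna
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

def objective(trial):
    # An ordinary Python conditional defines the search space on the fly.
    classifier = trial.suggest_categorical("classifier", ["logreg", "rf"])
    if classifier == "logreg":
        # A continuous hyperparameter, sampled log-uniformly within a range.
        c = trial.suggest_float("C", 1e-4, 1e2, log=True)
        model = LogisticRegression(C=c, max_iter=5000)
    else:
        n_estimators = trial.suggest_int("n_estimators", 50, 300)
        max_depth = trial.suggest_int("max_depth", 2, 12)
        model = RandomForestClassifier(n_estimators=n_estimators, max_depth=max_depth)
    return cross_val_score(model, X, y, cv=3, scoring="accuracy").mean()

study = optuna.create_study(direction="maximize")  # TPE (Bayesian) sampler by default
study.optimize(objective, n_trials=50, n_jobs=2)   # n_jobs runs trials in parallel threads
print(study.best_params, study.best_value)

For multi-process or multi-machine search, the same study can instead be pointed at a shared storage backend via optuna.create_study(storage=...) while the objective stays untouched.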
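
Pruning is what discards the losing trials early: the objective reports intermediate scores, and the pruner stops any trial that falls behind its peers. A sketch with an iteratively trained SGDClassifier and Optuna's MedianPruner (again, the model and ranges are illustrative assumptions):

import optuna
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

def objective(trial):
    # Continuous regularization strength: any float in the log-scaled range.
    alpha = trial.suggest_float("alpha", 1e-6, 1e-1, log=True)
    clf = SGDClassifier(alpha=alpha, random_state=0)
    score = 0.0
    for epoch in range(20):
        clf.partial_fit(X_train, y_train, classes=[0, 1])
        score = clf.score(X_valid, y_valid)
        trial.report(score, epoch)    # stream intermediate results to the pruner
        if trial.should_prune():      # abandon trials that are clearly losing
            raise optuna.TrialPruned()
    return score

study = optuna.create_study(direction="maximize",
                            pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=30)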
https://optuna.org/
https://towardsdatascience.com/kagglers-guide-to-lightgbm-hyperparameter-tuning-with-optuna-in-2021-ed048d9838b5