This page lists example code written with Optuna. The example files are available in optuna/optuna-examples.
- AllenNLP
- AllenNLP (Jsonnet)
- Catalyst
- CatBoost without pruning
- CatBoost with pruning
- Chainer
- ChainerMN
- Dask-ML
- FastAI V1
- FastAI V2
- Haiku
- Gluon
- Keras
- LightGBM
- LightGBM Tuner
- MXNet
- PyTorch
- PyTorch Ignite
- PyTorch Lightning
- RAPIDS
- Scikit-learn
- Scikit-learn OptunaSearchCV
- Scikit-image
- SKORCH
- Tensorflow
- Tensorflow (eager)
- XGBoost
The following example demonstrates how to implement an objective function that uses additional arguments other than `trial`.
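As a rough sketch of this pattern (the quadratic objective and the argument names `min_x` and `max_x` are purely illustrative, not taken from the linked example), the extra arguments can be bound with a `lambda` or `functools.partial` before the objective is passed to `study.optimize`:

```python
import optuna


# The objective takes extra arguments in addition to the trial object.
def objective(trial, min_x, max_x):
    x = trial.suggest_float("x", min_x, max_x)
    return (x - 2) ** 2


study = optuna.create_study()
# Bind the extra arguments with a lambda (functools.partial works as well).
study.optimize(lambda trial: objective(trial, -10, 10), n_trials=100)
```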
The following example demonstrates how to implement pruning logic with Optuna; a minimal sketch of this pattern appears after the list below.
In addition, integration modules are available for the following libraries, providing simpler interfaces to utilize pruning.
- Pruning with Catalyst integration module
- Pruning with Chainer integration module
- Pruning with ChainerMN integration module
- Pruning with FastAI V1 integration module
- Pruning with FastAI V2 integration module
- Pruning with Keras integration module
- Pruning with LightGBM integration module
- Pruning with MXNet integration module
- Pruning with PyTorch integration module
- Pruning with PyTorch Ignite integration module
- Pruning with PyTorch Lightning integration module
- Pruning with Tensorflow integration module
- Pruning with XGBoost integration module
- Pruning with XGBoost integration module (cross validation, XGBoost.cv)
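As a rough sketch of the manual pruning pattern (the decaying-error loop below is a placeholder for a real training loop and is not taken from any example above): the objective reports intermediate values with `trial.report` and raises `optuna.TrialPruned` when `trial.should_prune()` returns `True`, letting the study's pruner stop unpromising trials early.

```python
import optuna


def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)

    error = 1.0
    for step in range(100):
        # Placeholder for one training step; a real example would train a model here.
        error *= 1.0 - lr

        # Report the intermediate value; the pruner decides whether to stop this trial.
        trial.report(error, step)
        if trial.should_prune():
            raise optuna.TrialPruned()

    return error


study = optuna.create_study(direction="minimize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=50)
```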
The following external projects also use Optuna:
- Allegro Trains
- BBO-Rietveld: Automated crystal structure refinement
- Catalyst
- CuPy
- Hydra's Optuna Sweeper plugin
- Mozilla Voice STT
- neptune.ai
- OptGBM: A scikit-learn compatible LightGBM estimator with Optuna
- PyKEEN
- RL Baselines Zoo
PRs to add additional projects are welcome in optuna-examples!