
Optuna: A hyperparameter optimization framework


Website | Docs | Install Guide | Tutorial

Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to this define-by-run API, code written with Optuna enjoys high modularity, and users can dynamically construct the search spaces for the hyperparameters.

Key Features

Optuna offers the following modern functionalities:

  • Parallel distributed optimization (see the sketch after this list)
  • Pruning of unpromising trials
  • Lightweight, versatile, and platform agnostic architecture
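
As a sketch of the first feature, multiple worker processes can share a single study through a common storage backend. This is a minimal example, assuming a local SQLite file as the shared storage; the study name distributed-example and the database path are illustrative:

import optuna

def objective(trial):
    x = trial.suggest_uniform('x', -10, 10)
    return (x - 2) ** 2

# Run this same script from several processes; load_if_exists=True lets each
# worker attach to the shared study instead of failing on re-creation.
study = optuna.create_study(
    study_name='distributed-example',   # illustrative name
    storage='sqlite:///example.db',     # illustrative shared storage
    load_if_exists=True,
)
study.optimize(objective, n_trials=100)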

Basic Concepts

We use the terms study and trial as follows:

  • Study: optimization based on an objective function
  • Trial: a single execution of the objective function

Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., classifier and svm_c) through multiple trials (e.g., n_trials=100). Optuna is a framework designed to automate and accelerate such optimization studies.


import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm

import optuna

# Define an objective function to be minimized.
def objective(trial):

    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('classifier', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_loguniform('svr_c', 1e-10, 1e10)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.load_boston(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.

study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
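
Once the study has finished, the best hyperparameters and the best objective value found can be read back from the Study object:

# Inspect the result of the study.
print(study.best_params)  # E.g. {'classifier': 'RandomForest', 'rf_max_depth': 5} (illustrative output).
print(study.best_value)   # The lowest mean squared error observed.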

Integrations

Integration modules, which allow pruning (early stopping) of unpromising trials, are available for the following libraries (a minimal pruning sketch follows the list):

  • XGBoost
  • LightGBM
  • Chainer
  • Keras
  • TensorFlow
  • tf.keras
  • MXNet
  • PyTorch Ignite
  • PyTorch Lightning
  • FastAI
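
The same pruning mechanism is also available directly, without an integration module, via Trial.report and Trial.should_prune. A minimal sketch, in which the objective is a toy stand-in for a real iterative training loop:

import optuna

def objective(trial):
    x = trial.suggest_uniform('x', -10, 10)

    for step in range(100):
        intermediate_value = (x - 2) ** 2  # Toy stand-in for a per-step validation score.

        # Report the intermediate value so the pruner can compare this trial with others.
        trial.report(intermediate_value, step)

        # Abandon the trial early if it looks unpromising.
        if trial.should_prune():
            raise optuna.exceptions.TrialPruned()

    return (x - 2) ** 2

study = optuna.create_study(pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=100)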

Installation

Optuna is available at the Python Package Index and on Anaconda Cloud.

# PyPI
$ pip install optuna
# Anaconda Cloud
$ conda install -c conda-forge optuna

Optuna supports Python 3.5 or newer.

Communication

Contribution

Any contributions to Optuna are welcome! When you send a pull request, please follow the contribution guide.

License

MIT License (see LICENSE).

Reference

Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD (arXiv).
