HParams is a thoughtful approach to configuration management for machine learning projects. It enables you to externalize your hyperparameters into a configuration file. In doing so, you can reproduce experiments, iterate quickly, and reduce errors.
Features:
- Approachable and easy-to-use API
- Battle-tested over three years
- Fast with little to no runtime overhead (< 1e-05 seconds) per configured function
- Robust to most use cases with 100% test coverage and 74 tests
- Lightweight with only one dependency
Logo by Chloe Yeo, Corporate Sponsorship by WellSaid Labs
Make sure you have Python 3. You can then install `hparams` using pip:

```bash
pip install hparams
```

Install the latest code via:

```bash
pip install git+https://github.com/PetrochukM/HParams.git
```
With HParams, you will avoid common but needless hyperparameter mistakes. It will throw a warning or error if:
- A hyperparameter is overwritten.
- A hyperparameter is declared but not set.
- A hyperparameter is set but not declared.
- A hyperparameter type is incorrect.
Finally, HParams is built with developer experience in mind. The errors thrown by HParams are verbose to ensure a swift resolution.
Add HParams to your project by following one of these common use cases:
Configure your training run, like so:
```python
# main.py
from typing import Union

from hparams import configurable, add_config, HParams, HParam

@configurable
def train(batch_size: Union[int, HParam] = HParam(int)):
    pass

class Model():

    @configurable
    def __init__(self, hidden_size=HParam(int), dropout=HParam(float)):
        pass

add_config({ 'main': {
    'train': HParams(batch_size=32),
    'Model.__init__': HParams(hidden_size=1024, dropout=0.25),
}})
```
HParams supports optional configuration typechecking to help you find bugs! 🐛
Configure PyTorch and TensorFlow defaults to match via:

```python
from torch.nn import BatchNorm1d

from hparams import configurable, add_config, HParams

BatchNorm1d.__init__ = configurable(BatchNorm1d.__init__)

# NOTE: `momentum=0.01` to match TensorFlow defaults.
add_config({ 'torch.nn.BatchNorm1d.__init__': HParams(momentum=0.01) })
```
Configure your random seed globally, like so:

```python
# config.py
import random

from hparams import configurable, add_config, HParams

random.seed = configurable(random.seed)
add_config({'random.seed': HParams(a=123)})
```

```python
# main.py
import config
import random

random.seed()
```
Experiment with hyperparameters through your command line, for example:

```console
foo@bar:~$ python file.py --torch.optim.adam.Adam.__init__ 'HParams(lr=0.1,betas=(0.999,0.99))'
```

```python
# file.py
import sys

from torch.optim import Adam

from hparams import configurable, add_config, parse_hparam_args

Adam.__init__ = configurable(Adam.__init__)
parsed = parse_hparam_args(sys.argv[1:])  # Parse command line arguments.
add_config(parsed)
```
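Under the hood, parsing like this just pairs each `--<dotted.path>` flag with its `HParams(...)` value. A rough, hypothetical sketch (not the library's actual parser, which is more robust):

```python
def sketch_parse(argv):
    """Pair each `--dotted.path` flag with its `HParams(...)` string value."""
    config = {}
    for flag, value in zip(argv[::2], argv[1::2]):
        assert flag.startswith('--'), flag
        config[flag[2:]] = value  # Left as a string here; the library evaluates it.
    return config

print(sketch_parse(['--Adam.__init__', 'HParams(lr=0.1)']))
# {'Adam.__init__': 'HParams(lr=0.1)'}
```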
Hyperparameter optimization is easy to do; check this out:

```python
import itertools

from torch.optim import Adam

from hparams import configurable, add_config, HParams

Adam.__init__ = configurable(Adam.__init__)

def train():  # Train the model and return the loss.
    pass

for betas in itertools.product([0.999, 0.99, 0.9], [0.999, 0.99, 0.9]):
    add_config({Adam.__init__: HParams(betas=betas)})  # Grid search over `betas`.
    train()
```
Easily track your hyperparameters using tools like Comet:

```python
from comet_ml import Experiment

from hparams import get_config

experiment = Experiment()
experiment.log_parameters(get_config())
```
Export a Python `functools.partial` to use in another process, like so:

```python
from hparams import configurable, HParam

@configurable
def func(hparam=HParam()):
    pass

partial = func.get_configured_partial()
```

With this approach, you don't have to transfer the global state to the new process. To transfer the global state, you'll want to use `get_config` and `add_config`.
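The idea can be sketched with the standard library alone: a partial bakes the resolved hyperparameters into the callable itself, so no configuration registry has to travel with it. This is a hypothetical illustration, not the library's code:

```python
import functools
import pickle

def train(batch_size, lr):
    return batch_size, lr

# The resolved hyperparameters are baked into the callable itself.
configured = functools.partial(train, batch_size=32, lr=0.1)
print(configured())  # (32, 0.1)

# A partial of a module-level function pickles, so it can cross a process boundary.
restored = pickle.loads(pickle.dumps(configured))
print(restored())  # (32, 0.1)
```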
The complete documentation for HParams is available here.
We've released HParams because of a lack of hyperparameter management solutions. We hope that other people can benefit from the project, and we are thankful for any contributions from the community.
Read our contributing guide to learn about our development process, how to propose bugfixes and improvements, and how to build and test your changes to HParams.
- Michael Petrochuk — Developer
- Chloe Yeo — Logo Design
If you find HParams useful for an academic publication, then please use the following BibTeX to cite it:
```bibtex
@misc{hparams,
  author = {Petrochuk, Michael},
  title = {HParams: Hyperparameter management solution},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/PetrochukM/HParams}},
}
```