This package was written for distributed asynchronous (parallel) hyper-parameter optimization of machine learning algorithms. It can also be used as a framework for Bayesian optimization of general algorithm configurations.
gwtaylor/hyperopt: Distributed Asynchronous Hyper-parameter Optimization (forked from hyperopt/hyperopt)
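Basic usage, as a minimal sketch assuming the `fmin`/`tpe`/`hp` interface of upstream hyperopt (this fork's interface may differ); the quadratic objective, the parameter name `'x'`, and the bounds below are illustrative only:

```python
# Minimal sequential hyperopt run (illustrative objective and search space).
from hyperopt import fmin, tpe, hp

def objective(x):
    # Toy objective: hyperopt minimizes the returned scalar loss.
    return (x - 3.0) ** 2

best = fmin(
    fn=objective,                    # function to minimize
    space=hp.uniform('x', -10, 10),  # one uniform hyper-parameter named 'x'
    algo=tpe.suggest,                # Tree-structured Parzen Estimator search
    max_evals=100,                   # evaluation budget
)
print(best)  # e.g. {'x': 3.01}

# For distributed asynchronous runs, trials can instead be coordinated
# through a shared MongoDB via hyperopt.mongoexp.MongoTrials.
```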