Optimization
JAX - Julia

A collection of optimization techniques I have implemented in JAX and Flux, with particular effort put into readability and reproducibility.

Python

Requirements for JAX

  • Python >= 3.8
  • jax

Installation

```shell
$ git clone https://github.com/BeeGass/Readable-Optimization.git
```

Usage

```shell
$ cd Readable-Optimization/vae-jax
$ python main.py
```

Julia

Requirements for Flux

  • TODO
  • TODO

Usage

```shell
$ cd Readable-Optimization/vae-flux
$ # TBA
```

Config File Template

TBA

First Order Results

| Model | Jax/Flax | Flux | Config | Paper | Animations | Samples |
|-------|----------|------|--------|-------|------------|---------|
| Gradient Descent | Link | TBA | TBA | | | |
| Stochastic Gradient Descent | Link | TBA | TBA | | | |
| Batch Gradient Descent | Link | TBA | TBA | | | |
| Mini-Batch Gradient Descent | Link | TBA | TBA | | | |
| SGD w/ Momentum | Link | TBA | TBA | | | |
| Nesterov's Gradient Acceleration | Link | TBA | TBA | | | |
| AdaGrad (Adaptive Gradient) | Link | TBA | TBA | | | |
| AdaDelta | Link | TBA | TBA | | | |
| RMSProp (Root Mean Square Propagation) | Link | TBA | TBA | | | |
| Adam (Adaptive Moment Estimation) | Link | TBA | TBA | | | |
| AdamW | Link | TBA | TBA | | | |
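As a flavor of the first-order methods above, here is a minimal sketch of SGD with momentum in JAX. This is not code from this repository; the toy quadratic loss, learning rate, and momentum coefficient are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

# Toy loss with a known minimizer at w = [3., 3.] (illustrative only).
def loss(w):
    return jnp.sum((w - 3.0) ** 2)

grad_fn = jax.grad(loss)

def sgd_momentum(w, lr=0.1, mu=0.9, steps=100):
    """SGD with (heavy-ball) momentum: v accumulates an exponentially
    decayed sum of past gradients, and w steps against that direction."""
    v = jnp.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        v = mu * v + g
        w = w - lr * v
    return w

w = sgd_momentum(jnp.array([0.0, 10.0]))
print(w)  # ≈ [3., 3.]
```

On a quadratic like this, momentum damps the oscillation that plain gradient descent would show along steep directions while accelerating progress along shallow ones.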

Second Order Results

| Model | Jax/Flax | Flux | Config | Paper | Animations | Samples |
|-------|----------|------|--------|-------|------------|---------|
| Newton's Method | Link | TBA | TBA | | | |
| Secant Method | Link | TBA | TBA | | | |
| Davidon-Fletcher-Powell (DFP) | Link | TBA | TBA | | | |
| Broyden-Fletcher-Goldfarb-Shanno (BFGS) | Link | TBA | TBA | | | |
| Limited-Memory BFGS (L-BFGS) | Link | TBA | TBA | | | |
| Newton-Raphson | Link | TBA | TBA | | | |
| Levenberg-Marquardt | Link | TBA | TBA | | | |
| Powell's Method | Link | TBA | TBA | | | |
| Steepest Descent | Link | TBA | TBA | | | |
| Truncated Newton | Link | TBA | TBA | | | |
| Fletcher-Reeves | Link | TBA | TBA | | | |
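For the second-order family, a minimal sketch of Newton's method is easy to express with JAX autodiff, since `jax.grad` and `jax.hessian` give the exact gradient and Hessian. This is not code from this repository; the Rosenbrock test function, starting point, and iteration count are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

# Classic non-convex test function with minimizer at [1., 1.].
def rosenbrock(w):
    return (1.0 - w[0]) ** 2 + 100.0 * (w[1] - w[0] ** 2) ** 2

grad_fn = jax.grad(rosenbrock)
hess_fn = jax.hessian(rosenbrock)

def newton(w, steps=20):
    """Pure Newton iteration: solve H p = g for the Newton step p
    rather than forming the Hessian inverse explicitly."""
    for _ in range(steps):
        g = grad_fn(w)
        H = hess_fn(w)
        p = jnp.linalg.solve(H, g)
        w = w - p
    return w

w = newton(jnp.array([-1.2, 1.0]))
print(w)  # ≈ [1., 1.]
```

Using `jnp.linalg.solve` instead of inverting `H` is the standard numerically preferable choice; quasi-Newton methods such as DFP and BFGS in the table above replace the exact Hessian solve with a cheap iteratively updated approximation.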

Citation

```bibtex
@software{B_Gass_Optimization_2022,
  author = {Gass, B},
  doi = {10.5281/zenodo.1234},
  month = {1},
  title = {{Readable-Optimization}},
  url = {https://github.com/BeeGass/Optimization},
  version = {1.0.0},
  year = {2022}
}
```
