diff --git a/README.md b/README.md
index 4655c76..b037d65 100644
--- a/README.md
+++ b/README.md
@@ -57,38 +57,40 @@ TBA
 First Order Results
-| Model | Code | Config | Paper | Animations | Samples |
-|-----------------------------------------|-------|---------|---------------------------------------------------|----------------|---------|
-| Gradient Descent |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Stochastic Gradient Descent |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Batch Gradient Descent |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Mini-Batch Gradient Descent |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| SGD w/ Momentum |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Nesterov's Gradient Acceleration |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| AdaGrad(Adaptive Gradient Descent) |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| AdaDelta |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| RMS-Prop (Root Mean Square Propagation) |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Adam(Adaptive Moment Estimation) |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Adamw |☐| ☐ | [Link]() | **TBA** | **TBA** |
+
+| Model | PyTorch | JAX/Flax | Flux | Config | Paper | Animations | Samples |
+|:--------------------------------------- |:-------:|:--------:|:-------:|:-------:|:-------- |:----------:|:-------:|
+| Gradient Descent | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Stochastic Gradient Descent | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Batch Gradient Descent | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Mini-Batch Gradient Descent | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| SGD w/ Momentum | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Nesterov's Gradient Acceleration | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| AdaGrad (Adaptive Gradient Descent) | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| AdaDelta | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| RMS-Prop (Root Mean Square Propagation) | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Adam (Adaptive Moment Estimation) | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| AdamW | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
 ----

 Second Order Results

-| Model | Code | Config | Paper | Animations | Samples |
-|-----------------------------------------|-------|---------|---------------------------------------------------|----------------|---------|
-| Newton's Method |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Secant Method |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Davidson-Fletcher-Powell (DFP) |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Broyden-Fletcher-Goldfarb-Shanno (BFGS) |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Limited-memory BFGS (L-BFGS) |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Newton-Raphson |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Levenberg-Marquardt |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Powell's method |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Steepest Descent |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Truncated Newton |☐| ☐ | [Link]() | **TBA** | **TBA** |
-| Fletcher-Reeves |☐| ☐ | [Link]() | **TBA** | **TBA** |
+
+| Model | PyTorch | JAX/Flax | Flux | Config | Paper | Animations | Samples |
+|:--------------------------------------- |:-------:|:--------:|:-------:|:-------:|:-------- |:----------:|:-------:|
+| Newton's Method | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Secant Method | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Davidon-Fletcher-Powell (DFP) | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Broyden-Fletcher-Goldfarb-Shanno (BFGS) | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Limited-memory BFGS (L-BFGS) | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Newton-Raphson | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Levenberg-Marquardt | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Powell's Method | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Steepest Descent | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Truncated Newton | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
+| Fletcher-Reeves | ☐ | ☐ | ☐ | ☐ | [Link]() | **TBA** | **TBA** |
 ### Citation
 ```bib