
Milestone 2: second TT ODE working version, backward gradient descent #18

Open
wants to merge 59 commits into master

Conversation


@mbaddar1 mbaddar1 commented Feb 9, 2023

This is the second working version (Milestone 2).
The forward pass uses the TT structure from dlra (written by David); the backward pass is PyTorch gradient descent.
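The forward/backward split described above can be sketched in plain PyTorch. This is a minimal illustration, not the branch's actual code: the TT-structured vector field from dlra is replaced by a hypothetical stand-in module, and the forward integrate uses a hand-rolled fixed-step RK4 so autograd can differentiate straight through the trajectory.

```python
# Sketch of the scheme: forward = ODE integration, backward = builtin
# PyTorch backprop driven by Adam. ODEFunc is a hypothetical stand-in
# for the TT structure from dlra, which is not reproduced here.
import torch

class ODEFunc(torch.nn.Module):
    """Stand-in vector field f(t, z) for the TT-structured forward pass."""
    def __init__(self, dim):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim, 16), torch.nn.Tanh(), torch.nn.Linear(16, dim)
        )

    def forward(self, t, z):
        return self.net(z)

def rk4_integrate(func, z0, t0=0.0, t1=1.0, steps=20):
    """Fixed-step RK4; fully differentiable, so loss.backward() can
    backpropagate through the whole ODE trajectory."""
    h = (t1 - t0) / steps
    z, t = z0, t0
    for _ in range(steps):
        k1 = func(t, z)
        k2 = func(t + h / 2, z + h / 2 * k1)
        k3 = func(t + h / 2, z + h / 2 * k2)
        k4 = func(t + h, z + h * k3)
        z = z + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t = t + h
    return z

torch.manual_seed(0)
dim = 2
func = ODEFunc(dim)
opt = torch.optim.Adam(func.parameters(), lr=1e-2)
z0 = torch.randn(32, dim)
target = torch.zeros(32, dim)  # toy target just for the sketch
for _ in range(50):
    opt.zero_grad()
    z1 = rk4_integrate(func, z0)       # forward: ODE solve
    loss = ((z1 - target) ** 2).mean()
    loss.backward()                    # backward: builtin PyTorch backprop
    opt.step()                         # gradient descent via Adam
```

An adaptive solver (e.g. torch-rk45, mentioned in the commits below) would replace `rk4_integrate`, but the training loop stays the same.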

…eter scale (Tensor of shape ()) of distribution Normal(loc: 0.0, scale: 0.0) to satisfy the constraint GreaterThan(lower_bound=0.0), but found invalid values: 0.0
…eavily dependent on h and not stable when changing it; needs more experimentation, and the next step should be RK45
…s - too slow - just to isolate the stalling issue
…is exploding-gradient problem led to an underflow exception during the forward phase of the ODE solve
…ckward scheme through the ODE trajectory. Forward is torch-rk45 integrate and backward is builtin PyTorch backprop
…my anode extension, updated by gradient descent and the builtin backward mechanism of PyTorch
…egrate for the forward pass; backward is implemented implicitly via PyTorch backward gradient descent (Adam)
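One commit above notes an exploding-gradient problem that caused an underflow during the forward ODE solve. A standard mitigation (not necessarily what this branch does) is clipping the global gradient norm before each optimizer step:

```python
# Hypothetical mitigation sketch for the exploding-gradient issue noted
# in the commit log: clip the global gradient norm before opt.step().
import torch

model = torch.nn.Linear(4, 4)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(8, 4)

opt.zero_grad()
loss = (model(x) ** 2).sum() * 1e6   # artificially large loss -> huge grads
loss.backward()
# Rescales all gradients so their combined L2 norm is at most 1.0.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
opt.step()
```

Clipping bounds the parameter update per step, which keeps the ODE state from being pushed into regions where the solver's step-size control underflows.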