A PyTorch library dedicated to neural differential equations and equilibrium models. Maintained by DiffEqML.
Neural differential equations made easy:

```python
import torch.nn as nn
from torchdyn import NeuralODE

# your preferred torch.nn.Module here; the vector field must preserve
# the input shape, hence padding=1 on the convolutions
f = nn.Sequential(nn.Conv2d(1, 32, 3, padding=1),
                  nn.Softplus(),
                  nn.Conv2d(32, 1, 3, padding=1))

nde = NeuralODE(f)
```
And you're good to train. Feel free to combine the `NeuralODE` class with any PyTorch modules to build derivative models. We offer additional tools to build custom neural differential equation and implicit models, including a functional API for numerical methods. There is much more in `torchdyn` beyond the `NeuralODE` and `NeuralSDE` classes: tutorials, a functional API to a variety of GPU-compatible numerical methods, benchmarks, and more.
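As a rough sketch of what training looks like, the snippet below wraps a vector field in `NeuralODE` and fits a toy classifier end-to-end. The data, readout layer, and hyperparameters are hypothetical, and the forward return convention (terminal state vs. `(t_eval, trajectory)`) has changed across `torchdyn` versions, so adjust the unpacking to match your install.

```python
import torch
import torch.nn as nn
from torchdyn import NeuralODE

# illustrative shape-preserving vector field on 2-D states
f = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))
nde = NeuralODE(f, sensitivity='adjoint', solver='dopri5')
readout = nn.Linear(2, 1)                   # hypothetical task head

x = torch.randn(128, 2)                     # toy inputs
y = (x[:, 0] > 0).float()                   # toy binary labels
t_span = torch.linspace(0., 1., 2)

opt = torch.optim.Adam(list(nde.parameters()) + list(readout.parameters()), lr=1e-3)
for step in range(100):
    t_eval, traj = nde(x, t_span)           # recent versions return (t_eval, trajectory)
    logits = readout(traj[-1]).squeeze(-1)  # read out from the terminal state
    loss = nn.functional.binary_cross_entropy_with_logits(logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```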
Contribute to the library with your benchmark and model variants! No need to reinvent the wheel :)
Stable release:

```
pip install torchdyn
```
We note that PyPI currently does not allow non-packaged dependencies: you will have to install `torchsde` and `torchcde` separately, or build a virtual environment with `poetry` following the steps outlined in Contributing.
Bleeding-edge version:

```
git clone https://github.com/DiffEqML/torchdyn.git && cd torchdyn && python setup.py install
```
Don't forget to install into your environment of choice, if necessary. We offer an automated method for setting up your `torchdyn` environment, designed specifically for contributors or those planning to tinker with the internals; check Contributing below for more details.
Check our wiki for a full description of available features.
Interest in the blend of differential equations, deep learning and dynamical systems has been reignited by recent works [1, 2, 3, 4]. Modern deep learning frameworks such as PyTorch, coupled with further improvements in computational resources, have allowed the continuous version of neural networks, with proposals dating back to the 80s [5], to finally come to life and provide a novel perspective on classical machine learning problems. Central to the `torchdyn` approach are continuous and implicit neural networks, where layer depth is taken to an infinite limit.
By providing a centralized, easy-to-access collection of model templates, tutorial and application notebooks, we hope to speed up research in this area and ultimately establish neural differential equations and implicit models as an effective tool for control, system identification and general machine learning tasks.
`torchdyn` leverages modern PyTorch best practices and handles training with `pytorch-lightning` [6]. We build Graph Neural ODEs utilizing the Graph Neural Networks (GNNs) API of `dgl` [7]. For a complete list of references, check `pyproject.toml`. We offer a complete suite of ODE solvers and sensitivity methods, extending the functionality offered by `torchdiffeq` [1]. We have light dependencies on `torchsde` [7] and `torchcde` [8].
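As a hedged sketch of the functional numerical API mentioned above, the snippet below integrates a simple vector field directly. It assumes `torchdyn.numerics` exposes an `odeint`-style call with a `torchdiffeq`-like (vector field, initial state, time span) signature and an (evaluation times, solution) return value; treat these names as assumptions and check the wiki for the exact interface of your installed version.

```python
# Sketch only: the odeint signature and return convention below are assumptions.
import torch
from torchdyn.numerics import odeint

def f(t, x):
    # simple linear vector field dx/dt = -x
    return -x

x0 = torch.randn(32, 2)
t_span = torch.linspace(0., 1., 10)
t_eval, sol = odeint(f, x0, t_span, solver='dopri5')
print(sol.shape)  # expected (len(t_span), 32, 2) under the assumed convention
```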
`torchdyn` contains a variety of self-contained quickstart examples and tutorials built for practitioners and researchers. Refer to the tutorial readme.
`torchdyn` is designed to be a community effort: we welcome all contributions of tutorials, model variants, numerical methods and applications related to continuous and implicit deep learning. We do not have specific style requirements, though we subscribe to many of Jeremy Howard's ideas.
We use `poetry` to manage requirements, virtual python environment creation, and packaging. To install `poetry`, refer to the docs. To set up your dev environment, run `poetry install`. For example, `poetry run pytest` will then run all `torchdyn` tests inside your newly created env.
`poetry` does not currently offer a way to select `torch` wheels based on the desired CUDA version and OS, and will install a version without GPU support. For CUDA `torch` wheels, run `poetry run poe force_cuda11`, or add your version to `pyproject.toml`.
If you wish to run `jupyter` notebooks within your newly created poetry environment, use `poetry run ipython kernel install --user --name=torchdyn` and switch the notebook kernel.
Choosing what to work on: There is always ongoing work on new features, tests and tutorials. Contributing to any of the above is extremely valuable to us. If you wish to work on additional features not currently WIP, feel free to reach out on Slack or via email. We'll be glad to discuss details.
If you find `torchdyn` valuable for your research or applied projects:

```
@article{poli2020torchdyn,
  title={TorchDyn: A Neural Differential Equations Library},
  author={Poli, Michael and Massaroli, Stefano and Yamashita, Atsushi and Asama, Hajime and Park, Jinkyoo},
  journal={arXiv preprint arXiv:2009.09346},
  year={2020}
}
```