California Institute of Technology
- Pasadena, US
- https://jberner.info
- https://orcid.org/0000-0002-5648-648X
- @julberner
- in/julius-berner
Stars
Improved sampling via learned diffusions (ICLR 2024) and an optimal control perspective on diffusion-based generative modeling (TMLR 2024)
Code for "Improving Diffusion Inverse Problem Solving with Decoupled Noise Annealing"
Learning in infinite dimension with neural operators.
Code for "DPOT: Auto-Regressive Denoising Operator Transformer for Large-Scale PDE Pre-Training"
Codomain attention neural operator for single to multi-physics PDE adaptation.
thu-ml / DPOT
Forked from HaoZhongkai/DPOT
Code for "DPOT: Auto-Regressive Denoising Operator Transformer for Large-Scale PDE Pre-Training"
Code for the paper "Hyperbolic Image-Text Representations", Desai et al, ICML 2023
Scaling behavior of Physics Informed Neural Networks for solving partial differential equations
Code for ICML 2019 paper "Probabilistic Neural-symbolic Models for Interpretable Visual Question Answering" [long-oral]
Training ReLU networks to high uniform accuracy is intractable
Numerically Solving Parametric Families of High-Dimensional Kolmogorov Partial Differential Equations via Deep Learning
DeepErwin is a Python 3.8+ package that implements and optimizes JAX 2.x wave function models for numerical solutions to the multi-electron Schrödinger equation. DeepErwin supports weight-sharing w…
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.