Stars
A concise but complete full-attention transformer with a set of promising experimental features from various papers
Efficient implementations of state-of-the-art linear attention models in PyTorch and Triton
A project reviewing and exploring the analysis of aperiodic neural activity over time.
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
Implementations of various linear RNN layers using PyTorch and Triton
Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/
Open weights language model from Google DeepMind, based on Griffin.
Structured state space sequence models
Building blocks for foundation models.
Long Range Arena for Benchmarking Efficient Transformers
Differentiable neuron simulations with biophysical detail on CPU, GPU, or TPU.
A logical, reasonably standardized, but flexible project structure for doing and sharing data science work.
A professional list of Papers, Tutorials, and Surveys on AI for Time Series in top AI conferences and journals.
Python library for designing and training your own Diffusion Models with PyTorch.
This repository contains research code for the paper "Generating realistic neurophysiological time series with denoising diffusion probabilistic models". @jsvetter
Refine high-quality datasets and visual AI models