Stars
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools so that you can focus on what matters.
🏡 Open source home automation that puts local control and privacy first.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
The lean application framework for Python. Build sophisticated user interfaces with a simple Python API. Run your apps in the terminal and a web browser.
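As a hedged illustration of that Python API, a minimal Textual app might look like the sketch below; the widget choice and text are made up for the example, not taken from the description.

```python
# A minimal sketch of a Textual app; the widgets and text are illustrative.
from textual.app import App, ComposeResult
from textual.widgets import Header, Footer, Static

class HelloApp(App):
    """Tiny terminal UI: a header, one line of text, and a footer showing key bindings."""

    BINDINGS = [("q", "quit", "Quit")]

    def compose(self) -> ComposeResult:
        yield Header()
        yield Static("Hello from the terminal!")
        yield Footer()

if __name__ == "__main__":
    HelloApp().run()
```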
Machine Learning From Scratch. Bare-bones NumPy implementations of machine learning models and algorithms with a focus on accessibility. Aims to cover everything from linear regression to deep learning.
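In that spirit, here is a minimal sketch of one such bare-bones model, plain-NumPy linear regression fit by gradient descent; the synthetic data and hyperparameters are illustrative, not code from the repository.

```python
# Illustrative bare-bones linear regression via gradient descent in NumPy;
# not from the repository, just the style of model it implements.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                      # 200 samples, 3 features
true_w, true_b = np.array([2.0, -1.0, 0.5]), 0.3
y = X @ true_w + true_b + 0.1 * rng.normal(size=200)

w, b, lr = np.zeros(3), 0.0, 0.1
for _ in range(500):
    err = X @ w + b - y                            # prediction error
    w -= lr * (X.T @ err) / len(y)                 # gradient step on mean squared error
    b -= lr * err.mean()

print(np.round(w, 2), round(b, 2))                 # close to the true parameters
```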
The ChatGPT Retrieval Plugin lets you easily find personal or work documents by asking questions in natural language.
RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". It combines the best of RNNs and transformers.
Style transfer, deep learning, feature transform
Advanced Python Mastery (course by @dabeaz)
Implementation of Denoising Diffusion Probabilistic Model in Pytorch
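If this is the widely used denoising-diffusion-pytorch package, training and sampling look roughly like the sketch below; the model size, image resolution, and stand-in batch are assumptions for illustration.

```python
# Hedged sketch assuming the denoising-diffusion-pytorch interface
# (Unet + GaussianDiffusion); sizes and data are illustrative.
import torch
from denoising_diffusion_pytorch import Unet, GaussianDiffusion

model = Unet(dim=64, dim_mults=(1, 2, 4, 8))
diffusion = GaussianDiffusion(model, image_size=64, timesteps=1000)

images = torch.rand(8, 3, 64, 64)          # stand-in training batch in [0, 1]
loss = diffusion(images)                   # denoising training loss
loss.backward()

samples = diffusion.sample(batch_size=4)   # generate new images after training
```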
An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
TimesFM (Time Series Foundation Model) is a pretrained time-series foundation model developed by Google Research for time-series forecasting.
Sacred is a tool, developed at IDSIA, to help you configure, organize, log, and reproduce experiments.
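A hedged sketch of a Sacred experiment; the experiment name and hyperparameters below are made up for illustration.

```python
# Hedged sketch of a Sacred experiment; names and values are illustrative.
from sacred import Experiment

ex = Experiment("toy_experiment")

@ex.config
def config():
    learning_rate = 0.01   # captured in the run's config and logged for reproducibility
    epochs = 10

@ex.automain
def main(learning_rate, epochs):
    # Sacred injects the config values and records the run (seed, config, host info).
    print(f"training for {epochs} epochs at lr={learning_rate}")
```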
A tour in the wonderland of math with Python.
STUMPY is a powerful and scalable Python library for modern time series analysis
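A hedged sketch of the core STUMPY workflow, computing a matrix profile and reading off the top motif; the synthetic series and window length are illustrative assumptions.

```python
# Hedged sketch: compute a matrix profile with STUMPY and locate the top motif.
import numpy as np
import stumpy

ts = np.sin(np.linspace(0, 20 * np.pi, 1_000)) + 0.1 * np.random.randn(1_000)
m = 50                                   # subsequence (window) length
mp = stumpy.stump(ts, m)                 # columns: profile value, nearest-neighbor index, ...

motif_idx = int(np.argmin(mp[:, 0].astype(float)))   # best-conserved subsequence
print("motif at", motif_idx, "nearest neighbor at", int(mp[motif_idx, 1]))
```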
An unnecessarily tiny implementation of GPT-2 in NumPy.
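In the same spirit, here is a hedged sketch (not code from the repository) of the kind of primitive such a NumPy GPT is built from, single-head causal self-attention:

```python
# Illustrative single-head causal self-attention in plain NumPy;
# not taken from the repository, just the core GPT-2 building block.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def causal_self_attention(q, k, v):
    # q, k, v: [seq_len, head_dim]
    scores = q @ k.T / np.sqrt(q.shape[-1])
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)  # block attention to future tokens
    scores = np.where(mask, -1e10, scores)
    return softmax(scores) @ v

x = np.random.randn(8, 16)
print(causal_self_attention(x, x, x).shape)  # (8, 16)
```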
Chronos: Pretrained Models for Probabilistic Time Series Forecasting
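A hedged sketch of zero-shot forecasting with a pretrained Chronos checkpoint, assuming the chronos-forecasting package and the amazon/chronos-t5-small weights; the context series and horizon are illustrative.

```python
# Hedged sketch: zero-shot probabilistic forecasting with a pretrained Chronos model.
import torch
from chronos import ChronosPipeline

pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",
    device_map="cpu",
    torch_dtype=torch.float32,
)

context = torch.arange(100, dtype=torch.float32)            # stand-in history
forecast = pipeline.predict(context, prediction_length=12)  # samples from the predictive distribution

# Summarize the sample paths into quantile forecasts.
low, median, high = torch.quantile(forecast[0], torch.tensor([0.1, 0.5, 0.9]), dim=0)
print(median)
```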
A replication of DeepSeek-R1-Zero and DeepSeek-R1 training on small models with limited data.
Conditional Transformer Language Model for Controllable Generation
Implementation of papers in 100 lines of code.
EEG Transformer 2.0: (i) Convolutional Transformer for EEG decoding; (ii) a novel visualization, Class Activation Topography.
PyTorch implementations of several SOTA backbone deep neural networks (such as ResNet, ResNeXt, RegNet) on one-dimensional (1D) signal/time-series data.
A Python 3 library that uses matrix profile algorithms to make time series data mining tasks accessible to everyone.
Miscellaneous utility functions, decorators, and modules for PyTorch and Accelerate that help speed up the implementation of new AI research.