San Francisco Bay Area · (UTC -07:00) · www.meta.com · @_smuddu
Stars
A programming font focused on source code legibility
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools so that you can focus on what matters.
🦜🔗 Build context-aware reasoning applications
[ACL 2023] One Embedder, Any Task: Instruction-Finetuned Text Embeddings
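A minimal sketch of that instruction-conditioned embedding workflow, following the INSTRUCTOR project's documented quick-start; the instruction strings and sentences below are illustrative placeholders.

```python
# Minimal sketch: instruction-finetuned embeddings with the INSTRUCTOR model.
# Assumes the InstructorEmbedding package and the hkunlp/instructor-large checkpoint.
from InstructorEmbedding import INSTRUCTOR

model = INSTRUCTOR("hkunlp/instructor-large")

# Each input is an [instruction, text] pair; the instruction steers the embedding
# toward the downstream task (retrieval, classification, clustering, ...).
pairs = [
    ["Represent the science title:", "Instruction-finetuned text embeddings"],
    ["Represent the question for retrieving documents:", "How do embeddings encode tasks?"],
]
embeddings = model.encode(pairs)
print(embeddings.shape)  # e.g. (2, 768) for instructor-large
```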
the AI-native open-source embedding database
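A minimal sketch of the embedding-database workflow using Chroma's in-memory client; the collection name, documents, and ids below are illustrative placeholders.

```python
# Minimal sketch: store documents in Chroma and run a similarity query.
import chromadb

client = chromadb.Client()  # in-memory client; no server required
collection = client.create_collection(name="starred_repos")

collection.add(
    documents=[
        "Robust speech recognition via large-scale weak supervision.",
        "A tensor container dedicated to PyTorch.",
    ],
    ids=["whisper", "tensordict"],
)

# Query text is embedded with the collection's default embedding function.
results = collection.query(query_texts=["speech to text"], n_results=1)
print(results["ids"], results["documents"])
```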
Run LLaMA (and Stanford-Alpaca) inference on Apple Silicon GPUs.
The Incredible PyTorch: a curated list of tutorials, papers, projects, communities and more relating to PyTorch.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
TensorDict is a tensor container dedicated to PyTorch.
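A minimal sketch of what the container buys you: tensors that share leading batch dimensions live in one TensorDict, so indexing and device moves apply to every entry at once. The keys and shapes below are illustrative.

```python
# Minimal sketch: batching heterogeneous tensors in a single TensorDict.
import torch
from tensordict import TensorDict

data = TensorDict(
    {"obs": torch.randn(4, 3, 32, 32), "action": torch.randn(4, 2)},
    batch_size=[4],  # leading dimension shared by every entry
)

half = data[:2]            # indexing slices every tensor in the container
cpu_copy = data.to("cpu")  # device moves propagate to all entries
print(half["obs"].shape)   # torch.Size([2, 3, 32, 32])
```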
The superconsole crate provides a handler and building blocks for powerful, yet minimally intrusive TUIs. It is cross platform, supporting Windows 7+, Linux, and MacOS. Rustaceans who want to creat…
[NeurIPS '22] ∞-AE model's implementation in JAX. Kernel-only method outperforms complicated SoTA models with a closed-form solution and a single hyper-parameter.
torch::deploy (multipy for non-torch uses) is a system that lets you get around the GIL problem by running multiple Python interpreters in a single C++ process.
A lightweight library for PyTorch training tools and utilities
Kernl lets you run PyTorch transformer models several times faster on GPU with a single line of code, and is designed to be easily hackable.
Metric learning and retrieval pipelines, models and zoo.
Robust Speech Recognition via Large-Scale Weak Supervision
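That entry is OpenAI's Whisper; a minimal sketch of its documented transcription API follows, with a placeholder audio filename.

```python
# Minimal sketch: transcribing an audio file with Whisper.
# "audio.mp3" is a placeholder; any ffmpeg-readable file works.
import whisper

model = whisper.load_model("base")   # model sizes: tiny, base, small, medium, large
result = model.transcribe("audio.mp3")
print(result["text"])                # full transcription as a single string
```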
Extended Isolation Forest for Anomaly Detection
A walkthrough of transformer architecture code
🎓 Sharing machine learning course / lecture notes.
Survival analysis and time-to-failure predictive modeling using Weibull distributions and Recurrent Neural Networks in Keras
PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioners, low-rank approximation preconditioner, and more)
Scalable and user-friendly neural 🧠 forecasting algorithms.
Variational Recurrent Autoencoder for time-series clustering in PyTorch
Joining the modern data stack with the modern ML stack