Meditron is a suite of open-source medical Large Language Models (LLMs).
Accelerating your LLM training to full speed! Made with ❤️ by ServiceNow Research
DISCO is a code-free and installation-free browser platform that allows any non-technical user to collaboratively train machine learning models without sharing any private data.
Codebase for ICML submission "DOGE: Domain Reweighting with Generalization Estimation"
Language Identification with Support for More Than 2000 Labels -- EMNLP 2023
Code for NeurIPS 2024 Spotlight: "Scaling Laws and Compute-Optimal Training Beyond Fixed Training Durations"
Freeing data processing from scripting madness by providing a set of platform-agnostic, customizable pipeline-processing blocks.
Tensor computation with WebGPU acceleration
Landmark Attention: Random-Access Infinite Context Length for Transformers
StableLM: Stability AI Language Models
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Practical low-rank gradient compression for distributed optimization: https://arxiv.org/abs/1905.13727
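The core idea of low-rank gradient compression is to replace an m×n gradient matrix with two thin factors (m×r and n×r, r small) before communicating it, cutting bandwidth roughly from m·n to r·(m+n). A minimal NumPy sketch of the technique using a single power iteration, in the spirit of the linked paper; the function names and the choice of a fresh random sketch each call are illustrative assumptions, not the repository's actual API:

```python
import numpy as np

def low_rank_compress(grad, rank=1, seed=0):
    """Approximate grad (m x n) by thin factors p (m x rank) and q (n x rank).

    Sketch of one power iteration: project onto a random subspace,
    orthonormalize, then project back. Only p and q need to be
    communicated between workers instead of the full gradient.
    """
    rng = np.random.default_rng(seed)  # illustrative: real systems reuse/warm-start q
    m, n = grad.shape
    q = rng.standard_normal((n, rank))
    p = grad @ q                # (m, rank) sketch of the column space
    p, _ = np.linalg.qr(p)      # orthonormalize for a stable projection
    q = grad.T @ p              # (n, rank) best fit given the basis p
    return p, q

def decompress(p, q):
    """Reconstruct the rank-r approximation p @ q.T of the gradient."""
    return p @ q.T
```

For a gradient that is exactly rank 1, a rank-1 compression recovers it (up to numerical error), while higher-rank gradients are only approximated; in distributed training the approximation error is typically fed back into the next step rather than discarded.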
Robust Cross-lingual Embeddings from Parallel Sentences
Decentralized Privacy-Preserving Proximity Tracing -- Documents
Example code and applications for machine learning on Graphcore IPUs
Stochastic Gradient Push for Distributed Deep Learning
Introduction to PyTorch Workshop at the AMLD 2019
Open Challenge - Automatic Training for Deep Learning
Unsupervised Scalable Representation Learning for Multivariate Time Series: Experiments
Code and data for the WSDM '19 paper "Crosslingual Document Embedding as Reduced-Rank Ridge Regression (Cr5)"
Learn how to design, develop, deploy and iterate on production-grade ML applications.