Stars
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.
yzygitzh / nccl
Forked from NVIDIA/nccl. Optimized primitives for collective multi-GPU communication.
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
Classical equations and diagrams in machine learning
Links to works on deep learning algorithms for physics problems, TUM-I15 and beyond
[Zoom & Facebook Live] Weekly AI Arxiv, Season 2
Neural Network Acceleration such as ASIC, FPGA, GPU, and PIM
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
WebGL Clustergrammer JavaScript Library
Sphinx theme from Read the Docs
✅ The Node.js best practices list (July 2024)
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
A book series on JavaScript. @YDKJS on twitter.
A guide, not a template, for building electron apps with elm and webpack
Run Keras models in the browser, with GPU support using WebGL
The most cited deep learning papers
A simple, concise tensorflow implementation of style transfer (neural style)
Material Design Components in HTML/CSS/JS
Custom flat theme based on Twitter's Bootstrap for Dojo dijits, dgrid, and esri widgets.