Stars
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.
A concise but complete full-attention transformer with a set of promising experimental features from various papers.
A minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training.
Book about interpretable machine learning
Papers about out-of-distribution generalization on graphs.
Making large AI models cheaper, faster, and more accessible.
Avalanche: an End-to-End Library for Continual Learning based on PyTorch.
Real-time path tracing with global illumination and progressive rendering, built on top of the Three.js WebGL framework. Live demo: https://erichlof.github.io/THREE.js-PathTracing-Rende…
Automated theorem prover for first-order predicate logic written in TypeScript.
A curated list of awesome C++ (or C) frameworks, libraries, resources, and shiny things. Inspired by awesome-... stuff.