Stars
6 stars written in Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
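As a quick illustration of how the library is typically driven, here is a minimal sketch using its pipeline API. The task string and example text are arbitrary choices, and the default model downloaded on first use is whatever the library currently ships:

```python
# Minimal sketch of the Transformers pipeline API.
# Requires: pip install transformers torch
from transformers import pipeline

# "sentiment-analysis" pulls a default pretrained model on first use.
classifier = pipeline("sentiment-analysis")
print(classifier("Transformers makes state-of-the-art models easy to use."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]  (exact output varies by model)
```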
A minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training.
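The core of any such minimal reimplementation is the next-token prediction objective. The toy model below is a stand-in written in plain PyTorch to show that training loop shape; it is not the repo's code:

```python
# Illustrative sketch: the next-token training objective a minimal
# GPT implementation optimizes. TinyGPT is a toy stand-in model.
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, block_size, n_embd = 100, 16, 32

class TinyGPT(nn.Module):
    def __init__(self):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, n_embd)
        self.pos_emb = nn.Parameter(torch.zeros(1, block_size, n_embd))
        layer = nn.TransformerEncoderLayer(n_embd, nhead=4, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(n_embd, vocab_size)

    def forward(self, idx):
        x = self.tok_emb(idx) + self.pos_emb[:, : idx.size(1)]
        # Causal mask: each position may only attend to earlier positions.
        mask = nn.Transformer.generate_square_subsequent_mask(idx.size(1))
        return self.head(self.blocks(x, mask=mask))

model = TinyGPT()
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
tokens = torch.randint(0, vocab_size, (8, block_size + 1))  # fake data
opt.zero_grad()
logits = model(tokens[:, :-1])  # predict token t+1 from tokens <= t
loss = F.cross_entropy(logits.reshape(-1, vocab_size), tokens[:, 1:].reshape(-1))
loss.backward()
opt.step()
```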
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
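This description matches Horovod's tagline; assuming that, here is a minimal sketch of its usual PyTorch integration (initialize, pin a GPU per process, wrap the optimizer, broadcast the initial state):

```python
# Hedged sketch assuming Horovod's PyTorch API (horovod.torch).
# Launched as one process per GPU, e.g. `horovodrun -np 4 python train.py`.
import torch
import horovod.torch as hvd

hvd.init()
if torch.cuda.is_available():
    torch.cuda.set_device(hvd.local_rank())

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01 * hvd.size())

# Wrap the optimizer so gradients are allreduced across workers,
# then start every worker from rank 0's weights.
optimizer = hvd.DistributedOptimizer(optimizer, named_parameters=model.named_parameters())
hvd.broadcast_parameters(model.state_dict(), root_rank=0)
```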
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.
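Of the techniques listed, LoRA is compact enough to sketch: freeze the pretrained weight and train only a low-rank additive update. This shows the general technique, not this repo's implementation:

```python
# Illustrative LoRA layer (the general technique, not this repo's code):
# y = W x + (alpha / r) * B(A(x)), with W frozen and only A, B trained.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)  # freeze the pretrained layer
        self.A = nn.Linear(base.in_features, r, bias=False)
        self.B = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.B.weight)  # update starts as a no-op
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + self.scale * self.B(self.A(x))

layer = LoRALinear(nn.Linear(64, 64))
# Only the small A and B matrices are trainable.
print(sum(p.numel() for p in layer.parameters() if p.requires_grad))
```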
A simple, concise TensorFlow implementation of style transfer (neural style).
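The heart of neural style (Gatys et al.) is matching Gram matrices of CNN feature maps for style while matching raw feature maps for content. A sketch of that loss follows, written in PyTorch for consistency with the examples above even though the starred repo uses TensorFlow; the random tensors stand in for real feature maps:

```python
# Core of Gatys-style neural style transfer: content loss on raw
# features, style loss on Gram matrices (channel correlations).
import torch
import torch.nn.functional as F

def gram(feat):  # feat: (C, H, W) feature map from a conv layer
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    return (f @ f.T) / (c * h * w)

content_f = torch.randn(64, 32, 32)   # stand-ins for CNN features
style_f = torch.randn(64, 32, 32)
generated = torch.randn(64, 32, 32, requires_grad=True)

# The style weight (1e3 here) is an arbitrary illustrative choice.
loss = F.mse_loss(generated, content_f) + 1e3 * F.mse_loss(gram(generated), gram(style_f))
loss.backward()  # in the real method, this gradient updates the image pixels
```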