6 stars written in Python

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Python · 137,620 stars · 27,565 forks · Updated Jan 15, 2025

A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training.
Python · 20,806 stars · 2,680 forks · Updated Aug 15, 2024

Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
Python · 14,337 stars · 2,246 forks · Updated Dec 12, 2024

Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.
Python · 6,021 stars · 518 forks · Updated Sep 6, 2024

A simple, concise TensorFlow implementation of style transfer (neural style).
Python · 297 stars · 108 forks · Updated Feb 17, 2017