- Munich, Germany
- https://lpq29743.github.io/
- @lpq29743
Stars
Memory-efficient fine-tuning; supports fine-tuning a 7B model with 24 GB of GPU memory
Welcome to the Llama Cookbook! This is your go-to guide for building with Llama: getting started with inference, fine-tuning, and RAG. We also show you how to solve end-to-end problems using Llama models.
LAVIS - A One-stop Library for Language-Vision Intelligence
👓 A web interface for gpustat: monitor GPU clusters at a glance
🤗 The largest hub of ready-to-use datasets for ML models with fast, easy-to-use and efficient data manipulation tools
A framework for few-shot evaluation of language models.
Toolkit for creating, sharing and using natural language prompts.
Data and software for building the ACL Anthology.
Chinese/English sensitive-word lists, language detection, Chinese and international phone number location/carrier lookup, gender inference from names, phone number extraction, ID card number extraction, email extraction, Chinese and Japanese name databases, Chinese abbreviation database, character-decomposition dictionary, word sentiment scores, stop words, subversive-word list, terrorism-related word list, simplified/traditional Chinese conversion, English words mimicking Chinese pronunciation, Wang Feng lyric generator, occupation name lexicon, synonym database, antonym database, negation word list, car brand lexicon, car parts lexicon, continuous English word segmentation, various Chinese word embeddings, company name lists, classical Chinese poetry database, IT lexicon, finance lexicon, idiom lexicon, place name lexicon, …
The AI developer platform. Use Weights & Biases to train and fine-tune models, and manage models from experimentation to production.
A Unified Library for Parameter-Efficient and Modular Transfer Learning
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
BLOOM+1: Adapting the BLOOM model to support a new unseen language
The official GitHub page for the survey paper "A Survey of Large Language Models".
Central place for the engineering/scaling WG: documentation, SLURM scripts and logs, compute environment and data.
A tool that locates, downloads, and extracts machine translation corpora
The official designated GitHub for global "Run Xue" (the study of emigrating): organizes the purpose, program, theory, and real-world examples of "running"; addresses the three big questions of why to run, where to run, and how to run; and aims to become the core religion and core belief of the new Chinese people.
XTREME is a benchmark for the evaluation of the cross-lingual generalization ability of pre-trained multilingual models that covers 40 typologically diverse languages and includes nine tasks.
Unsupervised text tokenizer for Neural Network-based text generation.
Apps/CLIs/configs I use on macOS/iOS: Fish, Karabiner, Cursor, ...
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
Tensors and Dynamic neural networks in Python with strong GPU acceleration