Stars
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
🔥Highlighting the top ML papers every week.
Chinese LLaMA & Alpaca large language models + local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
Classify and extract structured data with LLMs
ChatRWKV is like ChatGPT but powered by the RWKV (100% RNN) language model, and open source.
Must-read papers on prompt-based tuning for pre-trained language models.
CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image
A blazingly fast JSON serializing & deserializing library
🏆 A ranked list of awesome machine learning Python libraries. Updated weekly.
Easy-to-use, modular, and extensible package of deep-learning-based CTR models.
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning …
"Dive into Deep Learning" (《动手学深度学习》): aimed at Chinese readers, with runnable code and open discussion. The Chinese and English editions are used for teaching at over 500 universities in more than 70 countries.
Matlab implementation of the ECO tracker.
uoip / KCFcpp-py-wrapper
Forked from joaofaro/KCFcpp. Python wrapper for KCFcpp.
An Open Source Machine Learning Framework for Everyone