Stars
Drivers that would work with Windows 10 x64 and fix scrolling for Apple Magic Mouse 2
Beautiful Chat components for C# Windows forms including Chat User, Chat Header, Chat Bubbles, TypingBox, SearchBox and BeautyForm
PPTAgent: Generating and Evaluating Presentations Beyond Text-to-Slides
MS Word Add-In for Rephrasing Using a Fine-Tuned GPT-3 Model.
Compose Multiplatform, a modern UI framework for Kotlin that makes building performant and beautiful user interfaces easy and enjoyable.
The most powerful and modular diffusion model GUI, api and backend with a graph/nodes interface.
🤖 Chat with your SQL database 📊. Accurate Text-to-SQL Generation via LLMs using RAG 🔄.
Code and documentation to train Stanford's Alpaca models, and generate the data.
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
Code and documents of LongLoRA and LongAlpaca (ICLR 2024 Oral)
The RedPajama-Data repository contains code for preparing large datasets for training large language models.
OpenLLaMA, a permissively licensed open source reproduction of Meta AI’s LLaMA 7B trained on the RedPajama dataset
Guide for fine-tuning Llama/Mistral/CodeLlama models and more
An efficient and effective few-shot NL2SQL method based on GPT-4.
[ICLR'24 spotlight] An open platform for training, serving, and evaluating large language models for tool learning.
Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features, and more.
Llama Chinese community: the Llama 3 online demo and fine-tuned models are now available, with the latest Llama 3 learning resources aggregated in real time; all code has been updated for Llama 3, aiming to build the best Chinese Llama LLM. Fully open source and commercially usable.
Repo for adapting Meta LLaMA 2 to Chinese! A Chinese adaptation of Meta's newly released LLaMA 2. (Fully open source and commercially usable.)
modular-ml / wrapyfi-examples_llama
Forked from meta-llama/llama. Inference code for Facebook LLaMA models with Wrapyfi support.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
SuperCLUE: a comprehensive benchmark for general-purpose Chinese large models | A Benchmark for Foundation Models in Chinese
Phase 2 of the Chinese LLaMA-2 & Alpaca-2 large model project, plus 64K ultra-long context models (Chinese LLaMA-2 & Alpaca-2 LLMs with 64K long context models)
AirLLM: 70B model inference on a single 4GB GPU