Stars
⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains
Autonomous coding agent right in your IDE, capable of creating/editing files, executing commands, using the browser, and more with your permission every step of the way.
Agno is a lightweight framework for building multi-modal Agents
Cross-platform WebView library in Rust for Tauri.
Cross-Platform, GPU Accelerated Whisper 🏎️
Incredibly fast JavaScript runtime, bundler, test runner, and package manager – all in one
Distribute and run LLMs with a single file.
g1: Using Llama-3.1 70b on Groq to create o1-like reasoning chains
Using Groq, OpenAI, or Ollama to create o1-like reasoning chains
`llm-chain` is a powerful Rust crate for building chains in large language models, allowing you to summarise text and complete complex tasks
🦀 A curated list of Rust tools, libraries, and frameworks for working with LLMs, GPT, AI
The Cloud Operational Data Store: use SQL to transform, deliver, and act on fast-changing data.
Qdrant - High-performance, massive-scale Vector Database and Vector Search Engine for the next generation of AI. Also available in the cloud https://cloud.qdrant.io/
Gaining advanced insights from Git repository history.
cuVS - a library for vector search and clustering on the GPU
Lord of Large Language and Multi-modal Systems Web User Interface
Daikatana restoration aiming to have a clean 1.2 source and to support modern systems
General purpose GPU compute framework built on Vulkan to support 1000s of cross vendor graphics cards (AMD, Qualcomm, NVIDIA & friends). Blazing fast, mobile-enabled, asynchronous and optimized for…
JuiceFS is a distributed POSIX file system built on top of Redis and S3.
Weaviate is an open-source vector database that stores both objects and vectors, allowing for the combination of vector search with structured filtering with the fault tolerance and scalability of …
GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.