Stars
stackblitz-labs / bolt.diy
Forked from stackblitz/bolt.new
Prompt, run, edit, and deploy full-stack web applications using any LLM you want!
Open source JSON/CSV representations of the cards for the Flesh and Blood TCG
JavaScript player library / DASH & HLS client / MSE-EME player
Intelligent Trading Bot: Automatically generating signals and trading based on machine learning and feature engineering
A web tool to keep track of Satisfactory logistics
A calmer internet, without any gimmicks.
The Garmin Connect integration allows you to expose data from Garmin Connect to Home Assistant.
A standalone RPC server based on HomeAssistant's Silicon Labs multiprotocol addon
A minimalistic typing speed tester
Upload media to Cloudinary service
A plugin to add descriptive popovers to field labels in Payload.
The official demo for Payload 3.0
Haptic input knob with software-defined endstops and virtual detents
The world's most flexible commerce platform.
Private chat with local GPT with document, images, video, etc. 100% private, Apache 2.0. Supports oLLaMa, Mixtral, llama.cpp, and more. Demo: https://gpt.h2o.ai/ https://gpt-docs.h2o.ai/
TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and build TensorRT engines that contain state-of-the-art optimizations to perform inference efficie…
A developer reference project for creating Retrieval Augmented Generation (RAG) chatbots on Windows using TensorRT-LLM
A simple Vue plugin for toast notifications, styled with Tailwind
A set of beautifully-designed, accessible components and a code distribution platform. Works with your favorite frameworks. Open Source. Open Code.
This repository contains a pure C++ ONNX implementation of multiple offline AI models, such as StableDiffusion (1.5 and XL), ControlNet, Midas, HED and OpenPose.
Download your Spotify playlists and songs along with album art and metadata (from YouTube if a match is found).
gid-oss / dataui-nestjs-crud
Forked from rewiko/crud
NestJs CRUD for RESTful APIs
The simplest way to run Alpaca (and other LLaMA-based local LLMs) on your own computer