Guide to self-hosting AI models using Traefik on a home network, offering cost-effective and controlled alternatives to cloud-based services.
Updated Dec 18, 2023 · Makefile
Extremely simple chat interface for ollama models.
Spring break project for easier access to 'ollama' language models.
One chat UI for Ollama.
Docker Compose setup to run Ollama, Flowise, Langfuse, and Open WebUI.
A simple interface for interacting with LLMs via a local installation of Ollama
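At their core, clients like these talk to Ollama's local HTTP API. A minimal sketch of that interaction, assuming Ollama's default endpoint at `http://localhost:11434` and that a model has already been pulled (both assumptions, not details from any listed project): the client POSTs a prompt to `/api/generate` and reassembles the newline-delimited JSON stream.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: default install, no custom port).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": True}

def join_stream(lines) -> str:
    """Ollama streams newline-delimited JSON objects; concatenate the
    'response' chunks until a chunk reports done=true."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

def ask(model: str, prompt: str) -> str:
    """Send one prompt to a locally running Ollama server
    (requires `ollama serve` to be running)."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return join_stream(resp)
```

A GUI on top of this only needs to wire `ask()` (or an incremental version of `join_stream`) to a text box and a submit button; the streaming format is what lets these UIs show tokens as they arrive.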
Simple and easy Ollama web UI.
Simple web UI for Ollama
Frontend for the Ollama LLM, built with React.js and Flux architecture.
A small web application for chatting with local LLMs via the Ollama API.
Explore and Use Ollama with a Streamlit App!
Odin Runes, a Java-based GPT client, lets you interact with your preferred GPT model directly from your favorite text editor. It also supports prompt engineering by extracting context from diverse sources (for example via OCR), improving productivity and reducing costs.
Your gateway to both Ollama & Apple MLX models.
A cross-platform local AI chat client, compatible with Ollama and with any large model served through an OpenAI-compatible API. Local deployment protects your data privacy, and it can act as both an Ollama client and an OpenAI client.
Ollama Chat is a GUI for Ollama designed for macOS.
Running Ollama on GitHub Actions.
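Running Ollama in CI can be sketched as a workflow along these lines. This is a hedged sketch, not the listed repository's actual workflow: the install script URL is Ollama's documented installer, and the model choice (`tinyllama`) is an assumption picked for its small size.

```yaml
name: ollama-smoke-test
on: [push]

jobs:
  run-ollama:
    runs-on: ubuntu-latest
    steps:
      - name: Install Ollama (official install script)
        run: curl -fsSL https://ollama.com/install.sh | sh
      - name: Start the server in the background
        run: ollama serve &
      - name: Pull a small model and run a prompt (model choice is an assumption)
        run: |
          ollama pull tinyllama
          ollama run tinyllama "Say hello in one word."
```

CPU-only runners make this slow for anything beyond small models, which is why a lightweight model is the sensible default for a CI smoke test.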
Desktop UI for Ollama made with PyQT
Ollama chat webui - AI Chatbot made with React, Vite, Nest.js, tailwind, shadcn & more