HTMXLLMChat

A simple frontend for chatting with and organizing conversations with Large Language Models.

Features • Usage • Configuration • Development • Credits • License

screenshot

Features

  • Connect to any OpenAI-compatible API, local or external.
  • Switch between APIs and models within conversations.
  • Search thread history based on content, tags, models, and usefulness.
  • Tag threads to keep common topics readily accessible.
  • Mark messages as useful to find them easily later and to feed a basic model ranking system.
  • Customize colors to your preference.

Usage

Run from binaries

  • Download from releases
  • Open a command prompt in the htmx-llmchat directory, then run:
./htmx-llmchat serve

Note: the binaries are unsigned and mostly untested, so you may encounter security warnings on macOS and Windows before being able to run them.

Run from source

git clone https://github.com/erikmillergalow/htmx-llmchat.git
cd htmx-llmchat
templ generate
go run main.go serve
  • Connect to 127.0.0.1:8090 in a web browser

To clone and run from source, you'll need Git, Go, and templ installed on your computer. Node.js (which comes with npm) is only needed for the Tauri packaging steps described under Development.

Configuration

  • Open the API editor and add an API:
    Add API image
  • Enter a display name, the OpenAI-compatible API's /v1 endpoint, and an API key (not always necessary)
  • Press Update for the changes to take effect
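To make the "/v1 endpoint" field concrete, here is a small sketch of the URLs derived from one API base. The Ollama base URL is just an example; any OpenAI-compatible server follows the same pattern:

```python
# Derive the endpoint URLs a chat frontend needs from a single
# OpenAI-compatible base URL (example base: a local Ollama server).
def api_endpoints(base_url: str) -> dict:
    base = base_url.rstrip("/")  # tolerate a trailing slash
    return {
        "models": f"{base}/models",          # lists available models
        "chat": f"{base}/chat/completions",  # chat (and streaming) endpoint
    }

endpoints = api_endpoints("http://localhost:11434/v1")
print(endpoints["models"])  # http://localhost:11434/v1/models
print(endpoints["chat"])    # http://localhost:11434/v1/chat/completions
```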

API Suggestions

Ollama (local)

Local AI (local)

llama-cpp-python (local)

Groq (external, free tier available)

OpenAI (external, paid)

Note: many other options are available; just make sure they support streaming via /v1/chat/completions and /v1/models for listing model options.
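The two requirements in the note can be sketched concretely: a streaming request to /v1/chat/completions sends a JSON body with "stream": true, and the response arrives as Server-Sent Events lines. A minimal payload builder and SSE line parser (the model name is a placeholder) might look like this:

```python
import json

def build_chat_request(model: str, messages: list) -> dict:
    # Minimal body for POST {base}/v1/chat/completions with streaming enabled.
    return {"model": model, "messages": messages, "stream": True}

def parse_sse_line(line: str):
    # Streamed lines look like 'data: {...json chunk...}' or 'data: [DONE]'.
    # Returns the text delta from a chunk, or None for non-content lines.
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):]
    if payload.strip() == "[DONE]":
        return None
    chunk = json.loads(payload)
    return chunk["choices"][0]["delta"].get("content")

body = build_chat_request("llama3", [{"role": "user", "content": "hello"}])
delta = parse_sse_line('data: {"choices":[{"delta":{"content":"Hi"}}]}')
print(body["stream"], delta)  # True Hi
```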

Note: total and 'useful' chat counts are tracked over time per model and can be viewed in the config menu: model ranking

Development

  • air can be installed for live reloading
  • .air.toml included in repo
# run for development

air

# build binaries

make all
  • Tauri can be used to package everything as a standalone desktop app (no browser needed)
# development mode

npm run tauri dev

# create release bundle
# src-tauri/tauri.conf.json beforeBuildCommand is configured for GitHub Actions,
# needs to be modified for local builds

npm run tauri build
  • Packaging runs successfully on macOS, but this issue came up while creating release bundles for Linux/Windows. Need to determine whether the paths API or BaseDirectory can be used to give the PocketBase sidecar access to a non-read-only filesystem.
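For reference, beforeBuildCommand lives under the build key of src-tauri/tauri.conf.json; a local-build override might look like the following sketch (the command values shown are illustrative assumptions, not this repo's actual configuration):

```json
{
  "build": {
    "beforeDevCommand": "air",
    "beforeBuildCommand": "templ generate && make all"
  }
}
```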

Credits

This software uses the following open source packages:

License

MIT