An open-source AI chatbot app template built for local usage, with a focus on privacy.
This is intended to be the UI for the AI chatbot and is designed to be used with a compatible AI model provider.
This template ships with OpenAI gpt-3.5-turbo as the default model. You can also use your own OpenAI-compatible server.
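As a rough illustration, an OpenAI-compatible server is reached by sending standard chat completion requests to its base URL. The endpoint and model name below are assumptions (LM Studio's local server defaults to port 1234); adjust them to match your provider.
# Example request to an OpenAI-compatible server (endpoint and model are assumptions)
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello!"}]}'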
- Node.js (v14 or higher)
- pnpm - Package Manager
- Docker (optional) - For running the Redis KV store
- LM Studio - For loading and starting the AI models
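You can quickly confirm that the command-line prerequisites are installed (LM Studio is a desktop app, so check it by launching it):
# Verify tool versions
node --version
pnpm --version
docker --version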
Clone the repository and navigate to the project folder.
git clone https://github.com/alphaolomi/local-ai-chatbot.git
cd local-ai-chatbot
# Install Redis if not already installed
sudo apt-get install redis-server
# Start Redis
sudo service redis-server start
# Check if Redis is running
redis-cli ping
# Install Redis if not already installed
brew install redis
# Start Redis
brew services start redis
# Check if Redis is running
redis-cli ping
Note: Redis is not officially supported on Windows, so it is recommended to use Docker to run Redis instead.
docker run -d --name redis-stack -p 6379:6379 -p 8001:8001 redis/redis-stack:latest
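To confirm Redis is up, you can check the container from the host; the container name matches the command above, and redis-cli is typically available inside the redis-stack image.
# Check that the container is running
docker ps --filter name=redis-stack
# Ping Redis inside the container
docker exec -it redis-stack redis-cli ping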
You will need to use the environment variables defined in .env.example to run Local AI Chatbot.
Note: You should not commit your .env file, as it will expose secrets.
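A common way to set this up is to copy the example file and fill in your own values (the exact variable names are listed in .env.example):
# Create a local env file from the example (never commit .env)
cp .env.example .env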
pnpm install
pnpm dev
Your app should now be running on localhost:3000.
Learn more about LM Studio
If there are no responses from the AI model, ensure that you have started the model server in LM Studio and that it is running.
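You can also check from the command line that the local server is reachable; this assumes LM Studio's default port of 1234, so adjust the URL if you changed it.
# List the models served by LM Studio's local server
curl http://localhost:1234/v1/models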