This project implements a local chatbot using Retrieval-Augmented Generation (RAG) technology. It consists of three main components: a chatbot API, a user interface, and the infrastructure to support the chatbot system.
It is ideal for developers and researchers looking to implement a customizable, privacy-focused chatbot solution that combines the power of large language models with the flexibility of local deployment.
The chatbot-api component is responsible for handling the core logic of the chatbot, including:
- Processing user inputs
- Generating appropriate responses using RAG technology
- Managing conversation context
- Integrating with external services and databases
Key features include:
- RESTful API endpoints for chatbot interactions
- Document upload and text extraction (PDF, DOCX, TXT)
- Vector-based document search
- Integration with Ollama for language model inference
- Database storage for chat history and documents
Key technologies:
- Python 3.12
- FastAPI
- SQLAlchemy
- Alembic for database migrations
- FAISS for vector storage
- Sentence Transformers for text embedding
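For orientation, the sketch below shows how Sentence Transformers and FAISS are typically combined for vector-based document search. The model name, example documents, and index type are illustrative assumptions rather than the exact choices made in this repository.

```python
# Minimal RAG retrieval sketch: embed text with Sentence Transformers,
# index the vectors with FAISS, and fetch the chunks most similar to a query.
# Model name, documents, and index type are illustrative assumptions.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "Ollama serves local large language models over an HTTP API.",
    "FAISS provides fast similarity search over dense vectors.",
    "FastAPI exposes the chatbot endpoints as a RESTful service.",
]

# Normalized embeddings make inner product equivalent to cosine similarity.
embeddings = model.encode(documents, normalize_embeddings=True)
index = faiss.IndexFlatIP(embeddings.shape[1])
index.add(np.asarray(embeddings, dtype="float32"))

# Embed the user question and retrieve the two most similar chunks.
query = model.encode(["How are documents searched?"], normalize_embeddings=True)
scores, ids = index.search(np.asarray(query, dtype="float32"), 2)
retrieved = [documents[i] for i in ids[0]]
print(retrieved)  # context that would be added to the LLM prompt
```

In the actual service, the retrieved chunks would be combined with the user message into a prompt that is sent to Ollama for generation.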
The chatbot-ui component provides the user interface for interacting with the chatbot. It includes:
- A responsive web interface built with React.js
- Real-time chat functionality
- Selection from the available language models
- Document upload functionality
- Chat history display
- Dark mode UI using Material-UI
Key technologies:
- React.js
- Material-UI
- Axios for API communication
The chatbot-infrastructure component manages the deployment, scaling, and monitoring of the chatbot system. It includes:
- Containerization using Docker
- Docker Compose for local development and testing
- PostgreSQL database setup
- Environment configuration for API and UI components
- Docker containers for API, UI, and database
- Makefile for easy management of services
- Integration with locally running Ollama instance
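For a rough picture of how these services might be wired together, here is a simplified Docker Compose sketch. Service names, images, ports, and environment variables are assumptions for illustration; the actual configuration lives in the chatbot-infrastructure component.

```yaml
# Illustrative sketch only -- service names, images, ports, and variables
# are assumptions; see chatbot-infrastructure for the real setup.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_DB: chatbot
      POSTGRES_USER: chatbot
      POSTGRES_PASSWORD: chatbot
    volumes:
      - db_data:/var/lib/postgresql/data

  api:
    build: ../chatbot-api
    ports:
      - "8000:8000"
    environment:
      DATABASE_URL: postgresql://chatbot:chatbot@db:5432/chatbot
      # The API talks to an Ollama instance running on the host machine.
      OLLAMA_BASE_URL: http://host.docker.internal:11434
    depends_on:
      - db

  ui:
    build: ../chatbot-ui
    ports:
      - "3000:3000"
    depends_on:
      - api

volumes:
  db_data:
```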
To get started with the project, follow these steps:
- Clone the repository:

  ```bash
  git clone https://github.com/doanhat/local-chatbot-with-rag-with-fastapi-ollama-react-faiss-postgres.git
  cd local-chatbot-with-rag-with-fastapi-ollama-react-faiss-postgres
  ```

- Set up and start the infrastructure:

  ```bash
  cd chatbot-infrastructure
  make run
  ```

- Access the components:
  - API: http://localhost:8000/api
  - UI: http://localhost:3000
  - API Documentation: http://localhost:8000/docs
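Once the stack is running, a quick way to exercise the API from Python is shown below. The endpoint path and payload shape are assumptions; check the interactive documentation at http://localhost:8000/docs for the actual routes and schemas.

```python
# Quick smoke test against the running API.
# The /chat route and JSON payload below are hypothetical -- verify the
# real endpoints in the OpenAPI docs at http://localhost:8000/docs.
import requests

API_BASE = "http://localhost:8000/api"

response = requests.post(
    f"{API_BASE}/chat",
    json={"message": "What documents have been indexed?"},
    timeout=60,
)
response.raise_for_status()
print(response.json())
```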
For detailed setup instructions for each component, refer to their individual README files: