This is a simple demo of a chat application that integrates Langflow as a low-code backend and Chainlit as an interactive UI. The system enables users to retrieve information from specified documents using a vector database. It also includes authentication, persistence mechanisms, and user feedback integration.
The demo features:
- Langflow as the low-code backend for designing AI workflows.
- Ollama as the model provider (using `nomic-embed-text` for embeddings and `llama3.2-1b` for completion/generation).
- ChromaDB as a simple vector store.
- Chainlit as the frontend UI, supporting both light and dark modes.
- Custom logo stored in the `public` folder.
- Chat history, persistence, and user feedback mechanisms integrated via Literal AI.
- User authentication using an SQLite database.
- Low-Code Backend: Langflow enables intuitive design and modification of AI workflows without writing extensive code.
- Ollama LLMs: Supports `nomic-embed-text` for vector search and `llama3.2-1b` for text generation.
- ChromaDB Vector Store: Simple and lightweight retrieval system for semantic search.
- Chainlit UI: Chatbot interface with real-time updates, supporting both light and dark themes.
- Custom Branding: Includes a personalized logo stored in the `public` folder.
- Persistence & Feedback: Stores chat history and integrates user feedback via Literal AI.
- Authentication: Uses SQLite to manage user credentials securely (a hedged login-check sketch follows this list).
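
As a rough illustration of the authentication feature, here is a minimal sketch of how an SQLite-backed login check can be wired into Chainlit's password auth callback. The database path, `users` table, and column names are assumptions for this sketch, not taken from the repository:

```python
import hashlib
import sqlite3
from typing import Optional

import chainlit as cl

DB_PATH = "users.db"  # assumed path; keep in sync with init_db.py


@cl.password_auth_callback
def auth_callback(username: str, password: str) -> Optional[cl.User]:
    """Look up the user in SQLite and compare the stored password hash."""
    conn = sqlite3.connect(DB_PATH)
    try:
        row = conn.execute(
            "SELECT password_hash FROM users WHERE username = ?", (username,)
        ).fetchone()
    finally:
        conn.close()

    # Demo-grade check: SHA-256 only. Use a real password hasher (e.g. bcrypt) in production.
    if row and row[0] == hashlib.sha256(password.encode()).hexdigest():
        return cl.User(identifier=username)
    return None  # reject the login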
```
📂 langflow-chainlit-sample
├── 📂 .chainlit/          # Chainlit configurations
├── 📂 public/             # Stores custom branding assets (e.g., logo)
├── .gitignore             # Git ignore rules
├── app.py                 # Chainlit main script
├── lf_python_api.py       # Langflow exported API script
├── init_db.py             # Script to initialize SQLite database
├── requirements.txt       # List of dependencies
├── start_backend.ps1      # Starts Langflow as backend
├── start_frontend.ps1     # Starts Chainlit as frontend
```
- `app.py`: Runs the Chainlit UI, handling user interactions and responses from the backend.
- `lf_python_api.py`: Python code exported from Langflow, defining the AI processing flow.
- `init_db.py`: Example script to initialize the SQLite database and create a user table (a minimal sketch follows this list).
- `start_backend.ps1`: Starts Langflow as the backend service.
- `start_frontend.ps1`: Launches Chainlit as the frontend UI.
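
For reference, a minimal sketch of what `init_db.py` could look like, assuming the same illustrative `users` table and `users.db` path used in the login-check sketch above (the schema and seed account are assumptions, not taken from the repository):

```python
import hashlib
import sqlite3

DB_PATH = "users.db"  # assumed database file; keep in sync with app.py


def init_db() -> None:
    """Create the users table and seed a demo account."""
    conn = sqlite3.connect(DB_PATH)
    conn.execute(
        """
        CREATE TABLE IF NOT EXISTS users (
            username TEXT PRIMARY KEY,
            password_hash TEXT NOT NULL
        )
        """
    )
    # Demo credentials only; use a proper password hasher (e.g. bcrypt) in production.
    conn.execute(
        "INSERT OR IGNORE INTO users VALUES (?, ?)",
        ("admin", hashlib.sha256("admin".encode()).hexdigest()),
    )
    conn.commit()
    conn.close()


if __name__ == "__main__":
    init_db()
```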
- Create a virtual environment and install dependencies:

```
python -m venv venv
source venv/bin/activate   # On macOS/Linux
venv\Scripts\activate      # On Windows
pip install -r requirements.txt
```
- Start Langflow to design your AI flow:

```
uv run langflow run
```

  - Modify your Langflow workspace as needed.
  - You can swap ChromaDB for another vector store or use different models.
  - Upload your docs via the Upload data flow.
  - Once satisfied, export the Python API code and replace `lf_python_api.py` (a hedged sketch of what that exported call can look like follows this step).
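
For orientation only, here is a minimal sketch of how code like `lf_python_api.py` typically calls a Langflow flow over its REST run endpoint. The base URL, payload fields, and especially the response-parsing path are assumptions and may differ from the script Langflow actually exports:

```python
import os

import requests

# Assumed Langflow defaults; adjust host/port and FLOW_ID to your setup.
BASE_URL = "http://localhost:7860"
FLOW_ID = os.getenv("FLOW_ID", "your-langflow-flow-id")


def run_flow(message: str) -> str:
    """Send a chat message to the Langflow run endpoint and return the reply text."""
    response = requests.post(
        f"{BASE_URL}/api/v1/run/{FLOW_ID}",
        json={"input_value": message, "input_type": "chat", "output_type": "chat"},
        timeout=120,
    )
    response.raise_for_status()
    data = response.json()
    # The exact response structure depends on your flow; inspect `data` and
    # extract the chat output accordingly. The path below is illustrative.
    return data["outputs"][0]["outputs"][0]["results"]["message"]["text"]
```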
- Set up environment variables: create a `.env` file with the following content:

```
LITERAL_API_KEY=your-literal-ai-key      # Available from your provisioned project at cloud.getliteral.ai
CHAINLIT_AUTH_SECRET=your-auth-secret
FLOW_ID=your-langflow-flow-id
```
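
These variables are read at startup; a minimal sketch of loading and checking them with `python-dotenv` (assuming that package is available, e.g. via `requirements.txt`):

```python
import os

from dotenv import load_dotenv

load_dotenv()  # read key=value pairs from .env into the process environment

LITERAL_API_KEY = os.getenv("LITERAL_API_KEY")
CHAINLIT_AUTH_SECRET = os.getenv("CHAINLIT_AUTH_SECRET")
FLOW_ID = os.getenv("FLOW_ID")

if not all([LITERAL_API_KEY, CHAINLIT_AUTH_SECRET, FLOW_ID]):
    raise RuntimeError("Missing one or more required variables in .env")
```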
- Run the backend service (Langflow):

```
./start_backend.ps1
```

- Run the frontend (Chainlit UI):

```
./start_frontend.ps1
```
- Access the chatbot UI: open your browser and go to `http://localhost:8000` (or the configured Chainlit port).
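
To illustrate how the frontend and backend connect, here is a minimal sketch of the message handler that `app.py` might contain, reusing the hypothetical `run_flow` helper from the Langflow sketch above (the import and function name are assumptions):

```python
import chainlit as cl

from lf_python_api import run_flow  # assumed helper; see the Langflow sketch above


@cl.on_message
async def on_message(message: cl.Message) -> None:
    """Forward the user's message to the Langflow flow and send back the answer."""
    # Note: run_flow is a blocking HTTP call; fine for a demo, wrap it if it gets slow.
    reply = run_flow(message.content)
    await cl.Message(content=reply).send()
```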
- Langflow for the low-code AI backend.
- Ollama for LLM inference models (`nomic-embed-text`, `llama3.2-1b`).
- Chainlit & Literal AI for the conversational UI.