LLM Chatbot

This project implements a custom knowledge chatbot integrated with a Telegram bot. You can easily add PDF or text documents to a dedicated folder within the repository. During application initialization, these documents will be processed and embedded into a vector database for efficient querying. Once set up, you can chat with the Telegram bot and receive responses generated by the LLM (Large Language Model) based on the custom knowledge you've provided.
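The ingest-then-query flow described above can be sketched with a toy in-memory index. This is only an illustration of the idea: the bag-of-words "embedding" stands in for the real embedding model, and the file names are made up to mirror the example knowledge files.

```python
# Toy sketch of the pipeline described above: documents are embedded into
# vectors at startup, and each question retrieves the closest document
# before the LLM composes an answer. The bag-of-words "embedding" below is
# a stand-in for the project's real embedding model, not its actual code.
import math
import re
from collections import Counter

def embed(text):
    # Count word occurrences as a crude stand-in for a dense embedding.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

# Illustrative documents standing in for files placed in knowledge_docs/.
docs = {
    "sir_aldric.txt": "The Adventures of Sir Aldric, a knight's tale.",
    "ikea_light.pdf": "IKEA light manual: assembly and safety notes.",
}
index = {name: embed(text) for name, text in docs.items()}  # the "vector database"

def retrieve(question):
    # Return the document most similar to the question.
    return max(index, key=lambda name: cosine(embed(question), index[name]))
```

In the real application, the retrieved text is supplied to the LLM as context so the reply is grounded in your documents rather than the model's general knowledge.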


Deployment Guide

1. Prerequisites:

Clone the repository

git clone https://github.com/KLxLee/LLM_chatbot

Change the directory

cd LLM_chatbot

Create the .env file for environment variables:

touch .env

2. Provide Your OpenAI API Key:

To keep server hardware requirements low, this project calls the OpenAI API instead of running a local model.

  1. Create an OpenAI account and obtain the API key. You can find the instructions here:
    How to Get OpenAI Access Token.
  2. Define your OpenAI API key in the .env file with the variable name OPENAI_API_KEY:
    echo 'OPENAI_API_KEY=<your_key>' >> .env

3. Expose a Public Port Using NGROK:

NGROK is a cross-platform tool that allows developers to expose a local development server to the internet easily.
For users who prefer not to set up a cloud server, NGROK can be used to expose a public port.

  1. Set up a free NGROK account.
  2. Define your NGROK authentication key in the .env file with the variable name NGROK_AUTH_TOKEN:
    echo 'NGROK_AUTH_TOKEN=<your_token>' >> .env
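The public URL that NGROK assigns is what gets registered with Telegram so incoming messages reach your local server. As a sketch, the registration call is built with Telegram's setWebhook Bot API method; the `/webhook` path and the example hostname below are assumptions for illustration, not taken from this project:

```python
# Build the Telegram setWebhook call that points the bot at the NGROK tunnel.
# setWebhook is Telegram's real Bot API method; the /webhook path and the
# example token/hostname below are illustrative assumptions.
def webhook_registration_url(bot_token, public_url):
    return f"https://api.telegram.org/bot{bot_token}/setWebhook?url={public_url}/webhook"

url = webhook_registration_url("123456:ABC", "https://example.ngrok.io")
```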

4. Integrate with Telegram Bot:

  1. Create a Telegram bot with BotFather and obtain the bot authentication token.
  2. Define your Telegram Bot API key in the .env file with the variable name TELEGRAM_API_KEY:
    echo 'TELEGRAM_API_KEY=<your_key>' >> .env
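With all three keys in place, a quick sanity check that .env defines everything the steps above require can look like this (the helper is illustrative and not part of the repository):

```python
# Check that .env defines the three variables from the steps above.
# This helper is an illustration, not part of the repository.
REQUIRED = ["OPENAI_API_KEY", "NGROK_AUTH_TOKEN", "TELEGRAM_API_KEY"]

def missing_env_vars(path=".env"):
    try:
        with open(path) as f:
            defined = {line.split("=", 1)[0].strip()
                       for line in f
                       if "=" in line and not line.lstrip().startswith("#")}
    except FileNotFoundError:
        return list(REQUIRED)
    return [name for name in REQUIRED if name not in defined]
```

An empty return value means all three variables are present and the application should be able to start.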

5. Add Custom Knowledge Documents:

  1. Under the LLM_chatbot/knowledge_docs directory, there are two folders: pdf_docs and txt_docs.
  2. Add your text and PDF documents to the corresponding folders.
  3. You can skip this step if you do not want to add any documents for custom knowledge.

6. Start the Application:

  1. Build and run the containers:
    docker compose up
    or, with the standalone Compose binary:
    docker-compose up
  2. It may take some time to build the containers and embed the data into the vector store.

7. Chat with the Custom Knowledge Bot via Telegram:

  1. After the application is running, you can start chatting with the Telegram bot and ask questions based on the custom knowledge provided.
  2. If you use the example knowledge files provided, you can ask questions like:
    • "Summary about The Adventures of Sir Aldric"
    • "Moral of the story The Tale of Prince John"
    • "Please tell me about the IKEA light manual"

Upcoming Features:

  1. Conversation History: Keep track of the conversation history with the bot.
  2. Custom System Prompt: Implement custom system prompts for better conversation control.

