This project implements a Question Answering system using Large Language Models (LLMs) on PDF documents. The system supports running via Tunnel or Ngrok.
- `RAG_LLM(Vicuna)_Chainlit.ipynb`: The main notebook for the project.
- `app.py`: The script to run the application using Ngrok (a minimal sketch of such a script is shown after this list).
- `README.md`: Documentation for the project.
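For orientation, a minimal Chainlit app of the kind `app.py` typically contains might look like the sketch below. The handler name and the echoed answer are illustrative assumptions, not the project's actual code; the real script presumably plugs the notebook's RAG pipeline (retriever + Vicuna) into this handler.

```python
# Hypothetical sketch of a minimal Chainlit app, NOT the project's actual app.py.
import chainlit as cl


@cl.on_message
async def handle_message(message: cl.Message):
    # Placeholder: the real app would retrieve relevant PDF chunks and query the LLM here.
    answer = f"(stub) You asked: {message.content}"
    await cl.Message(content=answer).send()
```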
To set up the environment, follow these steps:
- Clone the repository:
  `git clone https://github.com/NguyenHuy190303/LLM-PDF-QA.git`
- Navigate to the project directory:
  `cd LLM-PDF-QA`
- Install the required packages (all libraries listed in the first code block of `RAG_LLM(Vicuna)_Chainlit.ipynb`):
  `pip install <required-library>` (an illustrative install cell is sketched after this list)
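For illustration only, an install cell for this kind of RAG + Chainlit + Ngrok stack often looks like the sketch below. The package list is an assumption; the notebook's first code block is the authoritative list.

```python
# Illustrative Colab cell: the package names below are assumptions, not the
# project's pinned dependency list -- defer to the notebook's first code block.
!pip install -q transformers accelerate bitsandbytes langchain chromadb chainlit pyngrok pypdf
```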
Open Google Colab and upload both `RAG_LLM(Vicuna)_Chainlit.ipynb` and `app.py`, then start the application in one of two ways:
To run via Tunnel, execute the relevant Tunnel code blocks in the `RAG_LLM(Vicuna)_Chainlit.ipynb` notebook.
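As a rough sketch of what such a Tunnel cell tends to do (assuming localtunnel and port 8000, which may differ from the notebook's actual choices):

```python
# Assumed Colab cells: start Chainlit in the background, then expose the port
# through localtunnel. Tool, port, and log path are illustrative assumptions.
!chainlit run app.py --host 0.0.0.0 --port 8000 &> /content/chainlit_logs.txt &
!npx localtunnel --port 8000
```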
To run via Ngrok:
- Register or log in on Ngrok and copy your authtoken.
- Replace `your-ngrok-token` with that token in the relevant Ngrok code blocks.
- Run the relevant Ngrok code blocks in the `RAG_LLM(Vicuna)_Chainlit.ipynb` notebook to start the application via Ngrok (see the sketch below).
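Those Ngrok cells typically boil down to something like the following pyngrok sketch; the port and variable names are assumptions, and the notebook's own cells take precedence.

```python
# Assumed Colab cell: open an Ngrok tunnel to the locally running Chainlit app.
# Port 8000 and the use of pyngrok are illustrative assumptions.
from pyngrok import ngrok

ngrok.set_auth_token("your-ngrok-token")   # paste the token from your Ngrok dashboard
public_url = ngrok.connect(8000)           # port the Chainlit app is serving on
print("Chainlit app available at:", public_url)
```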