LLM-PDF-QA Project

Overview

This project implements a Question Answering system using Large Language Models (LLMs) on PDF documents. The system supports running via Tunnel or Ngrok.

Files in the Repository

  • RAG_LLM(Vicuna)_Chainlit.ipynb: The main notebook for the project.
  • app.py: The script to run the application using Ngrok.
  • README.md: Documentation for the project.

Installation

To set up the environment, follow these steps:

  1. Clone the repository:

    git clone https://github.com/NguyenHuy190303/LLM-PDF-QA.git
  2. Navigate to the project directory:

    cd LLM-PDF-QA
  3. Install the required packages, i.e. the libraries listed in the first code block of the RAG_LLM(Vicuna)_Chainlit.ipynb file:

    pip install <required-libraries>

Running the Applications

Open Google Colab and upload both RAG_LLM(Vicuna)_Chainlit.ipynb and app.py, then:

Using Tunnel

Run the relevant Tunnel code blocks in the RAG_LLM(Vicuna)_Chainlit.ipynb notebook to start the application via Tunnel.
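The Tunnel cells are not reproduced in this README; the following is a minimal sketch of what they typically do, assuming "Tunnel" refers to localtunnel (started with npx) and that Chainlit serves app.py on port 8000 — both assumptions, not confirmed by the notebook. The commands are built as strings so the example runs outside Colab:

```python
# Sketch of the shell commands a Tunnel cell would run (prefixed with "!" in
# a Colab notebook). "localtunnel" and port 8000 are assumptions.
def tunnel_commands(port: int = 8000) -> list:
    """Return the commands a Colab cell would execute, as strings."""
    return [
        f"chainlit run app.py --port {port} &",  # start the Chainlit app in the background
        f"npx localtunnel --port {port}",        # print a public URL for the running app
    ]

for cmd in tunnel_commands():
    print(cmd)
```

In Colab, each command would be run in its own cell with a leading `!`.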

Using Ngrok

  1. Register or log in on Ngrok and obtain your authtoken.
  2. Replace your-ngrok-token with that token in the relevant Ngrok code blocks.

Run the relevant Ngrok code blocks in the RAG_LLM(Vicuna)_Chainlit.ipynb notebook to start the application via Ngrok.
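An Ngrok cell in a Colab notebook typically uses the pyngrok package. The sketch below assumes pyngrok is installed and that Chainlit serves on port 8000 — neither is stated in this README. The cell body is returned as a string so the example stays runnable without a live ngrok account:

```python
# Minimal sketch of an Ngrok cell (assumptions: pyngrok package, port 8000).
def ngrok_cell(token: str, port: int = 8000) -> str:
    """Build the source of a Colab cell that opens an ngrok tunnel."""
    return (
        "from pyngrok import ngrok\n"
        f"ngrok.set_auth_token({token!r})\n"      # your token from the ngrok dashboard
        f"public_url = ngrok.connect({port})\n"   # open the tunnel to the local port
        "print(public_url)\n"                     # the public URL to share
    )

print(ngrok_cell("your-ngrok-token"))
```

Running the generated code in Colab (with a real token) prints the public URL at which the Chainlit app is reachable.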
