
Pipelines: UI-Agnostic OpenAI API Plugin Framework

Welcome to Pipelines, an Open WebUI initiative. Pipelines bring modular, customizable workflows to any UI client supporting OpenAI API specs – and much more! Easily extend functionalities, integrate unique logic, and create dynamic workflows with just a few lines of code.

Preparation

  1. Start the Pipelines container:
git clone https://github.com/CYH4157/cyh_pipelines.git
cd cyh_pipelines
docker build -t cyh-pipelines:latest .
docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -v $(pwd):/app --name pipelines --network hydra_llm_network --restart always cyh-pipelines

⚠️ Warning: run the docker run command from your_path/cyh_pipelines (the cloned repository), since the current directory is bind-mounted into the container.


  2. Start the Qdrant container via the hydra docker-compose setup, then place your Qdrant embedding data in its storage directory:
cd
mkdir -p storage_qdrant/collections
mv your_embedding_data ./storage_qdrant/collections/your_embedding_data
  3. Download the models with Ollama:
docker exec -it ollama ollama pull llama3.1:8b-instruct-fp16
docker exec -it ollama ollama pull chatfire/bge-m3:q8_0

⚠️ Warning: make sure the version of your Ollama container supports these models.


  4. Set up the Open WebUI admin panel: open Open WebUI at http://your_open_webui_ip:3000/ and connect it to the Pipelines container, using the default key below.
# default API key
0p3n-w3bu!
  5. Upload your pipeline code or a Git URL.
  6. Find your models in the model list. You can also verify the endpoint directly with the snippet below.
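Once these containers are up, a quick way to confirm the Pipelines server is reachable is to list its models over the OpenAI-compatible API. This is only a sanity-check sketch in Python: it assumes the default port 9099, the default 0p3n-w3bu! key, and that the server exposes the standard /v1/models route.

    import requests

    # Default Pipelines endpoint and key from the steps above; adjust if you changed them.
    PIPELINES_URL = "http://localhost:9099"
    API_KEY = "0p3n-w3bu!"

    resp = requests.get(
        f"{PIPELINES_URL}/v1/models",
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()

    # Each loaded pipeline is reported as a model entry.
    for model in resp.json().get("data", []):
        print(model.get("id"))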

🚀 Why Choose Pipelines?

  • Limitless Possibilities: Easily add custom logic and integrate Python libraries, from AI agents to home automation APIs.
  • Seamless Integration: Compatible with any UI/client supporting OpenAI API specs. (Only pipe-type pipelines are supported; filter types require clients with Pipelines support.)
  • Custom Hooks: Build and integrate custom pipelines.

Examples of What You Can Achieve:

🔧 How It Works

[Diagram: Pipelines workflow]

Integrating Pipelines with any OpenAI API-compatible UI client is simple. Launch your Pipelines instance and set the OpenAI URL on your client to the Pipelines URL. That's it! You're ready to leverage any Python library for your needs.
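Because the server speaks the OpenAI API, any OpenAI client works once you override the base URL. Below is a minimal sketch with the official openai Python package, assuming the default port and key; the model id "my_pipeline" is a placeholder for whatever pipeline your server has loaded:

    from openai import OpenAI

    # Point the standard OpenAI client at the Pipelines server instead of api.openai.com.
    client = OpenAI(
        base_url="http://localhost:9099/v1",  # your Pipelines URL
        api_key="0p3n-w3bu!",                 # default Pipelines key
    )

    # "my_pipeline" is illustrative: use the id of a pipeline loaded on your server.
    response = client.chat.completions.create(
        model="my_pipeline",
        messages=[{"role": "user", "content": "Hello from an OpenAI-compatible client!"}],
    )
    print(response.choices[0].message.content)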

⚡ Quick Start with Docker

Warning

Pipelines are a plugin system with arbitrary code execution — don't fetch random pipelines from sources you don't trust.

For a streamlined setup using Docker:

  1. Run the Pipelines container:

    docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -v pipelines:/app/pipelines --name pipelines --restart always ghcr.io/open-webui/pipelines:main
  2. Connect to Open WebUI:

    • Navigate to the Settings > Connections > OpenAI API section in Open WebUI.
    • Set the API URL to http://localhost:9099 and the API key to 0p3n-w3bu!. Your pipelines should now be active.

Note

If your Open WebUI is running in a Docker container, replace localhost with host.docker.internal in the API URL.

  3. Manage Configurations:

    • In the admin panel, go to Admin Settings > Pipelines tab.
    • Select your desired pipeline and modify the valve values directly from the WebUI.
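For context, the valve values you edit in that tab map to fields the pipeline itself declares. In the bundled examples they are defined as a pydantic model nested inside the pipeline class; the sketch below is illustrative only, with made-up field names:

    from pydantic import BaseModel

    class Pipeline:
        class Valves(BaseModel):
            # Example fields only; each pipeline declares whatever settings it needs.
            target_language: str = "en"
            max_results: int = 5

        def __init__(self):
            self.name = "Example Valve Pipeline"
            # Values changed in Admin Settings > Pipelines override these defaults.
            self.valves = self.Valves()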

Tip

If you are unable to connect, it is most likely a Docker networking issue. We encourage you to troubleshoot on your own and share your methods and solutions in the discussions forum.

If you need to install a custom pipeline with additional dependencies:

  • Run the following command:

    docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -e PIPELINES_URLS="https://github.com/open-webui/pipelines/blob/main/examples/filters/detoxify_filter_pipeline.py" -v pipelines:/app/pipelines --name pipelines --restart always ghcr.io/open-webui/pipelines:main

Alternatively, you can directly install pipelines from the admin settings by copying and pasting the pipeline URL, provided it doesn't have additional dependencies.

That's it! You're now ready to build customizable AI integrations effortlessly with Pipelines. Enjoy!

📦 Installation and Setup

Get started with Pipelines in a few easy steps:

  1. Ensure Python 3.11 is installed.

  2. Clone the Pipelines repository:

    git clone https://github.com/open-webui/pipelines.git
    cd pipelines
  3. Install the required dependencies:

    pip install -r requirements.txt
  4. Start the Pipelines server:

    sh ./start.sh

Once the server is running, set the OpenAI URL on your client to the Pipelines URL. This unlocks the full capabilities of Pipelines, integrating any Python library and creating custom workflows tailored to your needs.

📂 Directory Structure and Examples

The /pipelines directory is the core of your setup. Add new modules, customize existing ones, and manage your workflows here. All the pipelines in the /pipelines directory will be automatically loaded when the server launches.

You can change this directory from /pipelines to another location using the PIPELINES_DIR env variable.

Integration Examples

Find various integration examples in the /examples directory. These examples show how to integrate different functionalities, providing a foundation for building your own custom pipelines.
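As a starting point, a pipe-type pipeline dropped into the /pipelines directory generally follows the shape used by those examples. The sketch below is illustrative rather than a definitive template; consult the files in /examples for the exact interface:

    from typing import Generator, Iterator, List, Union

    class Pipeline:
        def __init__(self):
            # Name shown to clients; the file is auto-loaded from the pipelines directory at startup.
            self.name = "Echo Example Pipeline"

        async def on_startup(self):
            # Called when the server starts; open connections or load resources here.
            pass

        async def on_shutdown(self):
            # Called when the server stops; release resources here.
            pass

        def pipe(
            self, user_message: str, model_id: str, messages: List[dict], body: dict
        ) -> Union[str, Generator, Iterator]:
            # Core logic: receive the chat request and return a string or a stream of chunks.
            return f"You said: {user_message}"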

🎉 Work in Progress

We’re continuously evolving! We'd love to hear your feedback and understand which hooks and features would best suit your use case. Feel free to reach out and become a part of our Open WebUI community!

Our vision is to push Pipelines to become the ultimate plugin framework for our AI interface, Open WebUI. Imagine Open WebUI as the WordPress of AI interfaces, with Pipelines being its diverse range of plugins. Join us on this exciting journey! 🌍
