
Test and deploy your LLM prompts in a data-driven way on an open-source, self-hostable platform


OpenPipe

OpenPipe is a flexible playground for comparing and optimizing LLM prompts. It lets you quickly generate, test, and compare candidate prompts against realistic sample data.

Sample Experiments

These are simple experiments users have created that show how OpenPipe works. Feel free to fork them and start experimenting yourself.


You can use our hosted version of OpenPipe at https://openpipe.ai, or clone this repository and run it locally.

High-Level Features

Visualize Responses
Inspect prompt completions side-by-side.


Test Many Inputs
OpenPipe lets you template a prompt. Use the templating feature to run the prompts you're testing against many potential inputs for broad coverage of your problem space.
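To illustrate the idea, here is a minimal sketch of prompt templating. The `{{variable}}` syntax and the `fillTemplate` helper are illustrative assumptions, not OpenPipe's actual API: the point is that one template fans out across many scenario inputs.

```typescript
// Hypothetical sketch: substitute scenario variables into a prompt template.
// The {{variable}} syntax here is assumed for illustration, not OpenPipe's API.
type Scenario = Record<string, string>;

function fillTemplate(template: string, scenario: Scenario): string {
  // Replace each {{key}} with the matching scenario value (empty if missing).
  return template.replace(/\{\{(\w+)\}\}/g, (_match, key) => scenario[key] ?? "");
}

const template = "Summarize this review in one sentence: {{review}}";
const scenarios: Scenario[] = [
  { review: "The battery lasts all day." },
  { review: "The screen cracked within a week." },
];

// One template, many inputs: each scenario yields a concrete prompt to test.
const prompts = scenarios.map((s) => fillTemplate(template, s));
```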


Translate between Model APIs
Write your prompt in one format and automatically convert it to work with any other model.
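As a rough sketch of what such a translation involves, the function below maps an OpenAI-style chat message list into an Anthropic-style request shape. The interfaces are simplified assumptions for illustration; they are not OpenPipe's internal types.

```typescript
// Hypothetical, simplified message shapes (not OpenPipe's internal types).
interface OpenAIMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface AnthropicRequest {
  system?: string; // Anthropic takes the system prompt as a separate field
  messages: { role: "user" | "assistant"; content: string }[];
}

function toAnthropic(messages: OpenAIMessage[]): AnthropicRequest {
  // Collect system messages into the dedicated `system` field.
  const system =
    messages
      .filter((m) => m.role === "system")
      .map((m) => m.content)
      .join("\n") || undefined;
  // Pass the remaining user/assistant turns through unchanged.
  const rest = messages
    .filter((m) => m.role !== "system")
    .map((m) => ({ role: m.role as "user" | "assistant", content: m.content }));
  return { system, messages: rest };
}

const converted = toAnthropic([
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Hello!" },
]);
```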




Refine your prompts automatically
Use a growing database of best-practice refinements to improve your prompts automatically.




🪄 Auto-generate Test Scenarios
OpenPipe includes a tool to generate new test scenarios based on your existing prompts and scenarios. Just click "Autogenerate Scenario" to try it out!




Supported Models

Running Locally

  1. Install PostgreSQL.
  2. Install Node.js 20 (earlier versions will very likely work but aren't tested).
  3. Install pnpm: npm i -g pnpm
  4. Clone this repository: git clone https://github.com/openpipe/openpipe
  5. Install the dependencies: cd openpipe && pnpm install
  6. Create a .env file (cp .env.example .env) and enter your OPENAI_API_KEY.
  7. Update DATABASE_URL if necessary to point to your Postgres instance and run pnpm prisma db push to create the database.
  8. Create a GitHub OAuth App and update the GITHUB_CLIENT_ID and GITHUB_CLIENT_SECRET values. (Note: a PR to make auth optional when running locally would be a great contribution!)
  9. Start the app: pnpm dev.
  10. Navigate to http://localhost:3000
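As a sketch, the `.env` values from steps 6–8 might look like the following. All values are placeholders for illustration; use your own keys and connection string.

```
OPENAI_API_KEY=<your OpenAI API key>
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/openpipe
GITHUB_CLIENT_ID=<your GitHub OAuth app client ID>
GITHUB_CLIENT_SECRET=<your GitHub OAuth app client secret>
```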
