
llm-http-api

HTTP API for LLM with OpenAI compatibility

Usage

> llm http-api --help
Usage: llm http-api [OPTIONS]

  Run a FastAPI HTTP server with OpenAI compatibility

Options:
  -h, --host TEXT       [default: 0.0.0.0]
  -p, --port INTEGER    [default: 8080]
  -l, --log-level TEXT  [default: info]
  --help                Show this message and exit.
> curl http://localhost:8080/v1/embeddings -X POST -H "Content-Type: application/json" -d '{
  "input": "Hello world",
  "model": "jina-embeddings-v2-small-en"
}'
{"object":"embedding","embedding":[-0.47561466693878174,-0.4471365511417389,...],"index":0}
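The same request can be made from Python using only the standard library. This is an illustrative sketch, not part of the project: the base URL assumes the default host and port shown in the help output above, and the model name is the one from the curl example.

```python
import json
import urllib.request

def build_embeddings_request(text, model, base_url="http://localhost:8080"):
    """Return (url, headers, body) for a POST to /v1/embeddings,
    mirroring the curl example above."""
    url = f"{base_url}/v1/embeddings"
    headers = {"Content-Type": "application/json"}
    body = json.dumps({"input": text, "model": model}).encode("utf-8")
    return url, headers, body

def fetch_embedding(text, model):
    """POST the request to a running `llm http-api` server and
    return the parsed JSON response."""
    url, headers, body = build_embeddings_request(text, model)
    req = urllib.request.Request(url, data=body, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because the server follows the OpenAI wire format, official OpenAI client libraries should also work when pointed at the local base URL.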

OpenAI Endpoints

Embeddings

Unimplemented

A detailed list of unimplemented OpenAI endpoints can be found here.

Development

This repository manages the development environment as a Nix flake and requires Nix to be installed.

nix develop -c $SHELL
make deps.install
make deps.install/test
make test
make coverage
make lint
make format
