Commit 84039d2: add flags to server and client setup

MikeBirdTech committed Mar 13, 2024
1 parent 01bc823 commit 84039d2
Showing 2 changed files with 88 additions and 2 deletions.
9 changes: 9 additions & 0 deletions docs/client/setup.mdx
@@ -36,3 +36,12 @@ poetry run 01
```bash
poetry run 01 --client
```

### Flags

- `--client`
  Run the client.

- `--client-type TEXT`
Specify the client type.
Default: `auto`.
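
For example, to start the client with an explicit client type (illustrative; `auto` is already the default):

```bash
poetry run 01 --client --client-type auto
```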
81 changes: 79 additions & 2 deletions docs/server/setup.mdx
@@ -3,9 +3,86 @@ title: "Setup"
description: "Get your 01 server up and running"
---

Setup: just run `start.py --server`. The flags are revealed via `start.py --help` and explained below.

- Interpreter
- Open Interpreter: `i.py` is where you configure your interpreter and covers the basic settings of Open Interpreter (that file simply modifies an interpreter from OI).
- Language Model: set the LLM via `interpreter.model` in `i.py`, or from the command line, e.g. `start.py --server --llm-service llamafile`.
- Voice Interface: pass `--tts-service` and `--stt-service` to swap in different services, which live in `/Services/Speech-to-text` and `/Services/Text-to-speech`.
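
For example, the language model and voice services can all be selected at launch (an illustrative invocation using flags documented under Flags below):

```bash
poetry run 01 --server --llm-service llamafile --tts-service openai --stt-service openai
```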

## Run Server

```bash
poetry run 01 --server
```
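
Flags can be combined. For example, to run the server and expose it to the internet through the default tunnel service (illustrative):

```bash
poetry run 01 --server --expose --tunnel-service ngrok
```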

## Flags

- `--server`
  Run the server.

- `--server-host TEXT`
  Specify the host address the server binds to.
  Default: `0.0.0.0`.

- `--server-port INTEGER`
  Specify the port the server listens on.
  Default: `8000`.

- `--tunnel-service TEXT`
Specify the tunnel service.
Default: `ngrok`.

- `--expose`
  Expose the server to the internet.

- `--server-url TEXT`
  Specify the server URL that the client should expect. If unset, it is derived from `--server-host` and `--server-port`.
  Default: `None`.

- `--llm-service TEXT`
Specify the LLM service.
Default: `litellm`.

- `--model TEXT`
Specify the model.
Default: `gpt-4`.

- `--llm-supports-vision`
Specify if the LLM service supports vision.

- `--llm-supports-functions`
Specify if the LLM service supports functions.

- `--context-window INTEGER`
Specify the context window size.
Default: `2048`.

- `--max-tokens INTEGER`
Specify the maximum number of tokens.
Default: `4096`.

- `--temperature FLOAT`
Specify the temperature for generation.
Default: `0.8`.

- `--tts-service TEXT`
Specify the TTS service.
Default: `openai`.

- `--stt-service TEXT`
Specify the STT service.
Default: `openai`.

- `--local`
Use recommended local services for LLM, STT, and TTS.

- `--install-completion [bash|zsh|fish|powershell|pwsh]`
Install completion for the specified shell.
Default: `None`.

- `--show-completion [bash|zsh|fish|powershell|pwsh]`
Show completion for the specified shell, to copy it or customize the installation.
Default: `None`.

- `--help`
Show this message and exit.
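
Conceptually, these flags behave like a thin argument-parsing layer over the defaults listed above. A hypothetical sketch of that idea (this is not the project's actual implementation, which may use a different CLI library):

```python
import argparse

# Mirror a subset of the documented flags and their defaults.
parser = argparse.ArgumentParser(prog="01")
parser.add_argument("--server", action="store_true", help="Run the server")
parser.add_argument("--server-host", default="0.0.0.0")
parser.add_argument("--server-port", type=int, default=8000)
parser.add_argument("--llm-service", default="litellm")
parser.add_argument("--model", default="gpt-4")

# Flags left unspecified fall back to their documented defaults.
args = parser.parse_args(["--server", "--llm-service", "llamafile"])
print(args.llm_service)   # llamafile
print(args.server_port)   # 8000
```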
