
Commit 80d4e6c

Author: Luke Hinds
Commit message: Add Docs
Parent: 2d4c72a

File tree

2 files changed: +132 −3 lines changed

docs/configuration.md

Lines changed: 45 additions & 0 deletions
@@ -16,6 +16,10 @@ The configuration system in Codegate is managed through the `Config` class in `c
 - Log Level: "INFO"
 - Log Format: "JSON"
 - Prompts: Default prompts from prompts/default.yaml
+- Provider URLs:
+  - vLLM: "http://localhost:8000"
+  - OpenAI: "https://api.openai.com/v1"
+  - Anthropic: "https://api.anthropic.com/v1"

 ## Configuration Methods

@@ -27,6 +31,18 @@ Load configuration from a YAML file:
 config = Config.from_file("config.yaml")
 ```

+Example config.yaml:
+```yaml
+port: 8989
+host: localhost
+log_level: INFO
+log_format: JSON
+provider_urls:
+  vllm: "https://vllm.example.com"
+  openai: "https://api.openai.com/v1"
+  anthropic: "https://api.anthropic.com/v1"
+```
+
 ### From Environment Variables

 Environment variables are automatically loaded with these mappings:
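The config.yaml example above overrides only some of the documented defaults. A minimal stdlib-only sketch of how such a file's parsed contents might merge over those defaults — the `Config` class here is hypothetical and simplified, not the real implementation in codegate:

```python
from dataclasses import dataclass, field

# Defaults mirror the documented ones for the three providers.
DEFAULT_PROVIDER_URLS = {
    "vllm": "http://localhost:8000",
    "openai": "https://api.openai.com/v1",
    "anthropic": "https://api.anthropic.com/v1",
}

@dataclass
class Config:
    port: int = 8989
    host: str = "localhost"
    log_level: str = "INFO"
    log_format: str = "JSON"
    provider_urls: dict = field(default_factory=lambda: dict(DEFAULT_PROVIDER_URLS))

    @classmethod
    def from_dict(cls, data: dict) -> "Config":
        # provider_urls is merged over the defaults, so a file can override a
        # single provider URL without restating the other two.
        cfg = cls()
        for key in ("port", "host", "log_level", "log_format"):
            if key in data:
                setattr(cfg, key, data[key])
        cfg.provider_urls.update(data.get("provider_urls", {}))
        return cfg

# Equivalent to the parsed form of the config.yaml example:
cfg = Config.from_dict({"port": 8989, "provider_urls": {"vllm": "https://vllm.example.com"}})
print(cfg.provider_urls["vllm"])    # overridden by the file
print(cfg.provider_urls["openai"])  # still the documented default
```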
@@ -36,13 +52,42 @@ Environment variables are automatically loaded with these mappings:
 - `CODEGATE_APP_LOG_LEVEL`: Logging level
 - `CODEGATE_LOG_FORMAT`: Log format
 - `CODEGATE_PROMPTS_FILE`: Path to prompts YAML file
+- `CODEGATE_PROVIDER_VLLM_URL`: vLLM provider URL
+- `CODEGATE_PROVIDER_OPENAI_URL`: OpenAI provider URL
+- `CODEGATE_PROVIDER_ANTHROPIC_URL`: Anthropic provider URL

 ```python
 config = Config.from_env()
 ```

 ## Configuration Options

+### Provider URLs
+
+Provider URLs can be configured in several ways:
+
+1. In Configuration File:
+```yaml
+provider_urls:
+  vllm: "https://vllm.example.com"  # /v1 path is added automatically
+  openai: "https://api.openai.com/v1"
+  anthropic: "https://api.anthropic.com/v1"
+```
+
+2. Via Environment Variables:
+```bash
+export CODEGATE_PROVIDER_VLLM_URL=https://vllm.example.com
+export CODEGATE_PROVIDER_OPENAI_URL=https://api.openai.com/v1
+export CODEGATE_PROVIDER_ANTHROPIC_URL=https://api.anthropic.com/v1
+```
+
+3. Via CLI Flags:
+```bash
+codegate serve --vllm-url https://vllm.example.com
+```
+
+Note: For the vLLM provider, the /v1 path is automatically appended to the base URL if not present.
+
 ### Log Levels

 Available log levels (case-insensitive):
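The documented environment-variable mappings and the vLLM /v1 note can be sketched together. The helper names below are hypothetical illustrations of the described behavior, not the actual codegate code:

```python
import os

# Maps the documented environment variables to provider keys.
ENV_PROVIDER_VARS = {
    "CODEGATE_PROVIDER_VLLM_URL": "vllm",
    "CODEGATE_PROVIDER_OPENAI_URL": "openai",
    "CODEGATE_PROVIDER_ANTHROPIC_URL": "anthropic",
}

def normalize_vllm_url(base_url: str) -> str:
    """Append /v1 to a vLLM base URL if it is not already present."""
    base_url = base_url.rstrip("/")
    return base_url if base_url.endswith("/v1") else base_url + "/v1"

def provider_urls_from_env(environ=os.environ) -> dict:
    """Collect provider URLs set in the environment; vLLM URLs are normalized."""
    urls = {}
    for var, provider in ENV_PROVIDER_VARS.items():
        if var in environ:
            url = environ[var]
            if provider == "vllm":
                url = normalize_vllm_url(url)
            urls[provider] = url
    return urls

print(normalize_vllm_url("https://vllm.example.com"))  # https://vllm.example.com/v1
print(provider_urls_from_env({"CODEGATE_PROVIDER_OPENAI_URL": "https://api.openai.com/v1"}))
```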

docs/development.md

Lines changed: 87 additions & 3 deletions
@@ -9,6 +9,7 @@ Codegate is a configurable Generative AI gateway designed to protect developers
 - Secure coding recommendations
 - Prevention of AI recommending deprecated/malicious libraries
 - Modular system prompts configuration
+- Multiple AI provider support with configurable endpoints

 ## Development Setup

@@ -53,7 +54,11 @@ codegate/
 │   ├── logging.py          # Logging setup
 │   ├── prompts.py          # Prompts management
 │   ├── server.py           # Main server implementation
-│   └── providers/*         # External service providers (anthropic, openai, etc.)
+│   └── providers/          # External service providers
+│       ├── anthropic/      # Anthropic provider implementation
+│       ├── openai/         # OpenAI provider implementation
+│       ├── vllm/           # vLLM provider implementation
+│       └── base.py         # Base provider interface
 ├── tests/                  # Test files
 └── docs/                   # Documentation
 ```
@@ -128,9 +133,87 @@ Codegate uses a hierarchical configuration system with the following priority (h
 - Log Level: Logging level (ERROR|WARNING|INFO|DEBUG)
 - Log Format: Log format (JSON|TEXT)
 - Prompts: System prompts configuration
+- Provider URLs: AI provider endpoint configuration

 See [Configuration Documentation](configuration.md) for detailed information.

+## Working with Providers
+
+Codegate supports multiple AI providers through a modular provider system.
+
+### Available Providers
+
+1. **vLLM Provider**
+   - Default URL: http://localhost:8000
+   - Supports OpenAI-compatible API
+   - Automatically adds /v1 path to base URL
+   - Model names are prefixed with "hosted_vllm/"
+
+2. **OpenAI Provider**
+   - Default URL: https://api.openai.com/v1
+   - Standard OpenAI API implementation
+
+3. **Anthropic Provider**
+   - Default URL: https://api.anthropic.com/v1
+   - Anthropic Claude API implementation
+
+### Configuring Providers
+
+Provider URLs can be configured through:
+
+1. Config file (config.yaml):
+```yaml
+provider_urls:
+  vllm: "https://vllm.example.com"
+  openai: "https://api.openai.com/v1"
+  anthropic: "https://api.anthropic.com/v1"
+```
+
+2. Environment variables:
+```bash
+export CODEGATE_PROVIDER_VLLM_URL=https://vllm.example.com
+export CODEGATE_PROVIDER_OPENAI_URL=https://api.openai.com/v1
+export CODEGATE_PROVIDER_ANTHROPIC_URL=https://api.anthropic.com/v1
+```
+
+3. CLI flags:
+```bash
+codegate serve --vllm-url https://vllm.example.com
+```
+
+### Implementing New Providers
+
+To add a new provider:
+
+1. Create a new directory in `src/codegate/providers/`
+2. Implement required components:
+   - `provider.py`: Main provider class extending BaseProvider
+   - `adapter.py`: Input/output normalizers
+   - `__init__.py`: Export provider class
+
+Example structure:
+```python
+from codegate.providers.base import BaseProvider
+
+class NewProvider(BaseProvider):
+    def __init__(self, ...):
+        super().__init__(
+            InputNormalizer(),
+            OutputNormalizer(),
+            completion_handler,
+            pipeline_processor,
+            fim_pipeline_processor
+        )
+
+    @property
+    def provider_route_name(self) -> str:
+        return "provider_name"
+
+    def _setup_routes(self):
+        # Implement route setup
+        pass
+```
+
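The `NewProvider` skeleton in the diff depends on codegate internals (BaseProvider, the normalizers, the handlers). As a self-contained illustration of the same pattern, here is a stdlib-only sketch of a base-provider interface with one concrete subclass; all names and the example route are hypothetical, not the real codegate API:

```python
from abc import ABC, abstractmethod

class BaseProvider(ABC):
    """Illustrative base class: subclasses declare a route name and set up routes."""

    def __init__(self):
        self.routes = []
        self._setup_routes()

    @property
    @abstractmethod
    def provider_route_name(self) -> str: ...

    @abstractmethod
    def _setup_routes(self) -> None: ...

class VLLMProvider(BaseProvider):
    @property
    def provider_route_name(self) -> str:
        return "vllm"

    def _setup_routes(self) -> None:
        # A plausible provider-scoped route; the real server's routes may differ.
        self.routes.append(f"/{self.provider_route_name}/chat/completions")

    @staticmethod
    def prefix_model(model: str) -> str:
        # The docs note that vLLM model names are prefixed with "hosted_vllm/".
        return f"hosted_vllm/{model}"

p = VLLMProvider()
print(p.routes)                                  # ['/vllm/chat/completions']
print(VLLMProvider.prefix_model("mistral-7b"))   # hosted_vllm/mistral-7b
```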
 ## Working with Prompts

 ### Default Prompts
@@ -188,8 +271,9 @@ codegate serve --port 8989 --host localhost --log-level DEBUG

 # Start with custom prompts
 codegate serve --prompts my-prompts.yaml
+
+# Start with custom provider URL
+codegate serve --vllm-url https://vllm.example.com
 ```

 See [CLI Documentation](cli.md) for detailed command information.
-
-[Rest of development.md content remains unchanged...]
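development.md describes a hierarchical configuration system with a defined priority order. Assuming the usual ordering (CLI flag over environment variable over config file over built-in default, which the CLI examples above suggest), resolution of a single setting such as the vLLM URL could be sketched as:

```python
def resolve_vllm_url(cli_value=None, env_value=None, file_value=None,
                     default="http://localhost:8000"):
    """Return the first value present, in assumed priority order:
    CLI flag, then environment, then config file, then the documented default."""
    for value in (cli_value, env_value, file_value):
        if value is not None:
            return value
    return default

print(resolve_vllm_url(cli_value="https://vllm.example.com",
                       env_value="http://other:8000"))  # https://vllm.example.com
print(resolve_vllm_url())                               # http://localhost:8000
```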
