docs/docs/deep-dive/language_model_clients/remote_models/PremAI.mdx
## PremAI

[PremAI](https://app.premai.io) is a unified platform for building production-ready GenAI applications with minimal effort, so that you can focus on user experience and growth. With DSPy, you can connect to several [best-in-class LLMs](https://models.premai.io/) of your choice through a single interface.

### Prerequisites

Refer to the [quick start](https://docs.premai.io/introduction) guide to get started with the PremAI platform, create your first project, and grab your API key.
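Once you have an API key, a common pattern (an assumption here, not something the PremAI docs prescribe, and the variable name below is hypothetical) is to keep it in an environment variable and read it before constructing the client. A minimal sketch:

```python
import os

# Hypothetical helper: read the PremAI API key from an environment
# variable instead of hard-coding it. The name PREMAI_API_KEY is an
# assumption; use whichever variable you actually set.
def get_premai_api_key(env_var: str = "PREMAI_API_KEY") -> str:
    key = os.environ.get(env_var)
    if key is None:
        raise RuntimeError(f"Set the {env_var} environment variable first.")
    return key
```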

### Setting up the PremAI Client

The constructor initializes the base class `LM` to support prompting requests to supported PremAI-hosted models. It requires the following parameters:

- `model` (_str_): A model supported by PremAI, for example `mistral-tiny`. We recommend using the model selected in the [project launchpad](https://docs.premai.io/get-started/launchpad).
- `project_id` (_int_): The [project id](https://docs.premai.io/get-started/projects) that contains the model of choice.
- `api_key` (_Optional[str]_, _optional_): The PremAI API key. Defaults to None.
- `session_id` (_Optional[int]_, _optional_): The ID of the session to use, which helps track the chat history.
- `**kwargs`: Additional language model arguments, passed through to the API provider.

Example of the PremAI constructor:

```python
class PremAI(LM):
    def __init__(
        self,
        model: str,
        project_id: int,
        api_key: str,
        base_url: Optional[str] = None,
        session_id: Optional[int] = None,
        **kwargs,
    ) -> None:
```

### Under the Hood

#### `__call__(self, prompt: str, **kwargs) -> str`

**Parameters:**
- `prompt` (_str_): The prompt to send to PremAI.
- `**kwargs`: Additional keyword arguments for the completion request.

**Returns:**
- `str`: The completion string from the chosen LLM provider.

Internally, the method handles the specifics of preparing the request prompt and the corresponding payload to obtain the response.
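Conceptually, that preparation can be sketched as follows. The helper below is illustrative only, not DSPy's actual implementation: it wraps a raw prompt into chat-style messages and merges call-time keyword arguments into the request payload.

```python
from typing import Any, Dict, List

# Illustrative sketch (an assumption, not DSPy's code): build a
# chat-completion style payload from a prompt plus extra kwargs.
def build_payload(prompt: str, project_id: int, **kwargs: Any) -> Dict[str, Any]:
    messages: List[Dict[str, str]] = [{"role": "user", "content": prompt}]
    payload: Dict[str, Any] = {"project_id": project_id, "messages": messages}
    payload.update(kwargs)  # e.g. temperature, max_tokens
    return payload
```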

### Using the PremAI client

```python
premai_client = dspy.PremAI(project_id=1111)
```

Please note that this is a dummy `project_id`; replace it with the `project_id` of the project you want to use with DSPy.
1) Generate responses through DSPy by configuring the client as the default LM:

```python
dspy.configure(lm=premai_client)

# Example DSPy CoT QA program
qa = dspy.ChainOfThought('question -> answer')

response = qa(question="What is the capital of France?")
print(response.answer)
```

2) Generate responses using the client directly:

```python
response = premai_client(prompt="What is the capital of France?")
print(response)
```