Add PromptLayer OpenAI wrapper integration (langchain-ai#61)
* Add PromptLayer OpenAI wrapper integration

* Import node-fetch to fix potential crash if not included
dqbd authored Feb 20, 2023
1 parent 6d23799 commit b1c55da
Showing 4 changed files with 87 additions and 1 deletion.
18 changes: 18 additions & 0 deletions docs/docs/modules/llms/openai.md
@@ -11,3 +11,21 @@ const res = await model.call(
);
console.log({ res });
```

## PromptLayer OpenAI

This library supports PromptLayer for logging and debugging prompts and responses. To enable PromptLayer:

1. Create a PromptLayer account at [https://promptlayer.com](https://promptlayer.com).
2. Create an API token and pass it either as the `promptLayerApiKey` argument to the `PromptLayerOpenAI` constructor or set it in the `PROMPTLAYER_API_KEY` environment variable.

```typescript
const model = new PromptLayerOpenAI({ temperature: 0.9 });
const res = await model.call(
  "What would be a good company name for a company that makes colorful socks?"
);
```
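
Because the wrapper also exposes the key and optional tags as constructor fields (see `promptLayerApiKey` and `plTags` in the class added below), they can be supplied explicitly instead of through the environment. A minimal sketch, with a placeholder key value:

```typescript
const model = new PromptLayerOpenAI({
  temperature: 0.9,
  // Placeholder key for illustration; in practice read it from a secret store.
  promptLayerApiKey: "pl_...",
  // Optional tags forwarded to PromptLayer's `tags` field on each tracked request.
  plTags: ["colorful-socks-demo"],
});
```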

The request and the response will be logged in the [PromptLayer dashboard](https://promptlayer.com/home).

Note: In streaming mode, PromptLayer will not log the response.
9 changes: 9 additions & 0 deletions examples/src/llm_promptlayer.ts
@@ -0,0 +1,9 @@
import { PromptLayerOpenAI } from "langchain/llms";

export const run = async () => {
  const model = new PromptLayerOpenAI({ temperature: 0.9 });
  const res = await model.call(
    "What would be a good company name for a company that makes colorful socks?"
  );
  console.log({ res });
};
2 changes: 1 addition & 1 deletion langchain/llms/index.ts
@@ -1,5 +1,5 @@
export { BaseLLM, LLM, SerializedLLM } from "./base";
export { OpenAI } from "./openai";
export { OpenAI, PromptLayerOpenAI } from "./openai";
export { Cohere } from "./cohere";
export { loadLLM } from "./load";

59 changes: 59 additions & 0 deletions langchain/llms/openai.ts
@@ -7,6 +7,7 @@ import type {
} from "openai";
import type { IncomingMessage } from "http";

import fetch from "node-fetch";
import { createParser } from "eventsource-parser";
import { backOff } from "exponential-backoff";
import { chunkArray } from "../util";
@@ -338,3 +339,61 @@ export class OpenAI extends BaseLLM implements OpenAIInput {
return "openai";
}
}

/**
 * PromptLayer wrapper for OpenAI
 * @augments OpenAI
 */
export class PromptLayerOpenAI extends OpenAI {
  promptLayerApiKey?: string;

  plTags?: string[];

  constructor(
    fields?: ConstructorParameters<typeof OpenAI>[0] & {
      promptLayerApiKey?: string;
      plTags?: string[];
    }
  ) {
    super(fields);

    this.plTags = fields?.plTags ?? [];
    this.promptLayerApiKey =
      fields?.promptLayerApiKey ?? process.env.PROMPTLAYER_API_KEY;

    if (!this.promptLayerApiKey) {
      throw new Error("Missing PromptLayer API key");
    }
  }

  async completionWithRetry(request: CreateCompletionRequest) {
    // Streaming responses are passed through untouched; PromptLayer only
    // tracks non-streaming requests (see the docs note above).
    if (request.stream) {
      return super.completionWithRetry(request);
    }

    const requestStartTime = Date.now();
    const response = await super.completionWithRetry(request);
    const requestEndTime = Date.now();

    // https://github.com/MagnivOrg/promptlayer-js-helper
    await fetch("https://api.promptlayer.com/track-request", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Accept: "application/json",
      },
      body: JSON.stringify({
        function_name: "openai.Completion.create",
        args: [],
        kwargs: { engine: request.model, prompt: request.prompt },
        tags: this.plTags ?? [],
        request_response: response.data,
        request_start_time: Math.floor(requestStartTime / 1000),
        request_end_time: Math.floor(requestEndTime / 1000),
        // Use the resolved key so a key passed to the constructor is honored
        // as well as one read from the environment.
        api_key: this.promptLayerApiKey,
      }),
    });

    return response;
  }
}
