Merge branch 'main' into nc/test-exports-cf
nfcampos committed Apr 9, 2023
2 parents c311c6c + d37366b commit 320f9fb
Showing 9 changed files with 131 additions and 119 deletions.
63 changes: 4 additions & 59 deletions docs/docs/getting-started/guide-chat.mdx
@@ -2,6 +2,9 @@
sidebar_position: 3
---

import CodeBlock from "@theme/CodeBlock";
import Example from "@examples/models/chat/chat_streaming_stdout.ts";

# Quickstart, using Chat Models

Chat models are a variation on language models.
@@ -316,62 +319,4 @@ const chain = new ConversationChain({

You can also use the streaming API to get tokens streamed back to you as they are generated. This is useful, for example, in chatbots, where you want to show the user the response as it is being generated. Note: as of this writing, OpenAI does not support `tokenUsage` reporting while streaming is enabled.

```typescript
let s = "";
const chatStreaming = new ChatOpenAI({
streaming: true,
callbackManager: CallbackManager.fromHandlers({
async handleLLMNewToken(token: string) {
console.clear();
s += token;
console.log(s);
},
}),
});

const responseD = await chatStreaming.call([
new HumanChatMessage("Write me a song about sparkling water."),
]);
```

```
Verse 1:
Bubbles in the bottle,
Light and refreshing,
It's the drink that I love,
My thirst quenching blessing.
Chorus:
Sparkling water, my fountain of youth,
I can't get enough, it's the perfect truth,
It's fizzy and fun, and oh so clear,
Sparkling water, it's crystal clear.
Verse 2:
No calories or sugars,
Just a burst of delight,
It's the perfect cooler,
On a hot summer night.
Chorus:
Sparkling water, my fountain of youth,
I can't get enough, it's the perfect truth,
It's fizzy and fun, and oh so clear,
Sparkling water, it's crystal clear.
Bridge:
It's my happy place,
In every situation,
My daily dose of hydration,
Always bringing satisfaction.
Chorus:
Sparkling water, my fountain of youth,
I can't get enough, it's the perfect truth,
It's fizzy and fun, and oh so clear,
Sparkling water, it's crystal clear.
Outro:
Sparkling water, it's crystal clear,
My love for you will never disappear.
```
<CodeBlock language="typescript">{Example}</CodeBlock>
54 changes: 4 additions & 50 deletions docs/docs/getting-started/guide-llm.mdx
@@ -2,6 +2,9 @@
sidebar_position: 2
---

import CodeBlock from "@theme/CodeBlock";
import Example from "@examples/models/llm/llm_streaming_stdout.ts";

# Quickstart, using LLMs

This tutorial gives you a quick walkthrough of building an end-to-end language model application with LangChain.
@@ -228,53 +231,4 @@ console.log(res2);
You can also use the streaming API to get tokens streamed back to you as they are generated. This is useful, for example, in chatbots, where you want to show the user the response as it is being generated. Note: as of this writing, OpenAI does not support `tokenUsage` reporting while streaming is enabled.
```typescript
const chat = new OpenAI({
streaming: true,
callbackManager: CallbackManager.fromHandlers({
async handleLLMNewToken(token: string) {
console.log(token);
},
}),
});
const response = await chat.call("Write me a song about sparkling water.");
console.log(response);
```
```
Verse 1
On a hot summer day, I'm looking for a treat
I'm thirsty for something cool and sweet
When I open up the fridge, what do I see?
A bottle of sparkling water, it's calling out to me
Chorus
Sparkling water, it's so refreshing
It quenches my thirst, it's the perfect thing
It's so light and bubbly, it's like a dream
And I'm loving every sip of sparkling water
Verse 2
I take it out of the fridge and pour some in a glass
It's so light and bubbly, I can feel the fizz
I take a sip and suddenly I'm revived
This sparkling water's just what I need to survive
Chorus
Sparkling water, it's so refreshing
It quenches my thirst, it's the perfect thing
It's so light and bubbly, it's like a dream
And I'm loving every sip of sparkling water
Bridge
It's like drinking sunshine between my hands
It's so light and bubbly, I'm in a trance
The summer heat's no match for sparkling water
It's my favorite
```
<CodeBlock language="typescript">{Example}</CodeBlock>
12 changes: 8 additions & 4 deletions docs/docs/modules/models/chat/integrations.mdx
@@ -12,15 +12,19 @@ LangChain offers a number of Chat Models implementations that integrate with various model providers.
```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";

// Expects an OpenAI API key to be set in the env variable OPENAI_API_KEY
const model = new ChatOpenAI({ temperature: 0.9 });
const model = new ChatOpenAI({
temperature: 0.9,
openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
});
```

## `Anthropic`

```typescript
import { ChatAnthropic } from "langchain/chat_models/anthropic";

// Expects an Anthropic API key to be set in the env variable ANTHROPIC_API_KEY
const model = new ChatAnthropic({ temperature: 0.9 });
const model = new ChatAnthropic({
temperature: 0.9,
apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.ANTHROPIC_API_KEY
});
```
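For context (this usage sketch is not part of the diff), a configured chat model is invoked with a list of messages. A minimal sketch, assuming the LangChain.js chat-model API as of this commit:

```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage } from "langchain/schema";

// Illustrative sketch, not part of this commit; assumes OPENAI_API_KEY is set in the environment.
const model = new ChatOpenAI({ temperature: 0.9 });

// `call` takes an array of chat messages and resolves to the model's reply message.
const response = await model.call([
  new HumanChatMessage(
    "What would be a good company name for a company that makes colorful socks?"
  ),
]);
console.log(response.text);
```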
8 changes: 6 additions & 2 deletions docs/docs/modules/models/embeddings/integrations.mdx
@@ -14,7 +14,9 @@ The `OpenAIEmbeddings` class uses the OpenAI API to generate embeddings for a given text.
```typescript
import { OpenAIEmbeddings } from "langchain/embeddings/openai";

const embeddings = new OpenAIEmbeddings();
const embeddings = new OpenAIEmbeddings({
openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
});
```
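For context (not part of this diff), an embeddings instance turns text into vectors. A minimal sketch, assuming the `embedQuery` / `embedDocuments` methods of LangChain.js at the time of this commit:

```typescript
import { OpenAIEmbeddings } from "langchain/embeddings/openai";

// Illustrative sketch, not part of this commit; assumes OPENAI_API_KEY is set in the environment.
const embeddings = new OpenAIEmbeddings();

// `embedQuery` embeds a single string; `embedDocuments` embeds a batch of strings.
const queryVector = await embeddings.embedQuery("What is LangChain?");
const docVectors = await embeddings.embedDocuments([
  "First document",
  "Second document",
]);
console.log(queryVector.length, docVectors.length);
```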

## `CohereEmbeddings`
@@ -26,5 +28,7 @@ npm install cohere-ai
```typescript
import { CohereEmbeddings } from "langchain/embeddings/cohere";

const embeddings = new CohereEmbeddings();
const embeddings = new CohereEmbeddings({
apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.COHERE_API_KEY
});
```
22 changes: 18 additions & 4 deletions docs/docs/modules/models/llms/integrations.mdx
@@ -12,7 +12,10 @@ LangChain offers a number of LLM implementations that integrate with various model providers.
```typescript
import { OpenAI } from "langchain/llms/openai";

const model = new OpenAI({ temperature: 0.9 });
const model = new OpenAI({
temperature: 0.9,
openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
});
const res = await model.call(
"What would be a good company name a company that makes colorful socks?"
);
@@ -28,7 +31,10 @@ npm install @huggingface/inference
```typescript
import { HuggingFaceInference } from "langchain/llms/hf";

const model = new HuggingFaceInference({ model: "gpt2" });
const model = new HuggingFaceInference({
model: "gpt2",
apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.HUGGINGFACEHUB_API_KEY
});
const res = await model.call("1 + 1 =");
console.log({ res });
```
@@ -42,7 +48,10 @@ npm install cohere-ai
```typescript
import { Cohere } from "langchain/llms/cohere";

const model = new Cohere({ maxTokens: 20 });
const model = new Cohere({
maxTokens: 20,
apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.COHERE_API_KEY
});
const res = await model.call(
"What would be a good company name a company that makes colorful socks?"
);
@@ -61,6 +70,7 @@ import { Replicate } from "langchain/llms/cohere";
const model = new Replicate({
model:
"daanelson/flan-t5:04e422a9b85baed86a4f24981d7f9953e20c5fd82f6103b74ebc431588e1cec8",
apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.REPLICATE_API_KEY
});
const res = await model.call(
"What would be a good company name for a company that makes colorful socks?"
@@ -80,7 +90,11 @@ LangChain integrates with PromptLayer for logging and debugging prompts and responses.
```typescript
import { PromptLayerOpenAI } from "langchain/llms/openai";

const model = new PromptLayerOpenAI({ temperature: 0.9 });
const model = new PromptLayerOpenAI({
temperature: 0.9,
openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
promptLayerApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.PROMPTLAYER_API_KEY
});
const res = await model.call(
"What would be a good company name a company that makes colorful socks?"
);
11 changes: 11 additions & 0 deletions examples/src/README.md
@@ -6,6 +6,17 @@ This folder contains examples of how to use LangChain.

What you'll usually want to do.

First, build langchain. From the repository root, run:

```sh
yarn
yarn build
```

Most examples require API keys. Run `cp .env.example .env`, then edit `.env` with your API keys.

Then from the `examples/` directory, run:

`yarn run start <path to example>`

e.g.
5 changes: 5 additions & 0 deletions examples/src/index.ts
@@ -3,6 +3,11 @@ import url from "url";

const [exampleName, ...args] = process.argv.slice(2);

if (!exampleName) {
console.error("Please provide path to example to run");
process.exit(1);
}

// Allow people to pass all possible variations of a path to an example
// ./src/foo.ts, ./dist/foo.js, src/foo.ts, dist/foo.js, foo.ts
let exampleRelativePath: string;
38 changes: 38 additions & 0 deletions examples/src/models/chat/chat_streaming_stdout.ts
@@ -0,0 +1,38 @@
import { CallbackManager } from "langchain/callbacks";
import { ChatOpenAI } from "langchain/chat_models";
import { HumanChatMessage } from "langchain/schema";

export const run = async () => {
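// To enable streaming, we pass `streaming: true` to the chat model constructor, along with
// a `CallbackManager` whose `handleLLMNewToken` handler writes each token to stdout as it arrives.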
const chat = new ChatOpenAI({
streaming: true,
callbackManager: CallbackManager.fromHandlers({
async handleLLMNewToken(token: string) {
process.stdout.write(token);
},
}),
});

await chat.call([
new HumanChatMessage("Write me a song about sparkling water."),
]);
/*
Verse 1:
Bubbles rise, crisp and clear
Refreshing taste that brings us cheer
Sparkling water, so light and pure
Quenches our thirst, it's always secure
Chorus:
Sparkling water, oh how we love
Its fizzy bubbles and grace above
It's the perfect drink, anytime, anyplace
Refreshing as it gives us a taste
Verse 2:
From morning brunch to evening feast
It's the perfect drink for a treat
A sip of it brings a smile so bright
Our thirst is quenched in just one sip so light
...
*/
};
37 changes: 37 additions & 0 deletions examples/src/models/llm/llm_streaming_stdout.ts
@@ -0,0 +1,37 @@
import { CallbackManager } from "langchain/callbacks";
import { OpenAI } from "langchain/llms";

export const run = async () => {
// To enable streaming, we pass in `streaming: true` to the LLM constructor.
// Additionally, we pass in a `CallbackManager` with a handler set up for the `handleLLMNewToken` event.
const chat = new OpenAI({
streaming: true,
callbackManager: CallbackManager.fromHandlers({
async handleLLMNewToken(token: string) {
process.stdout.write(token);
},
}),
});

await chat.call("Write me a song about sparkling water.");
/*
Verse 1
Crystal clear and made with care
Sparkling water on my lips, so refreshing in the air
Fizzy bubbles, light and sweet
My favorite beverage I can’t help but repeat
Chorus
A toast to sparkling water, I’m feeling so alive
Let’s take a sip, and let’s take a drive
A toast to sparkling water, it’s the best I’ve had in my life
It’s the best way to start off the night
Verse 2
It’s the perfect drink to quench my thirst
It’s the best way to stay hydrated, it’s the first
A few ice cubes, a splash of lime
It will make any day feel sublime
...
*/
};
