langsmith rename (langchain-ai#1957)
* sdk package change

* get rid of old langchainplus docker compose setup

* file renames

* rename tests with new name

* example tracing endpoint to smith

* point tracing docs to langsmith docs
efriis authored Jul 13, 2023
1 parent f985d3e commit 2d0e1ad
Showing 13 changed files with 28 additions and 151 deletions.
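
At its core, the rename is a single dependency swap; a minimal sketch of the `langchain/package.json` change, using the versions recorded in this commit:

```diff
-    "langchainplus-sdk": "^0.0.19",
+    "langsmith": "^0.0.5",
```

Everything else in the commit follows from this: imports, file names, and lockfile entries are updated to match the new package name.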
94 changes: 1 addition & 93 deletions docs/docs/production/tracing.md
@@ -2,96 +2,4 @@

Similar to the Python `langchain` package, JS `langchain` also supports tracing.

You can view an overview of tracing [here](https://python.langchain.com/en/latest/additional_resources/tracing.html).
To spin up the tracing backend, run `docker compose up` (or `docker-compose up` if using an older version of `docker`) in the `langchain` directory.
You can also use the `langchain-server` command if you have the Python `langchain` package installed.

Here's an example of how to use tracing in `langchain.js`. All you need to do is set the `LANGCHAIN_TRACING` environment variable to `true`.

```typescript
import { OpenAI } from "langchain/llms/openai";
import { initializeAgentExecutorWithOptions } from "langchain/agents";
import { SerpAPI } from "langchain/tools";
import { Calculator } from "langchain/tools/calculator";
import process from "process";

export const run = async () => {
process.env.LANGCHAIN_TRACING = "true";
const model = new OpenAI({ temperature: 0 });
const tools = [
new SerpAPI(process.env.SERPAPI_API_KEY, {
location: "Austin,Texas,United States",
hl: "en",
gl: "us",
}),
new Calculator(),
];

const executor = await initializeAgentExecutorWithOptions(tools, model, {
agentType: "zero-shot-react-description",
verbose: true,
});
console.log("Loaded agent.");

const input = `Who is Olivia Wilde's boyfriend? What is his current age raised to the 0.23 power?`;

console.log(`Executing with input "${input}"...`);

const result = await executor.call({ input });

console.log(`Got output ${result.output}`);
};
```

## Concurrency

Tracing works with concurrency out of the box.

```typescript
import { OpenAI } from "langchain/llms/openai";
import { initializeAgentExecutorWithOptions } from "langchain/agents";
import { SerpAPI } from "langchain/tools";
import { Calculator } from "langchain/tools/calculator";
import process from "process";

export const run = async () => {
process.env.LANGCHAIN_TRACING = "true";
const model = new OpenAI({ temperature: 0 });
const tools = [
new SerpAPI(process.env.SERPAPI_API_KEY, {
location: "Austin,Texas,United States",
hl: "en",
gl: "us",
}),
new Calculator(),
];

const executor = await initializeAgentExecutorWithOptions(tools, model, {
agentType: "zero-shot-react-description",
verbose: true,
});

console.log("Loaded agent.");

const input = `Who is Olivia Wilde's boyfriend? What is his current age raised to the 0.23 power?`;

console.log(`Executing with input "${input}"...`);

// Each concurrent call below is traced as its own run, with its own run ID.
const [resultA, resultB, resultC] = await Promise.all([
executor.call({ input }),
executor.call({ input }),
executor.call({ input }),
]);

console.log(`Got output ${resultA.output} ${resultA.__run.runId}`);
console.log(`Got output ${resultB.output} ${resultB.__run.runId}`);
console.log(`Got output ${resultC.output} ${resultC.__run.runId}`);

/*
Got output Harry Styles, Olivia Wilde's boyfriend, is 29 years old and his age raised to the 0.23 power is 2.169459462491557. b8fb98aa-07a5-45bd-b593-e8d7376b05ca
Got output Harry Styles, Olivia Wilde's boyfriend, is 29 years old and his age raised to the 0.23 power is 2.169459462491557. c8d916d5-ca1d-4702-8dd7-cab5e438578b
Got output Harry Styles, Olivia Wilde's boyfriend, is 29 years old and his age raised to the 0.23 power is 2.169459462491557. bf5fe04f-ef29-4e55-8ce1-e4aa974f9484
*/
};
```
You can view the tracing docs, including the JS Quickstart, [here](https://docs.smith.langchain.com/docs/).
4 changes: 2 additions & 2 deletions examples/src/client/tracing_datasets.ts
@@ -1,6 +1,6 @@
/* eslint-disable no-process-env */
// eslint-disable-next-line import/no-extraneous-dependencies
-import { Client, Dataset } from "langchainplus-sdk";
+import { Client, Dataset } from "langsmith";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { runOnDataset } from "langchain/client";
import { SerpAPI } from "langchain/tools";
@@ -57,7 +57,7 @@ for (const input of inputs) {

// So that you don't have to create the dataset manually, we will create it for you
const client = new Client({
-// apiUrl: "https://api.langchain.plus", // Default: LANGCHAIN_ENDPOINT environment variable of "http://localhost:1984"
+// apiUrl: "https://api.smith.langchain.com", // Default: LANGCHAIN_ENDPOINT environment variable of "http://localhost:1984"
// apiKey: "<Your API Key>", // Default: LANGCHAIN_API_KEY environment variable
});
const csvContent = `
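The commented-out defaults in the `Client` constructor above describe a fallback chain: an explicit option wins, then the `LANGCHAIN_ENDPOINT` / `LANGCHAIN_API_KEY` environment variables, then a local default. A minimal sketch of that resolution, using a hypothetical `resolveClientConfig` helper that is not part of the langsmith API:

```typescript
import process from "process";

// Hypothetical helper mirroring the defaulting behavior described in the
// Client constructor comments; not part of the langsmith package itself.
function resolveClientConfig(opts: { apiUrl?: string; apiKey?: string } = {}) {
  return {
    // Explicit option, then LANGCHAIN_ENDPOINT, then the local default.
    apiUrl:
      opts.apiUrl ?? process.env.LANGCHAIN_ENDPOINT ?? "http://localhost:1984",
    // Explicit option, then LANGCHAIN_API_KEY (undefined if unset).
    apiKey: opts.apiKey ?? process.env.LANGCHAIN_API_KEY,
  };
}

// An explicit option always wins over any environment variable:
console.log(
  resolveClientConfig({ apiUrl: "https://api.smith.langchain.com" }).apiUrl
); // → https://api.smith.langchain.com
```

With no options and no environment variables set, the local default `http://localhost:1984` applies, which is why the example can omit the configuration entirely.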
31 changes: 0 additions & 31 deletions langchain/docker-compose.yaml

This file was deleted.

2 changes: 1 addition & 1 deletion langchain/package.json
@@ -809,7 +809,7 @@
"js-tiktoken": "^1.0.7",
"js-yaml": "^4.1.0",
"jsonpointer": "^5.0.1",
-"langchainplus-sdk": "^0.0.19",
+"langsmith": "^0.0.5",
"ml-distance": "^4.0.0",
"object-hash": "^3.0.0",
"openai": "^3.3.0",
2 changes: 1 addition & 1 deletion langchain/src/callbacks/handlers/tracer.ts
@@ -1,4 +1,4 @@
-import { KVMap, BaseRun } from "langchainplus-sdk/schemas";
+import { KVMap, BaseRun } from "langsmith/schemas";

import {
AgentAction,
4 changes: 2 additions & 2 deletions langchain/src/callbacks/handlers/tracer_langchain.ts
@@ -1,9 +1,9 @@
-import { Client } from "langchainplus-sdk";
+import { Client } from "langsmith";
import {
BaseRun,
RunCreate,
RunUpdate as BaseRunUpdate,
-} from "langchainplus-sdk/schemas";
+} from "langsmith/schemas";
import {
getEnvironmentVariable,
getRuntimeEnvironment,
2 changes: 1 addition & 1 deletion langchain/src/client/index.ts
@@ -1 +1 @@
-export { DatasetRunResults, runOnDataset } from "./langchainplus.js";
+export { DatasetRunResults, runOnDataset } from "./langsmith.js";
@@ -1,5 +1,5 @@
import PQueueMod from "p-queue";
-import { Example, Client } from "langchainplus-sdk";
+import { Example, Client } from "langsmith";

import { LangChainTracer } from "../callbacks/handlers/tracer_langchain.js";
import { ChainValues, LLMResult, StoredMessage } from "../schema/index.js";
@@ -152,7 +152,7 @@ export const runOnDataset = async (
} else {
projectName_ = projectName;
}
-await client_.createProject({ projectName: projectName_, mode: "eval" });
+await client_.createProject({ projectName: projectName_ });
const results: DatasetRunResults = examples.reduce(
(acc, example) => ({ ...acc, [example.id]: [] }),
{}
@@ -1,14 +1,14 @@
/* eslint-disable no-process-env */
import { test } from "@jest/globals";
-import { Client } from "langchainplus-sdk";
+import { Client } from "langsmith";
import { ChatOpenAI } from "../../chat_models/openai.js";
import { SerpAPI } from "../../tools/serpapi.js";
import { Calculator } from "../../tools/calculator.js";
import { initializeAgentExecutorWithOptions } from "../../agents/initialize.js";
import { OpenAI } from "../../llms/openai.js";
-import { runOnDataset } from "../langchainplus.js";
+import { runOnDataset } from "../langsmith.js";

-test("Test LangChainPlus Client Run Chat Model Over Simple Dataset", async () => {
+test("Test LangSmith Client Run Chat Model Over Simple Dataset", async () => {
const client = new Client({});
const datasetName = "chat-test";
const description = "Asking a chat model test things";
@@ -41,7 +41,7 @@ test("Test LangChainPlus Client Run Chat Model Over Simple Dataset", async () =>
expect(Object.keys(results).length).toEqual(1);
});

-test("Test LangChainPlus Client Run LLM Over Simple Dataset", async () => {
+test("Test LangSmith Client Run LLM Over Simple Dataset", async () => {
const client = new Client({});
const datasetName = "llm-test";
const description = "Asking a chat model test things";
@@ -65,7 +65,7 @@ test("Test LangChainPlus Client Run LLM Over Simple Dataset", async () => {
}
const model = new OpenAI({ temperature: 0 });
const randomId = Math.random().toString(36).substring(7);
-const projectName = `LangChainPlus Client Test ${randomId}`;
+const projectName = `LangSmith Client Test ${randomId}`;
const results = await runOnDataset(datasetName, model, {
projectName,
client,
@@ -83,7 +83,7 @@ test("Test LangChainPlus Client Run LLM Over Simple Dataset", async () => {
expect(run.id).toBe(firstRun.id);
});

-test("Test LangChainPlus Client Run Chain Over Simple Dataset", async () => {
+test("Test LangSmith Client Run Chain Over Simple Dataset", async () => {
const client = new Client({});
const csvContent = `
input,output
@@ -122,7 +122,7 @@ what is 1213 divided by 4345?,approximately 0.2791714614499425
expect(Object.keys(results).length).toEqual(2);
});

-test("Test LangChainPlus Client Run Chain Over Dataset", async () => {
+test("Test LangSmith Client Run Chain Over Dataset", async () => {
const client = new Client({});
const csvContent = `
input,output
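The tests above build a unique project name per run with `Math.random().toString(36).substring(7)`. A self-contained sketch of that naming scheme (the `makeProjectName` helper is illustrative, not part of the test file):

```typescript
// Sketch of the unique project naming used in the tests: a short base-36
// suffix keeps project names from colliding across repeated test runs.
function makeProjectName(prefix: string): string {
  // Math.random().toString(36) yields "0." followed by base-36 digits;
  // substring(7) keeps a short tail of those digits as the suffix.
  const randomId = Math.random().toString(36).substring(7);
  return `${prefix} ${randomId}`;
}

console.log(makeProjectName("LangSmith Client Test"));
```

Base-36 keeps the suffix short and URL-safe; collisions are unlikely across a handful of test runs, which is all the integration tests need.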
4 changes: 2 additions & 2 deletions langchain/src/evaluation/run_evaluators/base.ts
@@ -1,5 +1,5 @@
-import { Example, Run } from "langchainplus-sdk";
-import { EvaluationResult, RunEvaluator } from "langchainplus-sdk/evaluation";
+import { Example, Run } from "langsmith";
+import { EvaluationResult, RunEvaluator } from "langsmith/evaluation";
import { BaseOutputParser } from "../../schema/output_parser.js";
import { LLMChain } from "../../chains/llm_chain.js";
import { BaseChain } from "../../chains/base.js";
4 changes: 2 additions & 2 deletions langchain/src/evaluation/run_evaluators/implementations.ts
@@ -1,5 +1,5 @@
-import { Example, Run } from "langchainplus-sdk";
-import { EvaluationResult } from "langchainplus-sdk/evaluation";
+import { Example, Run } from "langsmith";
+import { EvaluationResult } from "langsmith/evaluation";
import { CRITERIA_PROMPT } from "./criteria_prompt.js";
import {
RunEvaluatorInputMapper,
@@ -1,5 +1,5 @@
import { test } from "@jest/globals";
-import { Example, Run } from "langchainplus-sdk";
+import { Example, Run } from "langsmith";
import {
ChoicesOutputParser,
StringRunEvaluatorInputMapper,
12 changes: 6 additions & 6 deletions yarn.lock
@@ -16948,7 +16948,7 @@ __metadata:
js-tiktoken: ^1.0.7
js-yaml: ^4.1.0
jsonpointer: ^5.0.1
-langchainplus-sdk: ^0.0.19
+langsmith: ^0.0.5
mammoth: ^1.5.1
ml-distance: ^4.0.0
mongodb: ^5.2.0
@@ -17169,18 +17169,18 @@ __metadata:
languageName: unknown
linkType: soft

-"langchainplus-sdk@npm:^0.0.19":
-version: 0.0.19
-resolution: "langchainplus-sdk@npm:0.0.19"
+"langsmith@npm:^0.0.5":
+version: 0.0.5
+resolution: "langsmith@npm:0.0.5"
dependencies:
"@types/uuid": ^9.0.1
commander: ^10.0.1
p-queue: ^6.6.2
p-retry: 4
uuid: ^9.0.0
bin:
-langchain: dist/cli/main.cjs
-checksum: f0174c1e248e4bc5034a7dd182f703b895a485b6408aa518c91ff12b3f015febd5546eeb7f821c82b63a9d2b67a7ea903a1a57ad196049743f0934ff1c524ae8
+langsmith: dist/cli/main.cjs
+checksum: f23960e81198996c154c38236eda9d6db841b60428f075fb5a7cbd54f03312f635d0edb26f2309b449225c2603e9915dd4aa10f77a6d5526bdf2f432e1be17cb
languageName: node
linkType: hard

