Merge remote-tracking branch 'upstream/main'
magick93 committed Apr 12, 2023
2 parents 287cdd1 + 2886ad9 commit 595d21f
Showing 411 changed files with 14,818 additions and 1,704 deletions.
21 changes: 19 additions & 2 deletions .github/workflows/ci.yml
@@ -1,7 +1,7 @@
# This workflow will do a clean installation of node dependencies, cache/restore them, build the source code and run tests across different versions of node
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-nodejs

-name: Node.js CI
+name: CI

on:
  push:
@@ -55,7 +55,7 @@ jobs:
run: yarn run build

  test:
-    name: Test
+    name: Unit Tests
    strategy:
      matrix:
        os: [macos-latest, windows-latest, ubuntu-latest]
@@ -75,3 +75,20 @@ jobs:
        run: yarn run build --filter="!docs"
      - name: Test
        run: yarn run test:unit

  test-exports:
    name: Environment Tests
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Use Node.js 18.x
        uses: actions/setup-node@v3
        with:
          node-version: 18.x
          cache: "yarn"
      - name: Install dependencies
        run: yarn install --immutable
      - name: Build
        run: yarn run build --filter="!docs"
      - name: Test Exports
        run: yarn run test:exports:docker
1 change: 1 addition & 0 deletions .nvmrc
@@ -0,0 +1 @@
18
4 changes: 3 additions & 1 deletion .vscode/settings.json
@@ -3,8 +3,10 @@
    "./langchain",
    "./examples",
    "./docs",
    "./test-exports-vercel",
    "./test-exports-cra",
  ],
  "yaml.schemas": {
    "https://json.schemastore.org/github-workflow.json": "./.github/workflows/deploy.yml"
  }
}
20 changes: 15 additions & 5 deletions CONTRIBUTING.md
@@ -162,6 +162,14 @@ To run only integration tests, run:
yarn test:int
```

**Environment tests** test whether LangChain works across different JS environments, including Node.js (both ESM and CJS), Edge environments (e.g. Cloudflare Workers), and browsers (using Webpack).

To run the environment tests with Docker, run:

```bash
yarn test:exports:docker
```

### Building

To build the project, run:
@@ -183,21 +191,23 @@ level of the repo.

### Adding an Entrypoint

-Langchain exposes multiple multiple subpaths the user can import from, e.g.
+LangChain exposes multiple subpaths the user can import from, e.g.

```ts
-import { OpenAI } from "langchain/llms";
+import { OpenAI } from "langchain/llms/openai";
```

We call these subpaths "entrypoints". In general, you should create a new entrypoint if you are adding a new integration with a 3rd party library. If you're adding self-contained functionality without any external dependencies, you can add it to an existing entrypoint.

In order to declare a new entrypoint that users can import from, you
-should edit the `langchain/create-entrypoints.js` script. To add an
-entrypoint `tools` that imports from `agents/tools/index.ts` you could add
+should edit the `langchain/scripts/create-entrypoints.js` script. To add an
+entrypoint `tools` that imports from `tools/index.ts` you'd add
the following to the `entrypoints` variable:

```ts
const entrypoints = {
  // ...
-  tools: "agents/tools/index.ts",
+  tools: "tools/index",
};
```
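As context (not part of this diff), here is a minimal sketch of what such an entrypoints map ultimately feeds: each subpath gets an `exports` entry in `package.json` pointing at the built ESM/CJS files. The `toExportsMap` helper below is hypothetical, for illustration only; the real `create-entrypoints.js` script differs.

```typescript
// Hypothetical sketch: expand an entrypoints map into a package.json
// "exports" field. Not the real script's API.
const entrypoints: Record<string, string> = {
  tools: "tools/index",
};

function toExportsMap(entries: Record<string, string>) {
  const exportsMap: Record<string, { import: string; require: string }> = {};
  for (const [subpath, src] of Object.entries(entries)) {
    // "tools" -> "./tools": { import: "./dist/tools/index.js", ... }
    exportsMap[`./${subpath}`] = {
      import: `./dist/${src}.js`,
      require: `./dist/${src}.cjs`,
    };
  }
  return exportsMap;
}

console.log(toExportsMap(entrypoints)["./tools"]);
// → { import: './dist/tools/index.js', require: './dist/tools/index.cjs' }
```

With an entry like this, `import { ... } from "langchain/tools"` resolves to the built ESM file, while `require("langchain/tools")` resolves to the CJS build.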

2 changes: 1 addition & 1 deletion README.md
@@ -12,7 +12,7 @@ Please fill out [this form](https://forms.gle/57d8AmXBYp8PP8tZA) and we'll set u
`yarn add langchain`

```typescript
-import { OpenAI } from "langchain/llms";
+import { OpenAI } from "langchain/llms/openai";
```

## 🤔 What is this?
66 changes: 66 additions & 0 deletions docker-compose.yml
@@ -0,0 +1,66 @@
version: '3'
services:
  test-exports-esm:
    image: node:18
    working_dir: /app
    volumes:
      - ./test-exports-esm:/package
      - ./langchain:/langchain
      - ./scripts:/scripts
    command: bash /scripts/docker-ci-entrypoint.sh
  test-exports-cjs:
    image: node:18
    working_dir: /app
    volumes:
      - ./test-exports-cjs:/package
      - ./langchain:/langchain
      - ./scripts:/scripts
    command: bash /scripts/docker-ci-entrypoint.sh
  test-exports-cra:
    image: node:18
    working_dir: /app
    volumes:
      - ./test-exports-cra:/package
      - ./langchain:/langchain
      - ./scripts:/scripts
    command: bash /scripts/docker-ci-entrypoint.sh
  test-exports-cf:
    image: node:18
    working_dir: /app
    volumes:
      - ./test-exports-cf:/package
      - ./langchain:/langchain
      - ./scripts:/scripts
    command: bash /scripts/docker-ci-entrypoint.sh
  test-exports-vercel:
    image: node:18
    working_dir: /app
    volumes:
      - ./test-exports-vercel:/package
      - ./langchain:/langchain
      - ./scripts:/scripts
    command: bash /scripts/docker-ci-entrypoint.sh
  test-exports-vite:
    image: node:18
    working_dir: /app
    volumes:
      - ./test-exports-vite:/package
      - ./langchain:/langchain
      - ./scripts:/scripts
    command: bash /scripts/docker-ci-entrypoint.sh
  success:
    image: alpine:3.14
    command: echo "Success"
    depends_on:
      test-exports-esm:
        condition: service_completed_successfully
      test-exports-cjs:
        condition: service_completed_successfully
      test-exports-cra:
        condition: service_completed_successfully
      test-exports-cf:
        condition: service_completed_successfully
      test-exports-vercel:
        condition: service_completed_successfully
      test-exports-vite:
        condition: service_completed_successfully
40 changes: 40 additions & 0 deletions docs/docs/ecosystem/databerry.md
@@ -0,0 +1,40 @@
# Databerry

This page covers how to use [Databerry](https://databerry.ai) within LangChain.

## What is Databerry?

Databerry is an [open source](https://github.com/gmpetrov/databerry) document retrieval platform that helps connect your personal data with Large Language Models.

![Databerry](/img/DataberryDashboard.png)

## Quick start

Retrieving documents stored in Databerry from LangChain is very easy!

```typescript
import { DataberryRetriever } from "langchain/retrievers/databerry";
import { OpenAI } from "langchain/llms/openai";
import { RetrievalQAChain } from "langchain/chains";

const model = new OpenAI({});

const retriever = new DataberryRetriever({
  datastoreUrl: "https://api.databerry.ai/query/clg1xg2h80000l708dymr0fxc",
  apiKey: "DATABERRY_API_KEY", // optional: needed for private datastores
  topK: 8, // optional: default value is 3
});

// Create a chain that uses the OpenAI LLM and Databerry retriever.
const chain = RetrievalQAChain.fromLLM(model, retriever);

// Call the chain with a query.
const res = await chain.call({
  query: "What's Databerry?",
});

console.log({ res });
/*
{
  res: {
    text: 'Databerry provides a user-friendly solution to quickly setup a semantic search system over your personal data without any technical knowledge.'
  }
}
*/
```
2 changes: 1 addition & 1 deletion docs/docs/ecosystem/unstructured.md
@@ -24,7 +24,7 @@ If you are running the container locally, switch the url to
`https://api.unstructured.io/general/v0/general`.

```typescript
-import { UnstructuredLoader } from "langchain/document_loader";
+import { UnstructuredLoader } from "langchain/document_loaders/fs/unstructured";

const loader = new UnstructuredLoader(
"https://api.unstructured.io/general/v0/general",
2 changes: 1 addition & 1 deletion docs/docs/getting-started/guide-chat.mdx
@@ -22,7 +22,7 @@ To get started, follow the [installation instructions](./install) to install Lan
This section covers how to get started with chat models. The interface is based around messages rather than raw text.

```typescript
-import { ChatOpenAI } from "langchain/chat_models";
+import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage, SystemChatMessage } from "langchain/schema";

const chat = new ChatOpenAI({ temperature: 0 });
11 changes: 6 additions & 5 deletions docs/docs/getting-started/guide-llm.mdx
@@ -32,7 +32,7 @@ The most basic building block of LangChain is calling an LLM on some input. Let'
In order to do this, we first need to import the LLM wrapper.

```typescript
-import { OpenAI } from "langchain";
+import { OpenAI } from "langchain/llms/openai";
```

We will then need to set the environment variable for the OpenAI key. Three options here:
@@ -110,7 +110,7 @@ The most core type of chain is an LLMChain, which consists of a PromptTemplate a
Extending the previous example, we can construct an LLMChain which takes user input, formats it with a PromptTemplate, and then passes the formatted response to an LLM.
```typescript
-import { OpenAI } from "langchain/llms";
+import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
const model = new OpenAI({ temperature: 0.9 });
@@ -165,9 +165,10 @@ SERPAPI_API_KEY="..."
Now we can get started!
```typescript
-import { OpenAI } from "langchain";
+import { OpenAI } from "langchain/llms/openai";
import { initializeAgentExecutor } from "langchain/agents";
-import { SerpAPI, Calculator } from "langchain/tools";
+import { SerpAPI } from "langchain/tools";
+import { Calculator } from "langchain/tools/calculator";
const model = new OpenAI({ temperature: 0 });
const tools = [new SerpAPI(), new Calculator()];
@@ -203,7 +204,7 @@ LangChain provides several specially created chains just for this purpose. This
By default, the `ConversationChain` has a simple type of memory that remembers all previous inputs/outputs and adds them to the context that is passed. Let's take a look at using this chain.
```typescript
-import { OpenAI } from "langchain/llms";
+import { OpenAI } from "langchain/llms/openai";
import { BufferMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";
