Update repo path to point to langchain-ai GitHub org. (#2933)
obi1kenobi authored Oct 16, 2023
1 parent 5f53963 commit 1f734db
Showing 28 changed files with 69 additions and 69 deletions.
4 changes: 2 additions & 2 deletions .devcontainer/README.md
@@ -7,14 +7,14 @@ You can use the dev container configuration in this folder to build and run the
## GitHub Codespaces
[![Open in GitHub Codespaces](https://github.com/codespaces/badge.svg)](https://codespaces.new/hwchase17/langchainjs)
You may use the button above, or follow these steps to open this repo in a Codespace:
-1. Click the **Code** drop-down menu at the top of https://github.com/hwchase17/langchainjs.
+1. Click the **Code** drop-down menu at the top of https://github.com/langchain-ai/langchainjs.
1. Click on the **Codespaces** tab.
1. Click **Create codespace on main** .

For more info, check out the [GitHub documentation](https://docs.github.com/en/free-pro-team@latest/github/developing-online-with-codespaces/creating-a-codespace#creating-a-codespace).

## VS Code Dev Containers
-[![Open in Dev Containers](https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/hwchase17/langchainjs)
+[![Open in Dev Containers](https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/langchain-ai/langchainjs)

If you already have VS Code and Docker installed, you can use the button above to get started. This will cause VS Code to automatically install the Dev Containers extension if needed, clone the source code into a container volume, and spin up a dev container for use.

20 changes: 10 additions & 10 deletions .github/contributing/INTEGRATIONS.md
@@ -1,6 +1,6 @@
# Contributing Integrations to LangChain

-In addition to the [general contribution guidelines](https://github.com/hwchase17/langchainjs/blob/main/CONTRIBUTING.md), there are a few extra things to consider when contributing third-party integrations to LangChain that will be covered here. The goal of this page is to help you draft PRs that take these considerations into account, and can therefore be merged sooner.
+In addition to the [general contribution guidelines](https://github.com/langchain-ai/langchainjs/blob/main/CONTRIBUTING.md), there are a few extra things to consider when contributing third-party integrations to LangChain that will be covered here. The goal of this page is to help you draft PRs that take these considerations into account, and can therefore be merged sooner.

Integrations tend to fall into a set number of categories, each of which will have their own section below. Please read the [general guidelines](#general-concepts), then see the [integration-specific guidelines and example PRs](#integration-specific-guidelines-and-example-prs) section at the end of this page for additional information and examples.

@@ -10,7 +10,7 @@ The following guidelines apply broadly to all type of integrations:

### Creating a separate entrypoint

-You should generally not export your new module from an `index.ts` file that contains many other exports. Instead, you should add a separate entrypoint for your integration in [`langchain/scripts/create-entrypoints.js`](https://github.com/hwchase17/langchainjs/blob/main/langchain/scripts/create-entrypoints.js) within the `entrypoints` object:
+You should generally not export your new module from an `index.ts` file that contains many other exports. Instead, you should add a separate entrypoint for your integration in [`langchain/scripts/create-entrypoints.js`](https://github.com/langchain-ai/langchainjs/blob/main/langchain/scripts/create-entrypoints.js) within the `entrypoints` object:

```js
import * as fs from "fs";
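
// A hedged sketch of the kind of entry you would add; the "vectorstores/langco"
// key and path below are hypothetical, used purely for illustration:
const entrypoints = {
  // ... existing entrypoints ...
  "vectorstores/langco": "vectorstores/langco",
};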
```

@@ -59,13 +59,13 @@ A user would then import your new vector store as `import { LangCoVectorStore }

### Third-party dependencies

-You may use third-party dependencies in new integrations, but they should be added as `peerDependencies` and `devDependencies` with an entry under `peerDependenciesMeta` in [`langchain/package.json`](https://github.com/hwchase17/langchainjs/blob/main/langchain/package.json), **not under any core `dependencies` list**. This keeps the overall package size small, as only people who are using your integration will need to install, and allows us to support a wider range of runtimes.
+You may use third-party dependencies in new integrations, but they should be added as `peerDependencies` and `devDependencies` with an entry under `peerDependenciesMeta` in [`langchain/package.json`](https://github.com/langchain-ai/langchainjs/blob/main/langchain/package.json), **not under any core `dependencies` list**. This keeps the overall package size small, as only people who are using your integration will need to install, and allows us to support a wider range of runtimes.
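
As a rough illustration (the package name `@langco/sdk` and its version below are hypothetical), the relevant entries in `langchain/package.json` might look something like this, with the dependency repeated under `devDependencies`, `peerDependencies`, and `peerDependenciesMeta`:

```json
{
  "devDependencies": {
    "@langco/sdk": "^1.2.3"
  },
  "peerDependencies": {
    "@langco/sdk": "^1.2.3"
  },
  "peerDependenciesMeta": {
    "@langco/sdk": {
      "optional": true
    }
  }
}
```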

We suggest using caret syntax (`^`) for peer dependencies to support a wider range of people trying to use them as well as to be somewhat tolerant to non-major version updates, which should (theoretically) be the only breaking ones.

Please make sure all introduced dependencies are permissively licensed (MIT is recommended) and well-supported and maintained.

-You must also add your new entrypoint under `requiresOptionalDependency` in the [`create-entrypoints.js`](https://github.com/hwchase17/langchainjs/blob/main/langchain/scripts/create-entrypoints.js) file to avoid breaking the build:
+You must also add your new entrypoint under `requiresOptionalDependency` in the [`create-entrypoints.js`](https://github.com/langchain-ai/langchainjs/blob/main/langchain/scripts/create-entrypoints.js) file to avoid breaking the build:

```js
// Entrypoints in this list require an optional dependency to be installed.
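
// A hedged sketch; the entrypoint name below is a hypothetical example:
const requiresOptionalDependency = [
  // ... existing entrypoints ...
  "vectorstores/langco",
];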
```

@@ -146,13 +146,13 @@ As with all contributions, make sure you run `yarn lint` and `yarn format` so th

Below are links to guides with advice and tips for specific types of integrations:

-- [LLM providers](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/integrations/LLMS.md) (e.g. OpenAI's GPT-3)
+- [LLM providers](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/integrations/LLMS.md) (e.g. OpenAI's GPT-3)
- Chat model providers (TODO) (e.g. Anthropic's Claude, OpenAI's GPT-4)
-- [Memory](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/integrations/MEMORY.md) (used to give an LLM or chat model context of past conversations, e.g. Motörhead)
-- [Vector stores](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/integrations/VECTOR_STORES.md) (e.g. Pinecone)
-- [Persistent message stores](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/integrations/MESSAGE_STORES.md) (used to persistently store and load raw chat histories, e.g. Redis)
-- [Document loaders](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/integrations/DOCUMENT_LOADERS.md) (used to load documents for later storage into vector stores, e.g. Apify)
+- [Memory](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/integrations/MEMORY.md) (used to give an LLM or chat model context of past conversations, e.g. Motörhead)
+- [Vector stores](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/integrations/VECTOR_STORES.md) (e.g. Pinecone)
+- [Persistent message stores](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/integrations/MESSAGE_STORES.md) (used to persistently store and load raw chat histories, e.g. Redis)
+- [Document loaders](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/integrations/DOCUMENT_LOADERS.md) (used to load documents for later storage into vector stores, e.g. Apify)
- Embeddings (TODO) (e.g. Cohere)
-- [Tools](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/integrations/TOOLS.md) (used for agents, e.g. the SERP API tool)
+- [Tools](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/integrations/TOOLS.md) (used for agents, e.g. the SERP API tool)

This is a living document, so please make a pull request if we're missing anything useful!
4 changes: 2 additions & 2 deletions .github/contributing/integrations/DOCUMENT_LOADERS.md
@@ -4,8 +4,8 @@ This page contains some specific guidelines and examples for contributing integr

Document loaders are classes that pull in text from a given source and load them into chunks called **documents** for later use in queryable vector stores. Some example sources include PDFs, websites, and Notion docs.
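
As a loose sketch of that shape (the class name, import paths, and URL-based source below are illustrative assumptions, not an existing integration):

```typescript
import { BaseDocumentLoader } from "langchain/document_loaders/base";
import { Document } from "langchain/document";

// Hypothetical loader that pulls plain text from a URL and wraps it in Documents.
export class LangCoLoader extends BaseDocumentLoader {
  constructor(private url: string) {
    super();
  }

  async load(): Promise<Document[]> {
    const text = await fetch(this.url).then((res) => res.text());
    return [new Document({ pageContent: text, metadata: { source: this.url } })];
  }
}
```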

-**Make sure you read the [general guidelines page](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**
+**Make sure you read the [general guidelines page](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**

## Example PR

-You can take a look at this PR adding Apify Datasets as an example when creating your own document loader integrations: https://github.com/hwchase17/langchainjs/pull/1271
+You can take a look at this PR adding Apify Datasets as an example when creating your own document loader integrations: https://github.com/langchain-ai/langchainjs/pull/1271
8 changes: 4 additions & 4 deletions .github/contributing/integrations/LLMS.md
@@ -2,21 +2,21 @@

This page contains some specific guidelines and examples for contributing integrations with third-party LLM providers.

-**Make sure you read the [general guidelines page](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**
+**Make sure you read the [general guidelines page](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**


## Example PR

-We'll be referencing this PR adding Amazon SageMaker endpoints as an example: https://github.com/hwchase17/langchainjs/pull/1267
+We'll be referencing this PR adding Amazon SageMaker endpoints as an example: https://github.com/langchain-ai/langchainjs/pull/1267

## General ideas

The general idea for adding new third-party LLMs is to subclass the `LLM` class and implement the `_call` method. As the name suggests, this method should call the LLM with the given prompt and transform the LLM response into some generated string output.
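
For example, a minimal sketch of such a subclass might look like the following; the class name, constructor fields, and HTTP endpoint are hypothetical placeholders rather than a real provider integration:

```typescript
import { LLM, type BaseLLMParams } from "langchain/llms/base";

export interface LangCoLLMInput extends BaseLLMParams {
  endpointUrl: string;
}

// Hypothetical provider wrapper: transforms a prompt into the provider's
// request format and the provider's response back into a plain string.
export class LangCoLLM extends LLM {
  endpointUrl: string;

  constructor(fields: LangCoLLMInput) {
    super(fields);
    this.endpointUrl = fields.endpointUrl;
  }

  _llmType(): string {
    return "langco";
  }

  async _call(prompt: string): Promise<string> {
    const response = await fetch(this.endpointUrl, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ prompt }),
    });
    const json = await response.json();
    return json.generated_text;
  }
}
```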

-The example PR for Amazon SageMaker is an interesting example of this because SageMaker endpoints can host a wide variety of models with non-standard input and output formats. Therefore, the contributor added a [simple abstract class](https://github.com/hwchase17/langchainjs/pull/1267/files#diff-4496012d30c03b969546b14039f8deee1b5ba9152a86222100d76c4da77f060cR35) that a user can implement depending on which specific model they are hosting that transforms input from LangChain into a format expected by the model and output into a plain string.
+The example PR for Amazon SageMaker is an interesting example of this because SageMaker endpoints can host a wide variety of models with non-standard input and output formats. Therefore, the contributor added a [simple abstract class](https://github.com/langchain-ai/langchainjs/pull/1267/files#diff-4496012d30c03b969546b14039f8deee1b5ba9152a86222100d76c4da77f060cR35) that a user can implement depending on which specific model they are hosting that transforms input from LangChain into a format expected by the model and output into a plain string.

Other third-party providers like OpenAI and Anthropic will have a defined input and output format, and in those cases, the input and output transformations should happen within the `_call` method.

## Wrap LLM requests in this.caller

-The base LLM class contains an instance property called `caller` that will automatically handle retries, errors, timeouts, and more. You should wrap calls to the LLM in `this.caller.call` [as shown here](https://github.com/hwchase17/langchainjs/pull/1267/files#diff-4496012d30c03b969546b14039f8deee1b5ba9152a86222100d76c4da77f060cR148)
+The base LLM class contains an instance property called `caller` that will automatically handle retries, errors, timeouts, and more. You should wrap calls to the LLM in `this.caller.call` [as shown here](https://github.com/langchain-ai/langchainjs/pull/1267/files#diff-4496012d30c03b969546b14039f8deee1b5ba9152a86222100d76c4da77f060cR148)
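
Continuing the hypothetical sketch above, its `_call` method could route the request through the caller roughly like this (the endpoint and response shape remain placeholders):

```typescript
async _call(prompt: string): Promise<string> {
  // this.caller.call(...) retries the wrapped function on transient failures
  // and applies the configured timeout and concurrency settings.
  const response = await this.caller.call(() =>
    fetch(this.endpointUrl, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ prompt }),
    })
  );
  const json = await response.json();
  return json.generated_text;
}
```
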
6 changes: 3 additions & 3 deletions .github/contributing/integrations/MEMORY.md
@@ -2,17 +2,17 @@

This page contains some specific guidelines and examples for contributing integrations with third-party memory providers.

-In LangChain, memory differs from [message stores](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/integrations/MESSAGE_STORES.md) in that memory does not actually handle persistently storing messages, but acts as a representation of the LLM or chat model's awareness of past conversations, while message stores handle the actual message data persistence. For example, memory may perform other transformations on the messages, like summarization, or may emphasize specific pieces of pertinent information. Memory may rely on message stores as a backing class.
+In LangChain, memory differs from [message stores](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/integrations/MESSAGE_STORES.md) in that memory does not actually handle persistently storing messages, but acts as a representation of the LLM or chat model's awareness of past conversations, while message stores handle the actual message data persistence. For example, memory may perform other transformations on the messages, like summarization, or may emphasize specific pieces of pertinent information. Memory may rely on message stores as a backing class.

Another key difference is that message stores are only used with chat models.

Before getting started, think about whether your planned integration would be more suited as a message store or as memory!

-**Make sure you read the [general guidelines page](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**
+**Make sure you read the [general guidelines page](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**

## Example PR

-You can use this PR adding Motorhead memory as a reference: https://github.com/hwchase17/langchainjs/pull/598
+You can use this PR adding Motorhead memory as a reference: https://github.com/langchain-ai/langchainjs/pull/598

## General ideas

8 changes: 4 additions & 4 deletions .github/contributing/integrations/MESSAGE_STORES.md
@@ -2,18 +2,18 @@

This page contains some specific guidelines and examples for contributing integrations with third-party message stores.

-In LangChain, message stores differ from [memory](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/integrations/MEMORY.md) in that they simply serialize and persistently store chat messages, while memory, despite its name, does not actually handle persistently storing messages, but acts as a representation of the LLM or chat model's awareness of past conversations. For example, memory may perform other transformations on the messages, like summarization, or may emphasize specific pieces of pertinent information. Memory may rely on message stores as a backing class.
+In LangChain, message stores differ from [memory](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/integrations/MEMORY.md) in that they simply serialize and persistently store chat messages, while memory, despite its name, does not actually handle persistently storing messages, but acts as a representation of the LLM or chat model's awareness of past conversations. For example, memory may perform other transformations on the messages, like summarization, or may emphasize specific pieces of pertinent information. Memory may rely on message stores as a backing class.

Another key difference is that message stores are only used with chat models.

Before getting started, think about whether your planned integration would be more suited as a message store or as memory!

-**Make sure you read the [general guidelines page](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**
+**Make sure you read the [general guidelines page](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**

## Example PR

-We'll be referencing this PR adding a Redis-backed message store as an example: https://github.com/hwchase17/langchainjs/pull/951
+We'll be referencing this PR adding a Redis-backed message store as an example: https://github.com/langchain-ai/langchainjs/pull/951

## Serializing and deserializing chat messages

-LangChain messages implement a `BaseMessage` class that contains information like the message's content and role of the speaker. In order to provide a standard way to map these messages to a storable JSON format, you should use the utility `mapChatMessagesToStoredMessages` and `mapStoredMessagesToChatMessages` functions as [shown here](https://github.com/hwchase17/langchainjs/pull/951/files#diff-4c638d231a5e5bb29a149c6fb7d8f4b24aaf1b6fcc2cc2a728346eaebb6c9c47R17).
+LangChain messages implement a `BaseMessage` class that contains information like the message's content and role of the speaker. In order to provide a standard way to map these messages to a storable JSON format, you should use the utility `mapChatMessagesToStoredMessages` and `mapStoredMessagesToChatMessages` functions as [shown here](https://github.com/langchain-ai/langchainjs/pull/951/files#diff-4c638d231a5e5bb29a149c6fb7d8f4b24aaf1b6fcc2cc2a728346eaebb6c9c47R17).
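
As a hedged sketch of that round trip (the import path and exact storage format are assumptions that may vary between versions; see the linked PR for the canonical pattern):

```typescript
import {
  type BaseMessage,
  type StoredMessage,
  mapChatMessagesToStoredMessages,
  mapStoredMessagesToChatMessages,
} from "langchain/schema";

// Serialize chat messages into plain JSON before writing them to the store.
export function serializeMessages(messages: BaseMessage[]): string {
  const stored: StoredMessage[] = mapChatMessagesToStoredMessages(messages);
  return JSON.stringify(stored);
}

// Rehydrate chat messages after reading raw JSON back out of the store.
export function deserializeMessages(raw: string): BaseMessage[] {
  const stored: StoredMessage[] = JSON.parse(raw);
  return mapStoredMessagesToChatMessages(stored);
}
```
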
4 changes: 2 additions & 2 deletions .github/contributing/integrations/TOOLS.md
@@ -2,11 +2,11 @@

This page contains some specific guidelines and examples for contributing integrations with third-party APIs within tools.

-**Make sure you read the [general guidelines page](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**
+**Make sure you read the [general guidelines page](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**

## Example PR

-You can use this PR adding an AWSLambda tool as a reference when creating your own tools (minus the dynamic import!): https://github.com/hwchase17/langchainjs/pull/727
+You can use this PR adding an AWSLambda tool as a reference when creating your own tools (minus the dynamic import!): https://github.com/langchain-ai/langchainjs/pull/727

## Guidelines

4 changes: 2 additions & 2 deletions .github/contributing/integrations/VECTOR_STORES.md
@@ -2,8 +2,8 @@

This page contains some specific guidelines and examples for contributing integrations with third-party vector store providers.

-**Make sure you read the [general guidelines page](https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**
+**Make sure you read the [general guidelines page](https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md) first!**

## Example PR

-You can use this PR adding Faiss as a reference when creating your own vector store integration: https://github.com/hwchase17/langchainjs/pull/685
+You can use this PR adding Faiss as a reference when creating your own vector store integration: https://github.com/langchain-ai/langchainjs/pull/685
6 changes: 3 additions & 3 deletions .github/pull_request_template.md
@@ -2,10 +2,10 @@
Thank you for contributing to LangChainJS! Your PR will appear in our next release under the title you set above. Please make sure it highlights your valuable contribution.
To help streamline the review process, please make sure you read our contribution guidelines:
-https://github.com/hwchase17/langchainjs/blob/main/CONTRIBUTING.md
+https://github.com/langchain-ai/langchainjs/blob/main/CONTRIBUTING.md
If you are adding an integration (e.g. a new LLM, vector store, or memory), please also read our additional guidelines for integrations:
-https://github.com/hwchase17/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md
+https://github.com/langchain-ai/langchainjs/blob/main/.github/contributing/INTEGRATIONS.md
Replace this block with a description of the change, the issue it fixes (if applicable), and relevant context.
@@ -14,4 +14,4 @@ Finally, we'd love to show appreciation for your contribution - if you'd like us

<!-- Remove if not applicable -->

-Fixes # (issue)
+Fixes # (issue)