scripts[major]: Cleanup CLI integration template scripts #6357

Merged · 8 commits · Aug 3, 2024
302 changes: 302 additions & 0 deletions docs/core_docs/docs/integrations/text_embedding/togetherai.ipynb
@@ -0,0 +1,302 @@
{
"cells": [
{
"cell_type": "raw",
"id": "afaf8039",
"metadata": {
"vscode": {
"languageId": "raw"
}
},
"source": [
"---\n",
"sidebar_label: TogetherAI\n",
"---"
]
},
{
"cell_type": "markdown",
"id": "9a3d6f34",
"metadata": {},
"source": [
"# TogetherAIEmbeddings\n",
"\n",
"This will help you get started with TogetherAIEmbeddings [embedding models](/docs/concepts#embedding-models) using LangChain. For detailed documentation on `TogetherAIEmbeddings` features and configuration options, please refer to the [API reference](https://api.js.langchain.com/classes/langchain_community_embeddings_togetherai.TogetherAIEmbeddings.html).\n",
"\n",
"## Overview\n",
"### Integration details\n",
"\n",
"| Class | Package | Local | [Py support](https://python.langchain.com/docs/integrations/text_embedding/together/) | Package downloads | Package latest |\n",
"| :--- | :--- | :---: | :---: | :---: | :---: |\n",
"| [TogetherAIEmbeddings](https://api.js.langchain.com/classes/langchain_community_embeddings_togetherai.TogetherAIEmbeddings.html) | [@langchain/community](https://api.js.langchain.com/modules/langchain_community_embeddings_togetherai.html) | ❌ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/community?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/community?style=flat-square&label=%20&) |\n",
"\n",
"## Setup\n",
"\n",
"To access TogetherAI embedding models you'll need to create a TogetherAI account, get an API key, and install the `@langchain/community` integration package.\n",
"\n",
"### Credentials\n",
"\n",
"You can sign up for a Together account and create an API key [here](https://api.together.xyz/). Once you've done this set the `TOGETHER_AI_API_KEY` environment variable:\n",
"\n",
"```bash\n",
"export TOGETHER_AI_API_KEY=\"your-api-key\"\n",
"```\n",
"\n",
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:\n",
"\n",
"```bash\n",
"# export LANGCHAIN_TRACING_V2=\"true\"\n",
"# export LANGCHAIN_API_KEY=\"your-api-key\"\n",
"```\n",
"\n",
"### Installation\n",
"\n",
"The LangChain TogetherAIEmbeddings integration lives in the `@langchain/community` package:\n",
"\n",
"```{=mdx}\n",
"import IntegrationInstallTooltip from \"@mdx_components/integration_install_tooltip.mdx\";\n",
"import Npm2Yarn from \"@theme/Npm2Yarn\";\n",
"\n",
"<IntegrationInstallTooltip></IntegrationInstallTooltip>\n",
"\n",
"<Npm2Yarn>\n",
" @langchain/community\n",
"</Npm2Yarn>\n",
"```"
]
},
{
"cell_type": "markdown",
"id": "45dd1724",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"Now we can instantiate our model object and generate chat completions:"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "9ea7a09b",
"metadata": {},
"outputs": [],
"source": [
"import { TogetherAIEmbeddings } from \"@langchain/community/embeddings/togetherai\";\n",
"\n",
"const embeddings = new TogetherAIEmbeddings({\n",
" model: \"togethercomputer/m2-bert-80M-8k-retrieval\", // Default value\n",
"});"
]
},
{
"cell_type": "markdown",
"id": "77d271b6",
"metadata": {},
"source": [
"## Indexing and Retrieval\n",
"\n",
"Embedding models are often used in retrieval-augmented generation (RAG) flows, both as part of indexing data as well as later retrieving it. For more detailed instructions, please see our RAG tutorials under the [working with external knowledge tutorials](/docs/tutorials/#working-with-external-knowledge).\n",
"\n",
"Below, see how to index and retrieve data using the `embeddings` object we initialized above. In this example, we will index and retrieve a sample document using the demo [`MemoryVectorStore`](/docs/integrations/vectorstores/memory)."
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "d817716b",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"LangChain is the framework for building context-aware reasoning applications\n"
]
}
],
"source": [
"// Create a vector store with a sample text\n",
"import { MemoryVectorStore } from \"langchain/vectorstores/memory\";\n",
"\n",
"const text = \"LangChain is the framework for building context-aware reasoning applications\";\n",
"\n",
"const vectorstore = await MemoryVectorStore.fromDocuments(\n",
" [{ pageContent: text, metadata: {} }],\n",
" embeddings,\n",
");\n",
"\n",
"// Use the vector store as a retriever that returns a single document\n",
"const retriever = vectorstore.asRetriever(1);\n",
"\n",
"// Retrieve the most similar text\n",
"const retrievedDocuments = await retriever.invoke(\"What is LangChain?\");\n",
"\n",
"retrievedDocuments[0].pageContent;"
]
},
{
"cell_type": "markdown",
"id": "e02b9855",
"metadata": {},
"source": [
"## Direct Usage\n",
"\n",
"Under the hood, the vectorstore and retriever implementations are calling `embeddings.embedDocument(...)` and `embeddings.embedQuery(...)` to create embeddings for the text(s) used in `fromDocuments` and the retriever's `invoke` operations, respectively.\n",
"\n",
"You can directly call these methods to get embeddings for your own use cases.\n",
"\n",
"### Embed single texts\n",
"\n",
"You can embed queries for search with `embedQuery`. This generates a vector representation specific to the query:"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "0d2befcd",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[\n",
" 0.3812227, -0.052848946, -0.10564975, 0.03480297, 0.2878488,\n",
" 0.0084609175, 0.11605915, 0.05303011, 0.14711718, -0.14407106,\n",
" -0.29865336, -0.15807179, -0.068397366, -0.2708063, 0.056596708,\n",
" -0.07656515, 0.052995138, -0.11275427, 0.028096694, 0.123501234,\n",
" -0.039519835, 0.12148692, -0.12820457, 0.15691335, 0.033519063,\n",
" -0.27026987, -0.08460162, -0.23792154, -0.234982, -0.05786798,\n",
" 0.016467346, -0.17168592, -0.060787182, 0.038752213, -0.08169927,\n",
" 0.09327062, 0.29490772, 0.0167866, -0.32224452, -0.2037822,\n",
" -0.10284172, -0.124050565, 0.25344968, -0.06275548, -0.14180769,\n",
" 0.0046709594, 0.073105976, 0.12004031, 0.19224276, -0.022589967,\n",
" 0.102790825, 0.1138286, -0.057701062, -0.050010648, -0.1632584,\n",
" -0.18942119, -0.12018798, 0.15288158, 0.07941474, 0.10440051,\n",
" -0.13257962, -0.19282033, 0.044656333, 0.13560675, -0.068929024,\n",
" 0.028590716, 0.055663664, 0.04652713, 0.014936657, 0.120679885,\n",
" 0.053866718, -0.16296014, 0.119450666, -0.29559663, 0.008097747,\n",
" 0.07380408, -0.09010084, -0.0687739, -0.08575685, -0.07202606,\n",
" 0.18868081, -0.08392917, 0.014016109, 0.15435852, -0.030115498,\n",
" -0.16927013, 0.02836557, -0.050763763, 0.0840437, -0.22718845,\n",
" 0.111397505, 0.033395614, -0.123287566, -0.2111604, -0.1580479,\n",
" 0.05520573, -0.1422921, 0.08828953, 0.051058788, -0.13312188\n",
"]\n"
]
}
],
"source": [
"const singleVector = await embeddings.embedQuery(text);\n",
"\n",
"console.log(singleVector.slice(0, 100));"
]
},
{
"cell_type": "markdown",
"id": "1b5a7d03",
"metadata": {},
"source": [
"### Embed multiple texts\n",
"\n",
"You can embed multiple texts for indexing with `embedDocuments`. The internals used for this method may (but do not have to) differ from embedding queries:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "2f4d6e97",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[\n",
" 0.3812227, -0.052848946, -0.10564975, 0.03480297, 0.2878488,\n",
" 0.0084609175, 0.11605915, 0.05303011, 0.14711718, -0.14407106,\n",
" -0.29865336, -0.15807179, -0.068397366, -0.2708063, 0.056596708,\n",
" -0.07656515, 0.052995138, -0.11275427, 0.028096694, 0.123501234,\n",
" -0.039519835, 0.12148692, -0.12820457, 0.15691335, 0.033519063,\n",
" -0.27026987, -0.08460162, -0.23792154, -0.234982, -0.05786798,\n",
" 0.016467346, -0.17168592, -0.060787182, 0.038752213, -0.08169927,\n",
" 0.09327062, 0.29490772, 0.0167866, -0.32224452, -0.2037822,\n",
" -0.10284172, -0.124050565, 0.25344968, -0.06275548, -0.14180769,\n",
" 0.0046709594, 0.073105976, 0.12004031, 0.19224276, -0.022589967,\n",
" 0.102790825, 0.1138286, -0.057701062, -0.050010648, -0.1632584,\n",
" -0.18942119, -0.12018798, 0.15288158, 0.07941474, 0.10440051,\n",
" -0.13257962, -0.19282033, 0.044656333, 0.13560675, -0.068929024,\n",
" 0.028590716, 0.055663664, 0.04652713, 0.014936657, 0.120679885,\n",
" 0.053866718, -0.16296014, 0.119450666, -0.29559663, 0.008097747,\n",
" 0.07380408, -0.09010084, -0.0687739, -0.08575685, -0.07202606,\n",
" 0.18868081, -0.08392917, 0.014016109, 0.15435852, -0.030115498,\n",
" -0.16927013, 0.02836557, -0.050763763, 0.0840437, -0.22718845,\n",
" 0.111397505, 0.033395614, -0.123287566, -0.2111604, -0.1580479,\n",
" 0.05520573, -0.1422921, 0.08828953, 0.051058788, -0.13312188\n",
"]\n",
"[\n",
" 0.066308185, -0.032866564, 0.115751594, 0.19082588, 0.14017,\n",
" -0.26976448, -0.056340694, -0.26923394, 0.2548541, -0.27271318,\n",
" -0.2244126, 0.07949589, -0.27710953, -0.17993368, 0.09681616,\n",
" -0.08692256, 0.22127126, -0.14512022, -0.18016525, 0.14892976,\n",
" -0.0526347, -0.008140617, -0.2916987, 0.23706906, -0.38488507,\n",
" -0.35881752, 0.09276949, -0.07051063, -0.07778231, 0.12552947,\n",
" 0.06256748, -0.25832427, 0.025054429, -0.1451448, -0.2662871,\n",
" 0.13676351, -0.07413256, 0.14966589, -0.39968985, 0.15542287,\n",
" -0.13107607, 0.02761394, 0.108077586, -0.12076956, 0.128296,\n",
" -0.05625126, 0.15723586, -0.056932643, 0.23720805, 0.23993455,\n",
" -0.035553705, -0.053907514, -0.11852807, 0.07005695, -0.06317475,\n",
" 0.070009425, 0.284697, 0.2212059, 0.018890115, 0.16924675,\n",
" 0.21651487, 0.07259682, 0.1328156, 0.3261852, 0.1914124,\n",
" -0.10120423, 0.03450111, -0.22588971, -0.04458192, 0.24116798,\n",
" -0.021830376, -0.30731413, 0.08586451, -0.058835756, 0.0010347435,\n",
" 0.0031927782, -0.09403646, -0.22608931, 0.15865424, 0.15738021,\n",
" 0.23582733, 0.1714161, 0.1585189, -0.18085755, 0.019376995,\n",
" -0.026587496, -0.017079154, -0.04588549, -0.047336094, -0.082413346,\n",
" -0.1114185, -0.05403556, 0.12438637, -0.20476522, 0.073182,\n",
" -0.12210378, -0.010543863, -0.09767598, 0.1057683, -0.050204434\n",
"]\n"
]
}
],
"source": [
"const text2 = \"LangGraph is a library for building stateful, multi-actor applications with LLMs\";\n",
"\n",
"const vectors = await embeddings.embedDocuments([text, text2]);\n",
"\n",
"console.log(vectors[0].slice(0, 100));\n",
"console.log(vectors[1].slice(0, 100));"
]
},
{
"cell_type": "markdown",
"id": "8938e581",
"metadata": {},
"source": [
"## API reference\n",
"\n",
"For detailed documentation of all TogetherAIEmbeddings features and configurations head to the API reference: https://api.js.langchain.com/classes/langchain_community_embeddings_togetherai.TogetherAIEmbeddings.html"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "TypeScript",
"language": "typescript",
"name": "tslab"
},
"language_info": {
"codemirror_mode": {
"mode": "typescript",
"name": "javascript",
"typescript": true
},
"file_extension": ".ts",
"mimetype": "text/typescript",
"name": "typescript",
"version": "3.7.2"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
28 changes: 0 additions & 28 deletions docs/core_docs/docs/integrations/text_embedding/togetherai.mdx

This file was deleted.

39 changes: 39 additions & 0 deletions libs/langchain-scripts/src/cli/README.md
@@ -0,0 +1,39 @@
# Integration doc CLI

Generate integration documentation using the CLI.

## Supported integration types

- Chat models
- LLMs
- Text embeddings
- Retrievers
- Document loaders

## Usage

1. Build the CLI:

```bash
yarn build --filter=@langchain/scripts
```

2. Run the CLI:

```bash
yarn create:integration:doc --classname <Class Name> --type <Type>
```

The `--classname` flag should be passed the full class name of the integration, e.g. `ChatOpenAI` or `RecursiveUrlLoader`.

The `--type` flag should be passed the type of the integration. It must be one of the following (an example invocation follows the list):

- `chat`
- `llm`
- `embeddings`
- `retriever`
- `doc_loader`
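
For example, scaffolding the TogetherAI embeddings doc added in this PR would look something like this (class name and type are shown for illustration):

```bash
yarn create:integration:doc --classname TogetherAIEmbeddings --type embeddings
```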

After invoking the script, you'll be prompted to fill out more integration-specific information.

Finally, the script will log the path of the newly created documentation notebook. Open this notebook, run all the cells, resolve and remove any TODOs, and verify all links work as expected.
13 changes: 13 additions & 0 deletions libs/langchain-scripts/src/cli/constants.ts
@@ -0,0 +1,13 @@
export const SIDEBAR_LABEL_PLACEHOLDER = "__sidebar_label__";
export const MODULE_NAME_PLACEHOLDER = "__module_name__";
export const PACKAGE_NAME_PLACEHOLDER = "__package_name__";
export const FULL_IMPORT_PATH_PLACEHOLDER = "__full_import_path__";
export const ENV_VAR_NAME_PLACEHOLDER = "__env_var_name__";

export const API_REF_MODULE_PLACEHOLDER = "__api_ref_module__";
export const API_REF_PACKAGE_PLACEHOLDER = "__api_ref_package__";
export const PYTHON_DOC_URL_PLACEHOLDER = "__python_doc_url__";

export const SERIALIZABLE_PLACEHOLDER = "__serializable__";
export const LOCAL_PLACEHOLDER = "__local__";
export const PY_SUPPORT_PLACEHOLDER = "__py_support__";
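
These placeholder strings are presumably swapped out for real values when the CLI fills in a documentation template. A minimal sketch of how such a substitution step might look (the `fillTemplate` helper and the inline template below are illustrative, not part of this PR):

```typescript
import {
  SIDEBAR_LABEL_PLACEHOLDER,
  MODULE_NAME_PLACEHOLDER,
} from "./constants.js"; // import path may differ depending on module settings

// Replace every occurrence of each placeholder key with its value.
// Sketch only; the actual CLI may implement this differently.
function fillTemplate(
  template: string,
  values: Record<string, string>
): string {
  return Object.entries(values).reduce(
    (doc, [placeholder, value]) => doc.split(placeholder).join(value),
    template
  );
}

// Illustrative usage with a tiny inline template:
const template = `---
sidebar_label: ${SIDEBAR_LABEL_PLACEHOLDER}
---
# ${MODULE_NAME_PLACEHOLDER}`;

console.log(
  fillTemplate(template, {
    [SIDEBAR_LABEL_PLACEHOLDER]: "TogetherAI",
    [MODULE_NAME_PLACEHOLDER]: "TogetherAIEmbeddings",
  })
);
```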