diff --git a/docs/core_docs/docs/integrations/text_embedding/fireworks.ipynb b/docs/core_docs/docs/integrations/text_embedding/fireworks.ipynb
new file mode 100644
index 000000000000..f49ea3048613
--- /dev/null
+++ b/docs/core_docs/docs/integrations/text_embedding/fireworks.ipynb
@@ -0,0 +1,222 @@
+{
+ "cells": [
+  {
+   "cell_type": "raw",
+   "id": "afaf8039",
+   "metadata": {
+    "vscode": {
+     "languageId": "raw"
+    }
+   },
+   "source": [
+    "---\n",
+    "sidebar_label: Fireworks\n",
+    "---"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "9a3d6f34",
+   "metadata": {},
+   "source": [
+    "# FireworksEmbeddings\n",
+    "\n",
+    "This will help you get started with FireworksEmbeddings [embedding models](/docs/concepts#embedding-models) using LangChain. For detailed documentation on `FireworksEmbeddings` features and configuration options, please refer to the [API reference](https://api.js.langchain.com/classes/langchain_community_embeddings_fireworks.FireworksEmbeddings.html).\n",
+    "\n",
+    "## Overview\n",
+    "### Integration details\n",
+    "\n",
+    "| Class | Package | Local | [Py support](https://python.langchain.com/docs/integrations/text_embedding/fireworks/) | Package downloads | Package latest |\n",
+    "| :--- | :--- | :---: | :---: | :---: | :---: |\n",
+    "| [FireworksEmbeddings](https://api.js.langchain.com/classes/langchain_community_embeddings_fireworks.FireworksEmbeddings.html) | [@langchain/community](https://api.js.langchain.com/modules/langchain_community_embeddings_fireworks.html) | ❌ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/community?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/community?style=flat-square&label=%20&) |\n",
+    "\n",
+    "## Setup\n",
+    "\n",
+    "To access Fireworks embedding models you'll need to create a Fireworks account, get an API key, and install the `@langchain/community` integration package.\n",
+    "\n",
+    "### Credentials\n",
+    "\n",
+    "Head to [fireworks.ai](https://fireworks.ai/) to sign up to `Fireworks` and generate an API key. Once you've done this, set the `FIREWORKS_API_KEY` environment variable:\n",
+    "\n",
+    "```bash\n",
+    "export FIREWORKS_API_KEY=\"your-api-key\"\n",
+    "```\n",
+    "\n",
+    "If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:\n",
+    "\n",
+    "```bash\n",
+    "# export LANGCHAIN_TRACING_V2=\"true\"\n",
+    "# export LANGCHAIN_API_KEY=\"your-api-key\"\n",
+    "```\n",
+    "\n",
+    "### Installation\n",
+    "\n",
+    "The LangChain `FireworksEmbeddings` integration lives in the `@langchain/community` package:\n",
+    "\n",
+    "```{=mdx}\n",
+    "import IntegrationInstallTooltip from \"@mdx_components/integration_install_tooltip.mdx\";\n",
+    "import Npm2Yarn from \"@theme/Npm2Yarn\";\n",
+    "\n",
+    "<IntegrationInstallTooltip></IntegrationInstallTooltip>\n",
+    "\n",
+    "<Npm2Yarn>\n",
+    "  @langchain/community\n",
+    "</Npm2Yarn>\n",
+    "```"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "45dd1724",
+   "metadata": {},
+   "source": [
+    "## Instantiation\n",
+    "\n",
+    "Now we can instantiate our model object and generate embeddings:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "9ea7a09b",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import { FireworksEmbeddings } from \"@langchain/community/embeddings/fireworks\";\n",
+    "\n",
+    "const embeddings = new FireworksEmbeddings({\n",
+    "  model: \"nomic-ai/nomic-embed-text-v1.5\",\n",
+    "});"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "77d271b6",
+   "metadata": {},
+   "source": [
+    "## Indexing and Retrieval\n",
+    "\n",
+    "Embedding models are often used in retrieval-augmented generation (RAG) flows, both as part of indexing data as well as later retrieving it. For more detailed instructions, please see our RAG tutorials under the [working with external knowledge tutorials](/docs/tutorials/#working-with-external-knowledge).\n",
+    "\n",
+    "Below, see how to index and retrieve data using the `embeddings` object we initialized above. In this example, we will index and retrieve a sample document using the demo [`MemoryVectorStore`](/docs/integrations/vectorstores/memory)."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "d817716b",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "// Create a vector store with a sample text\n",
+    "import { MemoryVectorStore } from \"langchain/vectorstores/memory\";\n",
+    "\n",
+    "const text = \"LangChain is the framework for building context-aware reasoning applications\";\n",
+    "\n",
+    "const vectorstore = await MemoryVectorStore.fromDocuments(\n",
+    "  [{ pageContent: text, metadata: {} }],\n",
+    "  embeddings,\n",
+    ");\n",
+    "\n",
+    "// Use the vector store as a retriever that returns a single document\n",
+    "const retriever = vectorstore.asRetriever(1);\n",
+    "\n",
+    "// Retrieve the most similar text\n",
+    "const retrievedDocuments = await retriever.invoke(\"What is LangChain?\");\n",
+    "\n",
+    "retrievedDocuments[0].pageContent;"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "e02b9855",
+   "metadata": {},
+   "source": [
+    "## Direct Usage\n",
+    "\n",
+    "Under the hood, the vectorstore and retriever implementations are calling `embeddings.embedDocuments(...)` and `embeddings.embedQuery(...)` to create embeddings for the text(s) used in `fromDocuments` and the retriever's `invoke` operations, respectively.\n",
+    "\n",
+    "You can directly call these methods to get embeddings for your own use cases.\n",
+    "\n",
+    "### Embed single texts\n",
+    "\n",
+    "You can embed queries for search with `embedQuery`. This generates a vector representation specific to the query:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "0d2befcd",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "const singleVector = await embeddings.embedQuery(text);\n",
+    "\n",
+    "console.log(singleVector.slice(0, 100));"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "1b5a7d03",
+   "metadata": {},
+   "source": [
+    "### Embed multiple texts\n",
+    "\n",
+    "You can embed multiple texts for indexing with `embedDocuments`. The internals used for this method may (but do not have to) differ from embedding queries:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "2f4d6e97",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "const text2 = \"LangGraph is a library for building stateful, multi-actor applications with LLMs\";\n",
+    "\n",
+    "const vectors = await embeddings.embedDocuments([text, text2]);\n",
+    "\n",
+    "console.log(vectors[0].slice(0, 100));\n",
+    "console.log(vectors[1].slice(0, 100));"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "8938e581",
+   "metadata": {},
+   "source": [
+    "## API reference\n",
+    "\n",
+    "For detailed documentation of all FireworksEmbeddings features and configurations, head to the API reference: https://api.js.langchain.com/classes/langchain_community_embeddings_fireworks.FireworksEmbeddings.html"
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "TypeScript",
+   "language": "typescript",
+   "name": "tslab"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "mode": "typescript",
+    "name": "javascript",
+    "typescript": true
+   },
+   "file_extension": ".ts",
+   "mimetype": "text/typescript",
+   "name": "typescript",
+   "version": "3.7.2"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
diff --git a/docs/core_docs/docs/integrations/text_embedding/fireworks.mdx b/docs/core_docs/docs/integrations/text_embedding/fireworks.mdx
deleted file mode 100644
index f1cb9e6f55dc..000000000000
--- a/docs/core_docs/docs/integrations/text_embedding/fireworks.mdx
+++ /dev/null
@@ -1,29 +0,0 @@
----
-sidebar_label: Fireworks
----
-
-import CodeBlock from "@theme/CodeBlock";
-
-# Fireworks
-
-The `FireworksEmbeddings` class allows you to use the Fireworks AI API to generate embeddings.
-
-## Setup
-
-First, sign up for a [Fireworks API key](https://fireworks.ai/) and set it as an environment variable called `FIREWORKS_API_KEY`.
-
-Next, install the `@langchain/community` package as shown below:
-
-import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
-
-<IntegrationInstallTooltip></IntegrationInstallTooltip>
-
-```bash npm2yarn
-npm install @langchain/community
-```
-
-## Usage
-
-import FireworksExample from "@examples/models/embeddings/fireworks.ts";
-
-<CodeBlock language="typescript">{FireworksExample}</CodeBlock>
diff --git a/libs/langchain-community/src/embeddings/fireworks.ts b/libs/langchain-community/src/embeddings/fireworks.ts
index 8a6d77c4a669..05d928f10b3b 100644
--- a/libs/langchain-community/src/embeddings/fireworks.ts
+++ b/libs/langchain-community/src/embeddings/fireworks.ts
@@ -7,7 +7,11 @@ import { chunkArray } from "@langchain/core/utils/chunk_array";
  * parameters specific to the FireworksEmbeddings class.
  */
 export interface FireworksEmbeddingsParams extends EmbeddingsParams {
+  /**
+   * @deprecated Use `model` instead.
+   */
   modelName: string;
+  model: string;
 
   /**
    * The maximum number of documents to embed in a single request. This is
@@ -41,8 +45,13 @@ export class FireworksEmbeddings
   extends Embeddings
   implements FireworksEmbeddingsParams
 {
+  /**
+   * @deprecated Use `model` instead.
+   */
   modelName = "nomic-ai/nomic-embed-text-v1.5";
 
+  model = "nomic-ai/nomic-embed-text-v1.5";
+
   batchSize = 8;
 
   private apiKey: string;
@@ -74,7 +83,8 @@ export class FireworksEmbeddings
       throw new Error("Fireworks AI API key not found");
     }
 
-    this.modelName = fieldsWithDefaults?.modelName ?? this.modelName;
+    this.model = fieldsWithDefaults?.model ?? this.model;
+    this.modelName = this.model;
     this.batchSize = fieldsWithDefaults?.batchSize ?? this.batchSize;
     this.apiKey = apiKey;
     this.apiUrl = `${this.basePath}/embeddings`;
@@ -90,7 +100,7 @@ export class FireworksEmbeddings
 
     const batchRequests = batches.map((batch) =>
       this.embeddingWithRetry({
-        model: this.modelName,
+        model: this.model,
         input: batch,
       })
     );
@@ -117,7 +127,7 @@ export class FireworksEmbeddings
    */
   async embedQuery(text: string): Promise<number[]> {
     const { data } = await this.embeddingWithRetry({
-      model: this.modelName,
+      model: this.model,
       input: text,
     });
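
Taken together, the changes above mean `FireworksEmbeddings` is now meant to be configured through the `model` option, with the deprecated `modelName` field kept only as a mirror of `model`. Below is a minimal usage sketch based on the constructor and methods visible in this diff; it assumes `FIREWORKS_API_KEY` is set in the environment and that top-level `await` is available (as in the docs notebook), and the logged values are illustrative only.

```typescript
import { FireworksEmbeddings } from "@langchain/community/embeddings/fireworks";

// Prefer the new `model` option; the constructor mirrors it into the
// deprecated `modelName` field (`this.modelName = this.model`).
const embeddings = new FireworksEmbeddings({
  model: "nomic-ai/nomic-embed-text-v1.5",
});

// Embed a single query string.
const queryVector = await embeddings.embedQuery("What is LangChain?");
console.log(queryVector.length);

// Embed several documents at once; inputs are chunked into batches of
// `batchSize` (8 by default) before being sent to the Fireworks embeddings endpoint.
const documentVectors = await embeddings.embedDocuments([
  "LangChain is the framework for building context-aware reasoning applications",
  "LangGraph is a library for building stateful, multi-actor applications with LLMs",
]);
console.log(documentVectors.length);
```

Note that in the constructor hunk shown here only `fields.model` is read, so callers still passing `modelName` may not see their value applied; if backwards compatibility for `modelName` is intended, the constructor would presumably also need to fall back to `fieldsWithDefaults?.modelName`.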