fix api ref links and codeblock link gen. also added api to header external link
bracesproul committed Nov 6, 2023
1 parent 6ff3f06 commit a382a80
Showing 10 changed files with 36 additions and 31 deletions.
38 changes: 26 additions & 12 deletions docs/code-block-loader.js
@@ -13,6 +13,9 @@ const fs = require("fs");
*/
async function webpackLoader(content, map, meta) {
const cb = this.async();
+ const BASE_URL = "https://api.js.langchain.com";
+ // Directories generated inside the API docs (excluding "modules").
+ const CATEGORIES = ["classes", "functions", "interfaces", "types", "variables"];

if (!this.resourcePath.endsWith(".ts")) {
cb(null, JSON.stringify({ content, imports: [] }), map, meta);
@@ -48,22 +51,33 @@ async function webpackLoader(content, map, meta) {
}
});

+ /**
+  * Somewhat of a hacky solution to finding the exact path of the docs file.
+  * Maps over all categories in the API docs and if the file exists, returns the path.
+  * @param {string} moduleName
+  * @param {string} imported
+  * @returns {string | undefined}
+  */
+ const findExactPath = (moduleName, imported) => {
+   let modulePath;
+   CATEGORIES.forEach((category) => {
+     const docsPath = path.resolve(__dirname, "..", "api-docs", "public", category, `${moduleName}.${imported}.html`);
+     if (fs.existsSync(docsPath)) {
+       modulePath = category + "/" + moduleName + "." + imported + ".html";
+     }
+   })
+   return modulePath;
+ }

  imports.forEach((imp) => {
    const { imported, source } = imp;
-   const moduleName = source.split("/").slice(1).join("_");
-   const docsPath = path.resolve(__dirname, "docs", "api", moduleName);
-   const available = fs.readdirSync(docsPath, { withFileTypes: true });
-   const found = available.find(
-     (dirent) =>
-       dirent.isDirectory() &&
-       fs.existsSync(path.resolve(docsPath, dirent.name, imported + ".md"))
-   );
-   if (found) {
-     imp.docs =
-       "/" + path.join("docs", "api", moduleName, found.name, imported);
+   const moduleName = source.split("/").slice(1).join("_");
+   const exactPath = findExactPath(moduleName, imported);
+   if (exactPath) {
+     imp.docs = BASE_URL + "/" + exactPath;
    } else {
      throw new Error(
-       `Could not find docs for ${source}.${imported} in docs/api/`
+       `Could not find docs for ${source}.${imported} in api-docs/public/`
      );
    }
  });
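The new resolution logic in the loader can be sketched as a pure function, with the filesystem check injected so it runs without the generated API docs on disk. `buildDocsUrl` and `existsFn` are names invented for this sketch, not part of the loader:

```javascript
// Sketch of the loader's link generation, with fs.existsSync replaced by an
// injected `existsFn` so the logic is testable anywhere. The category list
// and URL shape mirror the committed code above.
const BASE_URL = "https://api.js.langchain.com";
const CATEGORIES = ["classes", "functions", "interfaces", "types", "variables"];

function buildDocsUrl(source, imported, existsFn) {
  // "langchain/schema/runnable" -> "schema_runnable"
  const moduleName = source.split("/").slice(1).join("_");
  let modulePath;
  CATEGORIES.forEach((category) => {
    const relativePath = `${category}/${moduleName}.${imported}.html`;
    if (existsFn(relativePath)) {
      modulePath = relativePath;
    }
  });
  return modulePath ? `${BASE_URL}/${modulePath}` : undefined;
}

// Pretend only the "classes" page exists on disk:
const url = buildDocsUrl("langchain/schema/runnable", "Runnable", (p) =>
  p.startsWith("classes/")
);
// url === "https://api.js.langchain.com/classes/schema_runnable.Runnable.html"
```

Because the last matching category wins, a name that exists in several categories resolves to whichever category comes later in the list.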
2 changes: 1 addition & 1 deletion docs/docs/expression_language/interface.mdx
@@ -6,7 +6,7 @@ import CodeBlock from "@theme/CodeBlock";

# Interface

- In an effort to make it as easy as possible to create custom chains, we've implemented a ["Runnable"](/docs/api/schema_runnable/classes/Runnable) protocol that most components implement.
+ In an effort to make it as easy as possible to create custom chains, we've implemented a ["Runnable"](https://api.js.langchain.com/classes/schema_runnable.Runnable.html) protocol that most components implement.
This is a standard interface with a few different methods, which make it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes:

- `stream`: stream back chunks of the response
2 changes: 1 addition & 1 deletion docs/docs/integrations/vectorstores/googlevertexai.mdx
@@ -60,7 +60,7 @@ for Matching Engine:

You will also need a document store. While an `InMemoryDocstore` is ok for
initial testing, you will want to use something like a
- [GoogleCloudStorageDocstore](/docs/api/stores_doc_gcs/classes/GoogleCloudStorageDocstore) to store it more permanently.
+ [GoogleCloudStorageDocstore](https://api.js.langchain.com/classes/stores_doc_gcs.GoogleCloudStorageDocstore.html) to store it more permanently.

```typescript
import { MatchingEngine } from "langchain/vectorstores/googlevertexai";
2 changes: 1 addition & 1 deletion docs/docs/modules/callbacks/how_to/tags.mdx
@@ -1,3 +1,3 @@
# Tags

- You can add tags to your callbacks by passing a `tags` argument to the `call()`/`run()`/`apply()` methods. This is useful for filtering your logs, eg. if you want to log all requests made to a specific LLMChain, you can add a tag, and then filter your logs by that tag. You can pass tags to both constructor and request callbacks, see the examples above for details. These tags are then passed to the `tags` argument of the "start" callback methods, ie. [`handleLLMStart`](/docs/api/callbacks/interfaces/CallbackHandlerMethods#handlellmstart), [`handleChatModelStart`](/docs/api/callbacks/interfaces/CallbackHandlerMethods#handlechatmodelstart), [`handleChainStart`](/docs/api/callbacks/interfaces/CallbackHandlerMethods#handlechainstart), [`handleToolStart`](/docs/api/callbacks/interfaces/CallbackHandlerMethods#handletoolstart).
+ You can add tags to your callbacks by passing a `tags` argument to the `call()`/`run()`/`apply()` methods. This is useful for filtering your logs, eg. if you want to log all requests made to a specific LLMChain, you can add a tag, and then filter your logs by that tag. You can pass tags to both constructor and request callbacks, see the examples above for details. These tags are then passed to the `tags` argument of the "start" callback methods, ie. [`handleLLMStart`](https://api.js.langchain.com/interfaces/callbacks.CallbackHandlerMethods.html#handleLLMStart), [`handleChatModelStart`](https://api.js.langchain.com/interfaces/callbacks.CallbackHandlerMethods.html#handleChatModelStart), [`handleChainStart`](https://api.js.langchain.com/interfaces/callbacks.CallbackHandlerMethods.html#handleChainStart), [`handleToolStart`](https://api.js.langchain.com/interfaces/callbacks.CallbackHandlerMethods.html#handleToolStart).
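The flow the page describes — call-time tags being forwarded to a "start" handler — can be illustrated with a simplified dispatcher. This is not the LangChain implementation; `runLLM` and its shape are invented for the sketch:

```javascript
// Minimal sketch of how tags passed at call time reach a start handler:
// the runner simply forwards the tags it was given to each handler method.
function runLLM(prompt, { callbacks = [], tags = [] } = {}) {
  for (const handler of callbacks) {
    if (typeof handler.handleLLMStart === "function") {
      handler.handleLLMStart({ prompt }, tags);
    }
  }
  return `echo: ${prompt}`;
}

const seen = [];
runLLM("hi", {
  tags: ["my-llm-chain"],
  callbacks: [{ handleLLMStart: (_run, tags) => seen.push(...tags) }],
});
// seen now contains "my-llm-chain", so logs keyed on it can be filtered.
```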
4 changes: 2 additions & 2 deletions docs/docs/modules/callbacks/index.mdx
@@ -8,7 +8,7 @@ LangChain provides a callbacks system that allows you to hook into the various s

import CodeBlock from "@theme/CodeBlock";

- You can subscribe to these events by using the `callbacks` argument available throughout the API. This method accepts a list of handler objects, which are expected to implement [one or more of the methods described in the API docs](/docs/api/callbacks/interfaces/CallbackHandlerMethods).
+ You can subscribe to these events by using the `callbacks` argument available throughout the API. This method accepts a list of handler objects, which are expected to implement [one or more of the methods described in the API docs](https://api.js.langchain.com/interfaces/callbacks.CallbackHandlerMethods.html).

## How to use callbacks

@@ -55,7 +55,7 @@ import ConsoleExample from "@examples/callbacks/console_handler.ts";

### One-off handlers

- You can create a one-off handler inline by passing a plain object to the `callbacks` argument. This object should implement the [`CallbackHandlerMethods`](/docs/api/callbacks/interfaces/CallbackHandlerMethods) interface. This is useful if eg. you need to create a handler that you will use only for a single request, eg to stream the output of an LLM/Agent/etc to a websocket.
+ You can create a one-off handler inline by passing a plain object to the `callbacks` argument. This object should implement the [`CallbackHandlerMethods`](https://api.js.langchain.com/interfaces/callbacks.CallbackHandlerMethods.html) interface. This is useful if eg. you need to create a handler that you will use only for a single request, eg to stream the output of an LLM/Agent/etc to a websocket.

import StreamingExample from "@examples/models/llm/llm_streaming.ts";

4 changes: 2 additions & 2 deletions docs/docs/modules/chains/popular/chat_vector_db_legacy.mdx
@@ -44,7 +44,7 @@ Here's an explanation of each of the attributes of the options object:
- Passing in a separate LLM (`llm`) here allows you to use a cheaper/faster model to create the condensed question while using a more powerful model for the final response, and can reduce unnecessary latency.
- `qaChainOptions`: Options that allow you to customize the specific QA chain used in the final step. The default is the [`StuffDocumentsChain`](/docs/modules/chains/document/stuff), but you can customize which chain is used by passing in a `type` parameter.
**Passing specific options here is completely optional**, but can be useful if you want to customize the way the response is presented to the end user, or if you have too many documents for the default `StuffDocumentsChain`.
- You can see [the API reference of the usable fields here](/docs/api/chains/types/QAChainParams). In case you want to make chat_history available to the final answering `qaChain`, which ultimately answers the user question, you HAVE to pass a custom qaTemplate with chat_history as input, as it is not present in the default Template, which only gets passed `context` documents and generated `question`.
+ You can see [the API reference of the usable fields here](https://api.js.langchain.com/types/chains.QAChainParams.html). In case you want to make chat_history available to the final answering `qaChain`, which ultimately answers the user question, you HAVE to pass a custom qaTemplate with chat_history as input, as it is not present in the default Template, which only gets passed `context` documents and generated `question`.
- `returnSourceDocuments`: A boolean value that indicates whether the `ConversationalRetrievalQAChain` should return the source documents that were used to retrieve the answer. If set to true, the documents will be included in the result returned by the call() method. This can be useful if you want to allow the user to see the sources used to generate the answer. If not set, the default value will be false.
- If you are using this option and passing in a memory instance, set `inputKey` and `outputKey` on the memory instance to the same values as the chain input and final conversational chain output. These default to `"question"` and `"text"` respectively, and specify the values that the memory should store.

@@ -69,7 +69,7 @@ import ConvoQAStreamingExample from "@examples/chains/conversational_qa_streamin
## Externally-Managed Memory

For this chain, if you'd like to format the chat history in a custom way (or pass in chat messages directly for convenience), you can also pass the chat history in explicitly by omitting the `memory` option and supplying
- a `chat_history` string or array of [HumanMessages](/docs/api/schema/classes/HumanMessage) and [AIMessages](/docs/api/schema/classes/AIMessage) directly into the `chain.call` method:
+ a `chat_history` string or array of [HumanMessages](https://api.js.langchain.com/classes/schema.HumanMessage.html) and [AIMessages](https://api.js.langchain.com/classes/schema.AIMessage.html) directly into the `chain.call` method:

import ConvoQAExternalMemoryExample from "@examples/chains/conversational_qa_external_memory_legacy.ts";

2 changes: 1 addition & 1 deletion docs/docs/modules/data_connection/vectorstores/index.mdx
@@ -101,7 +101,7 @@ interface VectorStore {
}
```

- You can create a vector store from a list of [Documents](/docs/api/document/classes/Document), or from a list of texts and their corresponding metadata. You can also create a vector store from an existing index, the signature of this method depends on the vector store you're using, check the documentation of the vector store you're interested in.
+ You can create a vector store from a list of [Documents](https://api.js.langchain.com/classes/document.Document.html), or from a list of texts and their corresponding metadata. You can also create a vector store from an existing index, the signature of this method depends on the vector store you're using, check the documentation of the vector store you're interested in.

```typescript
abstract class BaseVectorStore implements VectorStore {
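The "create a store from a list of documents" idea in this page can be made concrete with a toy store. This is not the langchain API — `ToyVectorStore` and its character-count "embedding" are invented for illustration:

```javascript
// Toy illustration of fromDocuments: embed each document, keep the vector
// alongside the document, and search by cosine similarity.
const embed = (text) =>
  // Hypothetical stand-in embedding: counts of a few characters.
  ["a", "e", "o"].map((ch) => text.split(ch).length - 1);

const dot = (x, y) => x.reduce((s, xi, i) => s + xi * y[i], 0);
const norm = (x) => Math.sqrt(dot(x, x));
const cosine = (x, y) => dot(x, y) / (norm(x) * norm(y) || 1);

class ToyVectorStore {
  constructor() {
    this.entries = [];
  }
  static fromDocuments(docs) {
    const store = new ToyVectorStore();
    for (const doc of docs) {
      store.entries.push({ doc, vector: embed(doc.pageContent) });
    }
    return store;
  }
  similaritySearch(query, k = 1) {
    const qv = embed(query);
    return this.entries
      .map((e) => ({ doc: e.doc, score: cosine(e.vector, qv) }))
      .sort((a, b) => b.score - a.score)
      .slice(0, k)
      .map((r) => r.doc);
  }
}

const store = ToyVectorStore.fromDocuments([
  { pageContent: "banana bread", metadata: { id: 1 } },
  { pageContent: "zzz", metadata: { id: 2 } },
]);
const [best] = store.similaritySearch("banana");
// best is the "banana bread" document.
```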
2 changes: 1 addition & 1 deletion docs/docs/modules/memory/index.mdx
@@ -142,7 +142,7 @@ To implement your own memory class you have two options:

### Subclassing `BaseChatMemory`

- This is the easiest way to implement your own memory class. You can subclass `BaseChatMemory`, which takes care of `saveContext` by saving inputs and outputs as [Chat Messages](/docs/api/schema/classes/BaseMessage), and implement only the `loadMemoryVariables` method. This method is responsible for returning the memory variables that are relevant for the current input values.
+ This is the easiest way to implement your own memory class. You can subclass `BaseChatMemory`, which takes care of `saveContext` by saving inputs and outputs as [Chat Messages](https://api.js.langchain.com/classes/schema.BaseMessage.html), and implement only the `loadMemoryVariables` method. This method is responsible for returning the memory variables that are relevant for the current input values.

```typescript
abstract class BaseChatMemory extends BaseMemory {
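The subclassing pattern this page describes — a base class that persists inputs/outputs as messages while the subclass implements only `loadMemoryVariables` — can be sketched with toy classes. These are not the langchain classes; `ToyBaseChatMemory` and `ToyBufferMemory` are invented for the sketch:

```javascript
// The base class handles saveContext; the subclass only decides how the
// stored messages are exposed as memory variables.
class ToyBaseChatMemory {
  constructor() {
    this.messages = [];
  }
  saveContext(input, output) {
    this.messages.push({ role: "human", text: input });
    this.messages.push({ role: "ai", text: output });
  }
}

class ToyBufferMemory extends ToyBaseChatMemory {
  loadMemoryVariables() {
    // Return the full history as a single formatted string.
    const history = this.messages
      .map((m) => `${m.role}: ${m.text}`)
      .join("\n");
    return { history };
  }
}

const memory = new ToyBufferMemory();
memory.saveContext("hi", "hello!");
const { history } = memory.loadMemoryVariables();
// history === "human: hi\nai: hello!"
```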
2 changes: 1 addition & 1 deletion docs/docusaurus.config.js
@@ -164,7 +164,7 @@ const config = {
label: "Integrations",
},
{
-          href: "/docs/api/",
+          href: "https://api.js.langchain.com",
label: "API",
position: "left",
},
9 changes: 0 additions & 9 deletions docs/sidebars.js
@@ -92,15 +92,6 @@ module.exports = {
value: "<hr>",
defaultStyle: true,
},
-    {
-      type: "category",
-      label: "API reference",
-      link: {
-        type: "doc",
-        id: "api/index",
-      },
-      items: [{ type: "autogenerated", dirName: "api" }],
-    },
],
use_cases: [
{
