Add YandexGPT LLM model (#2906)
* yandex llm

* fix docs
tyumentsev4 authored Oct 13, 2023
1 parent c6f3f18 commit b74b340
Showing 16 changed files with 174 additions and 0 deletions.
21 changes: 21 additions & 0 deletions docs/extras/modules/model_io/models/llms/integrations/yandex.mdx
@@ -0,0 +1,21 @@
# YandexGPT

LangChain.js supports calling [YandexGPT](https://cloud.yandex.com/en/services/yandexgpt) LLMs.

## Setup

First, you should [create a service account](https://cloud.yandex.com/en/docs/iam/operations/sa/create) with the `ai.languageModels.user` role.

Next, you have two authentication options:

- [IAM token](https://cloud.yandex.com/en/docs/iam/operations/iam-token/create-for-sa).
  You can specify the token in the constructor parameter `iamToken` or in the environment variable `YC_IAM_TOKEN`.
- [API key](https://cloud.yandex.com/en/docs/iam/operations/api-key/create).
  You can specify the key in the constructor parameter `apiKey` or in the environment variable `YC_API_KEY`, as shown in the sketch below.
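
For example, here is a minimal sketch of passing either credential explicitly (the placeholder values are assumptions; omit the field to fall back to the environment variable):

```typescript
import { YandexGPT } from "langchain/llms/yandex";

// Authenticate with an API key (placeholder value):
const modelFromApiKey = new YandexGPT({ apiKey: "<your-api-key>" });

// ...or with an IAM token (placeholder value):
const modelFromIamToken = new YandexGPT({ iamToken: "<your-iam-token>" });
```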

## Usage

import CodeBlock from "@theme/CodeBlock";
import YandexGPTExample from "@examples/models/llm/yandex.ts";

<CodeBlock language="typescript">{YandexGPTExample}</CodeBlock>
1 change: 1 addition & 0 deletions environment_tests/test-exports-bun/src/entrypoints.js
@@ -18,6 +18,7 @@ export * from "langchain/llms/ai21";
export * from "langchain/llms/aleph_alpha";
export * from "langchain/llms/ollama";
export * from "langchain/llms/fireworks";
export * from "langchain/llms/yandex";
export * from "langchain/prompts";
export * from "langchain/vectorstores/base";
export * from "langchain/vectorstores/memory";
1 change: 1 addition & 0 deletions environment_tests/test-exports-cf/src/entrypoints.js
@@ -18,6 +18,7 @@ export * from "langchain/llms/ai21";
export * from "langchain/llms/aleph_alpha";
export * from "langchain/llms/ollama";
export * from "langchain/llms/fireworks";
export * from "langchain/llms/yandex";
export * from "langchain/prompts";
export * from "langchain/vectorstores/base";
export * from "langchain/vectorstores/memory";
1 change: 1 addition & 0 deletions environment_tests/test-exports-cjs/src/entrypoints.js
@@ -18,6 +18,7 @@ const llms_ai21 = require("langchain/llms/ai21");
const llms_aleph_alpha = require("langchain/llms/aleph_alpha");
const llms_ollama = require("langchain/llms/ollama");
const llms_fireworks = require("langchain/llms/fireworks");
const llms_yandex = require("langchain/llms/yandex");
const prompts = require("langchain/prompts");
const vectorstores_base = require("langchain/vectorstores/base");
const vectorstores_memory = require("langchain/vectorstores/memory");
1 change: 1 addition & 0 deletions environment_tests/test-exports-esbuild/src/entrypoints.js
@@ -18,6 +18,7 @@ import * as llms_ai21 from "langchain/llms/ai21";
import * as llms_aleph_alpha from "langchain/llms/aleph_alpha";
import * as llms_ollama from "langchain/llms/ollama";
import * as llms_fireworks from "langchain/llms/fireworks";
import * as llms_yandex from "langchain/llms/yandex";
import * as prompts from "langchain/prompts";
import * as vectorstores_base from "langchain/vectorstores/base";
import * as vectorstores_memory from "langchain/vectorstores/memory";
1 change: 1 addition & 0 deletions environment_tests/test-exports-esm/src/entrypoints.js
@@ -18,6 +18,7 @@ import * as llms_ai21 from "langchain/llms/ai21";
import * as llms_aleph_alpha from "langchain/llms/aleph_alpha";
import * as llms_ollama from "langchain/llms/ollama";
import * as llms_fireworks from "langchain/llms/fireworks";
import * as llms_yandex from "langchain/llms/yandex";
import * as prompts from "langchain/prompts";
import * as vectorstores_base from "langchain/vectorstores/base";
import * as vectorstores_memory from "langchain/vectorstores/memory";
1 change: 1 addition & 0 deletions environment_tests/test-exports-vercel/src/entrypoints.js
@@ -18,6 +18,7 @@ export * from "langchain/llms/ai21";
export * from "langchain/llms/aleph_alpha";
export * from "langchain/llms/ollama";
export * from "langchain/llms/fireworks";
export * from "langchain/llms/yandex";
export * from "langchain/prompts";
export * from "langchain/vectorstores/base";
export * from "langchain/vectorstores/memory";
1 change: 1 addition & 0 deletions environment_tests/test-exports-vite/src/entrypoints.js
@@ -18,6 +18,7 @@ export * from "langchain/llms/ai21";
export * from "langchain/llms/aleph_alpha";
export * from "langchain/llms/ollama";
export * from "langchain/llms/fireworks";
export * from "langchain/llms/yandex";
export * from "langchain/prompts";
export * from "langchain/vectorstores/base";
export * from "langchain/vectorstores/memory";
7 changes: 7 additions & 0 deletions examples/src/models/llm/yandex.ts
@@ -0,0 +1,7 @@
import { YandexGPT } from "langchain/llms/yandex";
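// With no constructor arguments, the credential is read from the YC_API_KEY or YC_IAM_TOKEN environment variable.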

const model = new YandexGPT();

const res = await model.call('Translate "I love programming" into French.');

console.log({ res });
3 changes: 3 additions & 0 deletions langchain/.gitignore
@@ -163,6 +163,9 @@ llms/writer.d.ts
llms/portkey.cjs
llms/portkey.js
llms/portkey.d.ts
llms/yandex.cjs
llms/yandex.js
llms/yandex.d.ts
prompts.cjs
prompts.js
prompts.d.ts
8 changes: 8 additions & 0 deletions langchain/package.json
@@ -175,6 +175,9 @@
"llms/portkey.cjs",
"llms/portkey.js",
"llms/portkey.d.ts",
"llms/yandex.cjs",
"llms/yandex.js",
"llms/yandex.d.ts",
"prompts.cjs",
"prompts.js",
"prompts.d.ts",
@@ -1518,6 +1521,11 @@
"import": "./llms/portkey.js",
"require": "./llms/portkey.cjs"
},
"./llms/yandex": {
"types": "./llms/yandex.d.ts",
"import": "./llms/yandex.js",
"require": "./llms/yandex.cjs"
},
"./prompts": {
"types": "./prompts.d.ts",
"import": "./prompts.js",
1 change: 1 addition & 0 deletions langchain/scripts/create-entrypoints.js
@@ -69,6 +69,7 @@ const entrypoints = {
"llms/llama_cpp": "llms/llama_cpp",
"llms/writer": "llms/writer",
"llms/portkey": "llms/portkey",
"llms/yandex": "llms/yandex",
// prompts
prompts: "prompts/index",
"prompts/load": "prompts/load",
123 changes: 123 additions & 0 deletions langchain/src/llms/yandex.ts
@@ -0,0 +1,123 @@
import { getEnvironmentVariable } from "../util/env.js";
import { LLM, BaseLLMParams } from "./base.js";

const apiUrl = "https://llm.api.cloud.yandex.net/llm/v1alpha/instruct";

export interface YandexGPTInputs extends BaseLLMParams {
/**
* What sampling temperature to use.
* Should be a number between 0 and 1, inclusive.
*/
temperature?: number;

/**
* Maximum limit on the total number of tokens
* used for both the input prompt and the generated response.
*/
maxTokens?: number;

/** Model name to use. */
model?: string;

/**
* Yandex Cloud API key for a service account
* with the `ai.languageModels.user` role.
*/
apiKey?: string;

/**
* Yandex Cloud IAM token for a service account
* with the `ai.languageModels.user` role.
*/
iamToken?: string;
}

export class YandexGPT extends LLM implements YandexGPTInputs {
static lc_name() {
return "Yandex GPT";
}

get lc_secrets(): { [key: string]: string } | undefined {
return {
apiKey: "YC_API_KEY",
iamToken: "YC_IAM_TOKEN",
};
}

temperature = 0.6;

maxTokens = 1700;

model = "general";

apiKey?: string;

iamToken?: string;

constructor(fields?: YandexGPTInputs) {
super(fields ?? {});

const apiKey = fields?.apiKey ?? getEnvironmentVariable("YC_API_KEY");

const iamToken = fields?.iamToken ?? getEnvironmentVariable("YC_IAM_TOKEN");

if (apiKey === undefined && iamToken === undefined) {
throw new Error(
"Please set the YC_API_KEY or YC_IAM_TOKEN environment variable or pass it to the constructor as the apiKey or iamToken field."
);
}

this.apiKey = apiKey;
this.iamToken = iamToken;
this.maxTokens = fields?.maxTokens ?? this.maxTokens;
this.temperature = fields?.temperature ?? this.temperature;
this.model = fields?.model ?? this.model;
}

_llmType() {
return "yandexgpt";
}

/** @ignore */
async _call(
prompt: string,
options: this["ParsedCallOptions"]
): Promise<string> {
// Hit the Yandex Cloud instruct endpoint (apiUrl above) with the configured model.
return this.caller.callWithOptions({ signal: options.signal }, async () => {
const headers = { "Content-Type": "application/json", Authorization: "" };
if (this.apiKey !== undefined) {
headers.Authorization = `Api-Key ${this.apiKey}`;
} else {
headers.Authorization = `Bearer ${this.iamToken}`;
}
const bodyData = {
model: this.model,
generationOptions: {
temperature: this.temperature,
maxTokens: this.maxTokens,
},

requestText: prompt,
};

try {
const response = await fetch(apiUrl, {
method: "POST",
headers,
body: JSON.stringify(bodyData),
});
if (!response.ok) {
throw new Error(
`Failed to fetch ${apiUrl} from YandexGPT: ${response.status}`
);
}

const responseData = await response.json();
return responseData.result.alternatives[0].text;
} catch (error) {
throw new Error(`Failed to fetch ${apiUrl} from YandexGPT: ${error}`);
}
});
}
}
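
The defaults above (`temperature`, `maxTokens`, `model`) can be overridden per instance. A minimal usage sketch, assuming credentials are supplied via `YC_API_KEY` or `YC_IAM_TOKEN` (the prompt and option values are illustrative):

```typescript
import { YandexGPT } from "langchain/llms/yandex";

// Credentials are picked up from the environment; override the generation defaults.
const model = new YandexGPT({
  model: "general",
  temperature: 0.1,
  maxTokens: 100,
});

const res = await model.call("Name three cities in Russia.");
console.log({ res });
```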
1 change: 1 addition & 0 deletions langchain/src/load/import_map.ts
@@ -19,6 +19,7 @@ export * as llms__ai21 from "../llms/ai21.js";
export * as llms__aleph_alpha from "../llms/aleph_alpha.js";
export * as llms__ollama from "../llms/ollama.js";
export * as llms__fireworks from "../llms/fireworks.js";
export * as llms__yandex from "../llms/yandex.js";
export * as prompts from "../prompts/index.js";
export * as vectorstores__base from "../vectorstores/base.js";
export * as vectorstores__memory from "../vectorstores/memory.js";
2 changes: 2 additions & 0 deletions langchain/src/load/import_type.d.ts
@@ -475,6 +475,8 @@ export interface SecretMap {
VECTARA_CUSTOMER_ID?: string;
WRITER_API_KEY?: string;
WRITER_ORG_ID?: string;
YC_API_KEY?: string;
YC_IAM_TOKEN?: string;
ZAPIER_NLA_API_KEY?: string;
ZEP_API_KEY?: string;
ZEP_API_URL?: string;
1 change: 1 addition & 0 deletions langchain/tsconfig.json
@@ -87,6 +87,7 @@
"src/llms/llama_cpp.ts",
"src/llms/writer.ts",
"src/llms/portkey.ts",
"src/llms/yandex.ts",
"src/prompts/index.ts",
"src/prompts/load.ts",
"src/vectorstores/analyticdb.ts",
