Commit

Merge branch 'master' of github.com:upstash/rag-chat

ogzhanolguncu committed Jul 5, 2024
2 parents bf46eb3 + b9ba8db commit e41c6f6
Showing 7 changed files with 24 additions and 10 deletions.
18 changes: 16 additions & 2 deletions README.md
@@ -72,7 +72,7 @@ OPENAI_API_KEY=...

#### Using Upstash-hosted Open-Source Models

To use an OpenAI model, first initialize RAGChat:
To use an Upstash model, first initialize RAGChat:

```typescript
import { RAGChat, upstashModel } from "@upstash/rag-chat";
@@ -82,11 +82,25 @@ export const ragChat = new RAGChat({
});
```
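
For reference, a complete initialization might look like the sketch below; the model name is only a hypothetical placeholder, not a documented value.

```typescript
import { RAGChat, upstashModel } from "@upstash/rag-chat";

export const ragChat = new RAGChat({
  // Hypothetical model name for illustration; substitute any Upstash-hosted model.
  model: upstashModel("meta-llama/Meta-Llama-3-8B-Instruct"),
});
```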

#### Using Custom Providers - TogetherAI, Replicate

To use a custom provider such as TogetherAI or Replicate, first initialize RAGChat with `customModel`:

```typescript
import { RAGChat, customModel } from "@upstash/rag-chat";

export const ragChat = new RAGChat({
  model: customModel("codellama/CodeLlama-70b-Instruct-hf", {
    apiKey: "TOGETHER_AI_TOKEN",
    baseUrl: "https://api.together.xyz/v1",
  }),
});
```

And set your Upstash QStash API key environment variable:

```bash
QSTASH_TOKEN=...
```
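
Once configured, usage is the same as with the other providers. A minimal sketch, assuming `chat` accepts a question string and returns an object with an `output` field when streaming is disabled:

```typescript
// Minimal usage sketch; the question text is illustrative only.
const response = await ragChat.chat("What is Upstash RAGChat?");
console.log(response.output);
```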
<details>
<summary>Where do I find my Upstash API key?</summary><br>
2 changes: 1 addition & 1 deletion examples/nodejs/package.json
@@ -7,7 +7,7 @@
"@ai-sdk/google": "latest",
"@ai-sdk/mistral": "latest",
"@ai-sdk/openai": "latest",
"@upstash/rag-chat": "0.0.4-2.alpha",
"@upstash/rag-chat": "1.0.1",
"ai": "latest",
"dotenv": "16.4.5",
"zod": "3.23.8",
2 changes: 1 addition & 1 deletion examples/nuxt/package.json
@@ -29,6 +29,6 @@
},
"version": "0.0.0",
"dependencies": {
"@upstash/rag-chat": "0.0.4-2.alpha"
"@upstash/rag-chat": "1.0.1"
}
}
2 changes: 1 addition & 1 deletion examples/solidjs/package.json
@@ -19,7 +19,7 @@
"@solidjs/meta": "0.29.4",
"@solidjs/router": "^0.13.6",
"@solidjs/start": "^1.0.2",
"@upstash/rag-chat": "0.0.4-2.alpha",
"@upstash/rag-chat": "1.0.1",
"ai": "latest",
"solid-js": "^1.8.17",
"zod": "^3.23.8"
2 changes: 1 addition & 1 deletion examples/sveltekit/package.json
@@ -12,7 +12,7 @@
"dependencies": {
"@ai-sdk/openai": "latest",
"@ai-sdk/svelte": "latest",
"@upstash/rag-chat": "0.0.4-2.alpha",
"@upstash/rag-chat": "1.0.1",
"ai": "latest",
"openai": "4.47.1"
},
4 changes: 2 additions & 2 deletions src/rag-chat-base.ts
@@ -106,9 +106,9 @@ export class RAGChatBase {
onChunk?.({
content: message,
inputTokens: value?.usage_metadata?.input_tokens ?? 0,
outputTokens: value?.usage_metadata?.output_tokens ?? 0,
chunkTokens: value?.usage_metadata?.output_tokens ?? 0,
totalTokens: value?.usage_metadata?.total_tokens ?? 0,
// This actually streamed output from LLM, but cast it to UpstashMessage above to make everything type.But, in this case its not needed
// This is actually the streamed output from the LLM, cast to UpstashMessage above so everything type-checks; in this case the cast isn't needed
rawContent: value as unknown as string,
});
concatenatedOutput += message;
4 changes: 2 additions & 2 deletions src/types.ts
@@ -65,12 +65,12 @@ export type ChatOptions = {
onChunk?: ({
content,
inputTokens,
outputTokens,
chunkTokens,
totalTokens,
rawContent,
}: {
inputTokens: number;
outputTokens: number;
chunkTokens: number;
totalTokens: number;
content: string;
rawContent: string;
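
For context, consumers of this callback now read `chunkTokens` instead of `outputTokens`. A hedged sketch of how the renamed field might be used, based only on the type above (the `streaming` option and question text are assumptions for illustration):

```typescript
await ragChat.chat("Summarize the docs", {
  streaming: true,
  onChunk: ({ content, inputTokens, chunkTokens, totalTokens }) => {
    // Log per-chunk token usage using the renamed field.
    console.log(`chunk "${content}": ${chunkTokens} chunk / ${inputTokens} input / ${totalTokens} total tokens`);
  },
});
```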
