Add support for configuring and using Ollama models with Foyle (#43)

- Add 'AgentConfig' struct to the 'config' package with a 'Model' field for specifying the model to use for completions
- Set the default 'AgentConfig' model to 'openai.GPT3Dot5Turbo0125'
- Update the 'agent' package to handle a missing AgentConfig in the configuration and log an error
- Update the 'oai' package to handle a missing OpenAI configuration and a custom BaseURL, and log BaseURL usage
- Add 'DefaultModel' constant for the Ollama model name to 'const.go' in the 'config' package
- Add a manual test for Ollama usage, 'Test_Ollama', in 'client_test.go', guarded by a check for the 'GITHUB_ACTIONS' environment variable
- Add documentation for using Ollama with Foyle in 'docs/content/en/docs/ollama/_index.md'
- Include steps for configuring Foyle to use Ollama's baseURL and model in the documentation
Showing 6 changed files with 136 additions and 12 deletions.
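For reference, the commit message above describes an 'AgentConfig' struct in the 'config' package whose 'Model' field selects the completion model. Below is a minimal sketch of what such a struct might look like; the commit only states that AgentConfig has a Model field, so the yaml tag and comments here are illustrative assumptions rather than Foyle's actual definition.

```
package config

// AgentConfig configures the completion agent.
// The yaml tag is an assumption for illustration; the commit only
// states that AgentConfig has a Model field.
type AgentConfig struct {
	// Model is the name of the model to use for completions,
	// e.g. an OpenAI model or an Ollama-served model such as "llama2".
	Model string `yaml:"model"`
}
```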
const.go (config package), new file:

@@ -0,0 +1,5 @@
package config

import "github.com/sashabaranov/go-openai"

const DefaultModel = openai.GPT3Dot5Turbo0125
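The commit message also says the 'agent' package now logs an error when the AgentConfig is missing. A hedged sketch of how such a fallback might use the DefaultModel constant above; the function name, the nil-check shape, and the use of the standard-library logger are assumptions, not Foyle's actual code.

```
package config

import "log"

// modelOrDefault returns the model to use for completions, falling back to
// DefaultModel (and logging the problem) when no agent configuration is set.
// Illustrative sketch only.
func modelOrDefault(cfg *AgentConfig) string {
	if cfg == nil || cfg.Model == "" {
		log.Printf("agent config missing; using default model %q", DefaultModel)
		return DefaultModel
	}
	return cfg.Model
}
```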
docs/content/en/docs/ollama/_index.md (new file):

@@ -0,0 +1,39 @@
---
title: "Ollama"
description: "How to use Ollama with Foyle"
weight: 3
---
## What You'll Learn

How to configure Foyle to use models served by Ollama

## Prerequisites

1. Follow [Ollama's docs] to download Ollama and serve a model like `llama2`

## Set up Foyle to use Ollama

Foyle relies on [Ollama's OpenAI Chat Compatibility API] to interact with models served by Ollama.

1. Configure Foyle to use the appropriate Ollama baseURL

   ```
   foyle config set openai.baseURL=http://localhost:11434/v1
   ```

   * Change the server and port to match how you are serving Ollama
   * You may also need to change the scheme to https, e.g. if you are using a VPN like [Tailscale](https://tailscale.com/)

1. Configure Foyle to use the appropriate Ollama model

   ```
   foyle config set agent.model=llama2
   ```

   * Change the model to match the model you are serving with Ollama

1. You can leave the `apiKeyFile` unset since you aren't using an API key with Ollama

1. That's it! You should now be able to use Foyle with Ollama
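Under the hood, these two settings presumably feed the go-openai client that the commit's 'oai' package wraps, so pointing 'openai.baseURL' at Ollama's OpenAI-compatible endpoint and setting 'agent.model' is roughly equivalent to the standalone sketch below. It assumes Ollama is serving `llama2` at localhost:11434; the prompt, error handling, and overall structure are illustrative assumptions, not Foyle's actual 'oai' package code.

```
package main

import (
	"context"
	"fmt"
	"log"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// Ollama does not require an API key, so the auth token is left empty.
	cfg := openai.DefaultConfig("")
	// Point the client at Ollama's OpenAI-compatible API instead of api.openai.com.
	cfg.BaseURL = "http://localhost:11434/v1"
	client := openai.NewClientWithConfig(cfg)

	resp, err := client.CreateChatCompletion(context.Background(), openai.ChatCompletionRequest{
		Model: "llama2", // the model configured via agent.model
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "Print hello world in bash"},
		},
	})
	if err != nil {
		log.Fatalf("chat completion against Ollama failed: %v", err)
	}
	fmt.Println(resp.Choices[0].Message.Content)
}
```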