Add Ollama Model for local usage of yorkie intelligence #297
Conversation
Walkthrough

The recent updates significantly enhance the backend configuration and Docker setup. Key improvements include the integration of the Ollama model for Yorkie Intelligence, offering greater flexibility and reduced costs compared to the previous OpenAI model. Environment variables were updated to support this functionality, and the Docker configuration was expanded to include new services, streamlining collaborative editing features and ensuring proper authentication.
Actionable comments posted: 1
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files ignored due to path filters (1)
- backend/package-lock.json is excluded by !**/package-lock.json
Files selected for processing (6)
- backend/.env.development (2 hunks)
- backend/docker/docker-compose-full.yml (1 hunks)
- backend/docker/docker-compose.yml (1 hunks)
- backend/package.json (1 hunks)
- backend/src/langchain/langchain.module.ts (1 hunks)
- backend/src/settings/settings.service.ts (1 hunks)
Additional context used
Gitleaks
backend/.env.development
8-9: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
Additional comments not posted (6)
backend/src/settings/settings.service.ts (1)
13-13: Verify the impact of broadening the enable condition.

The condition for enabling yorkieIntelligence has been changed to check that the value is not "false", which broadens the criteria. Ensure this change aligns with the intended behavior and verify that all configurations are correctly set to avoid enabling the feature unexpectedly.
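For illustration, a minimal sketch of the semantic difference being pointed at, assuming the value is read as a string (the exact lookup in settings.service.ts may differ):

```ts
// Sketch only: variable names are illustrative, not the file's exact code.
const value: string = process.env.YORKIE_INTELLIGENCE ?? "";

// Before: only the literal "true" enabled the feature.
const enabledStrict = value === "true";

// After: anything other than "false" enables it, so a model name such as
// "gemma2:2b" now counts as enabled; note that a typo like "flase" would too.
const enabledBroad = value !== "false";
```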
backend/src/langchain/langchain.module.ts (1)

8-19: Verify model instantiation logic and compatibility.

The chatModelFactory now conditionally instantiates ChatOllama or ChatOpenAI based on the YORKIE_INTELLIGENCE environment variable. Ensure that both models are correctly implemented and compatible with the rest of the system, and verify that the environment variable is set appropriately in all deployment environments.
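A minimal sketch of such a factory, assuming the variable carries the model name and that OLLAMA_HOST_URL is a hypothetical override for the Ollama endpoint; the actual module parses and validates differently:

```ts
import { ChatOllama } from "@langchain/ollama";
import { ChatOpenAI } from "@langchain/openai";

// Sketch only: the provider token, parsing rules, and defaults here are
// assumptions, not the repository's exact implementation.
export const chatModelFactory = {
  provide: "ChatModel",
  useFactory: () => {
    const modelName = process.env.YORKIE_INTELLIGENCE ?? "false";
    if (modelName !== "false") {
      // A local model such as "gemma2:2b" served by the Ollama container.
      return new ChatOllama({
        model: modelName,
        baseUrl: process.env.OLLAMA_HOST_URL ?? "http://localhost:11434",
      });
    }
    // Fall back to OpenAI; the real condition in the module may differ.
    return new ChatOpenAI({ model: "gpt-4o-mini" });
  },
};
```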
backend/docker/docker-compose.yml (1)

12-16: Verify the integration of the yorkie-intelligence service.

The new service yorkie-intelligence has been added with the image ollama/ollama:latest, exposing port 11434. Ensure there are no port conflicts with other services and that the service is correctly integrated into the overall architecture.

Verification successful
Integration of the yorkie-intelligence service verified successfully.

The yorkie-intelligence service is correctly integrated with no port conflicts. It uses port 11434, which is not shared with any other service in the Docker Compose files.

- Port 11434 is used only by the yorkie-intelligence service.
- No other services conflict with this port.
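For reference, a compose service entry of the kind being reviewed would look roughly like this; the sketch is illustrative, and the repository's file may differ in details such as volumes and the host-side port:

```yaml
# Illustrative sketch, not the repository's exact docker-compose.yml.
services:
  yorkie-intelligence:
    image: ollama/ollama:latest
    restart: always
    ports:
      - "11434:11434" # Ollama's default API port
```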
Scripts executed

The following scripts were executed for the analysis:

Script:

```shell
#!/bin/bash
# Description: Verify the integration of the `yorkie-intelligence` service in Docker Compose.
# Test: Check for port conflicts and ensure correct service integration.
rg --type yaml 'ports:.*11434'
```

Length of output: 32

Script:

```shell
#!/bin/bash
# List all services and their exposed ports in Docker Compose files to verify integration and check for conflicts.
rg --type yaml 'ports:' -A 3
```

Length of output: 1896
backend/docker/docker-compose-full.yml (1)

43-47: Ensure proper security and resource allocation for the new service.

The yorkie-intelligence service has been added with a restart policy and port mapping. Consider the following:

- Security: Ensure that the exposed port 11434 is secured and only accessible where necessary.
- Resource Allocation: Depending on the resource requirements of the ollama model, you might need to specify resource limits (e.g., CPU, memory) to prevent overconsumption.

Ensure that the new service integrates well with existing services and that there are no conflicts with other configurations. A sketch of how such limits could look is below.
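As a sketch of the resource-limit suggestion, with illustrative values rather than recommendations for this workload:

```yaml
# Illustrative only: caps the Ollama container's CPU and memory via the
# Compose deploy block (honored by recent docker compose for local runs).
services:
  yorkie-intelligence:
    image: ollama/ollama:latest
    restart: always
    ports:
      - "11434:11434"
    deploy:
      resources:
        limits:
          cpus: "2.0"
          memory: 4096M
```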
backend/package.json (1)
35-35: Verify the necessity and compatibility of the new dependency.

The @langchain/ollama package has been added. Ensure that:

- This dependency is necessary for the new functionality.
- It is compatible with other packages in your project, especially those within the Langchain ecosystem.

Ensure that the dependency is used effectively in the codebase and does not introduce conflicts.
backend/.env.development (1)
40-43: Clarify the model configuration for Yorkie Intelligence.

The YORKIE_INTELLIGENCE variable now specifies a model name, "gemma2:2b". Ensure that:

- The model name is correctly configured and available in the Ollama library.
- Documentation is updated to reflect this change, explaining how to modify or update the model if needed.

Check the integration of this model with the application to ensure it functions as expected.
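For reference, making such a model available inside the running Ollama container is typically done with ollama pull; the service name below is the one added in this PR's compose files, and the exact compose invocation may differ:

```shell
# Pull the gemma2:2b model into the running yorkie-intelligence container.
docker compose exec yorkie-intelligence ollama pull gemma2:2b

# Quick sanity check that the model is now listed.
docker compose exec yorkie-intelligence ollama list
```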
- add docker image in docker compose file
- change yorkie intelligence env var
- add lib related to ollama
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files ignored due to path filters (1)
- backend/package-lock.json is excluded by !**/package-lock.json
Files selected for processing (6)
- backend/.env.development (2 hunks)
- backend/docker/docker-compose-full.yml (1 hunks)
- backend/docker/docker-compose.yml (1 hunks)
- backend/package.json (1 hunks)
- backend/src/langchain/langchain.module.ts (1 hunks)
- backend/src/settings/settings.service.ts (1 hunks)
Files skipped from review due to trivial changes (1)
- backend/package.json
Files skipped from review as they are similar to previous changes (5)
- backend/.env.development
- backend/docker/docker-compose-full.yml
- backend/docker/docker-compose.yml
- backend/src/langchain/langchain.module.ts
- backend/src/settings/settings.service.ts
I have a question. What happens if someone requests Yorkie intelligence while the model is downloading?
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- backend/src/langchain/langchain.module.ts (1 hunks)
Files skipped from review as they are similar to previous changes (1)
- backend/src/langchain/langchain.module.ts
- fix docker compose file (user can change ollama container port)
- fix readme docs (add --env-file option)
- add usable model
Actionable comments posted: 0
Outside diff range, codebase verification and nitpick comments (1)
README.md (1)
62-62: Ensure Proper Configuration of Environment Variables

The .env.development file contains placeholders for GITHUB_CLIENT_ID and GITHUB_CLIENT_SECRET. Please replace these placeholders with actual values to ensure proper configuration and functionality of the application.

Analysis chain
LGTM! Verify the .env.development setup.

The addition of --env-file ./backend/.env.development to the docker-compose command is beneficial for managing environment configurations.

Ensure that the .env.development file is correctly set up with all necessary environment variables.
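A sketch of the resulting invocation; the compose file path is assumed, so adjust it to the repository's layout:

```shell
# Run the backend stack with the development env file loaded.
docker-compose -f backend/docker/docker-compose.yml --env-file ./backend/.env.development up -d
```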
Scripts executed

The following scripts were executed for the analysis:

Script:

```shell
#!/bin/bash
# Description: Verify that the `.env.development` file contains necessary environment variables.
# Test: Check for the presence of environment variables. Expect: Non-empty values for each variable.
grep -E 'GITHUB_CLIENT_ID|GITHUB_CLIENT_SECRET' backend/.env.development
```

Length of output: 302
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (6)
- README.md (2 hunks)
- backend/.env.development (1 hunks)
- backend/README.md (1 hunks)
- backend/docker/docker-compose-full.yml (1 hunks)
- backend/docker/docker-compose.yml (1 hunks)
- backend/src/langchain/langchain.module.ts (1 hunks)
Files skipped from review as they are similar to previous changes (4)
- backend/.env.development
- backend/docker/docker-compose-full.yml
- backend/docker/docker-compose.yml
- backend/src/langchain/langchain.module.ts
Additional comments not posted (2)
backend/README.md (1)
27-27: LGTM! Verify the .env.development setup.

The addition of --env-file ./backend/.env.development to the docker-compose command is a good practice for managing environment configurations.

Ensure that the .env.development file is correctly set up with all necessary environment variables.

Verification successful
The .env.development file is correctly set up with placeholders for the necessary environment variables. The file contains entries for GITHUB_CLIENT_ID and GITHUB_CLIENT_SECRET, which are expected to be filled with actual values by the user. This is a standard practice for environment configuration files.
Scripts executed

The following scripts were executed for the analysis:

Script:

```shell
#!/bin/bash
# Description: Verify that the `.env.development` file contains necessary environment variables.
# Test: Check for the presence of environment variables. Expect: Non-empty values for each variable.
grep -E 'GITHUB_CLIENT_ID|GITHUB_CLIENT_SECRET' backend/.env.development
```

Length of output: 302
README.md (1)
91-91: LGTM! Verify the .env.development setup.

The addition of --env-file ./backend/.env.development to the docker-compose command is beneficial for managing environment configurations.

Ensure that the .env.development file is correctly set up with all necessary environment variables.
Actionable comments posted: 1
Outside diff range, codebase verification and nitpick comments (1)
backend/src/langchain/langchain.module.ts (1)
49-49: Improve the error message for clarity.

Consider providing additional guidance on where to verify the model name.

```diff
- throw new Error(`${modelType} is not found. please check your model name`);
+ throw new Error(`${modelType} is not found. Please check your model name in the environment configuration.`);
```
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files ignored due to path filters (1)
- backend/package-lock.json is excluded by !**/package-lock.json
Files selected for processing (5)
- backend/.env.development (1 hunks)
- backend/docker/docker-compose-full.yml (1 hunks)
- backend/docker/docker-compose.yml (1 hunks)
- backend/package.json (1 hunks)
- backend/src/langchain/langchain.module.ts (1 hunks)
Files skipped from review due to trivial changes (2)
- backend/docker/docker-compose-full.yml
- backend/package.json
Files skipped from review as they are similar to previous changes (2)
- backend/.env.development
- backend/docker/docker-compose.yml
Additional context used
Biome
backend/src/langchain/langchain.module.ts
[error] 32-32: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
Additional comments not posted (3)
backend/src/langchain/langchain.module.ts (3)
3-3: Import of ChatOllama is appropriate.

The import statement is necessary for the new functionality involving the ChatOllama model.
6-22: modelList definition is well-structured.

The modelList object provides a clear mapping of providers to their models, facilitating model selection based on the environment variable.
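The shape being described might look roughly like the following; the providers and model names here are hypothetical, and langchain.module.ts holds the real entries:

```ts
// Hypothetical shape; see langchain.module.ts for the actual mapping.
type ModelList = Record<"ollama" | "openai", string[]>;

const modelList: ModelList = {
  ollama: ["gemma2:2b", "llama3.1"],
  openai: ["gpt-4o-mini"],
};

// Selecting a provider for a given model name:
const provider = (Object.keys(modelList) as (keyof ModelList)[]).find((p) =>
  modelList[p].includes(process.env.YORKIE_INTELLIGENCE ?? ""),
);
```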
26-50: chatModelFactory logic is sound.

The factory correctly parses the YORKIE_INTELLIGENCE environment variable, validates the model, and instantiates the appropriate chat model. Error handling is in place for invalid model types.

Ensure that the YORKIE_INTELLIGENCE environment variable is set correctly in the deployment environment to prevent runtime errors.

Verification successful
Ensure YORKIE_INTELLIGENCE is correctly configured.

The YORKIE_INTELLIGENCE environment variable is referenced in multiple files, including:

- backend/src/settings/settings.service.ts: used to determine whether the feature is enabled.
- backend/src/langchain/langchain.module.ts: used to determine the model type.
- backend/docker/docker-compose-full.yml: set to false.

Verify that the YORKIE_INTELLIGENCE variable is set appropriately in the deployment environment to prevent runtime errors related to model configuration. Example entries are sketched below.
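Illustrative entries for the two modes noted above; the exact values live in backend/.env.development and the compose files:

```shell
# backend/.env.development: enable Yorkie Intelligence with a local model.
YORKIE_INTELLIGENCE="gemma2:2b"

# backend/docker/docker-compose-full.yml disables the feature instead:
# YORKIE_INTELLIGENCE=false
```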
Scripts executed

The following scripts were executed for the analysis:

Script:

```shell
#!/bin/bash
# Description: Verify that the YORKIE_INTELLIGENCE environment variable is set in the environment files.
# Test: Check for the presence of YORKIE_INTELLIGENCE in environment files. Expect: At least one occurrence.
rg --type env YORKIE_INTELLIGENCE
```

Length of output: 63

Script:

```shell
#!/bin/bash
# Description: Search for the YORKIE_INTELLIGENCE environment variable in common configuration files.
# Test: Check for the presence of YORKIE_INTELLIGENCE in any file. Expect: At least one occurrence.
rg 'YORKIE_INTELLIGENCE' --glob '*.{env,js,json,ts,py,yml,yaml}'
```

Length of output: 349
Tools
Biome
[error] 32-32: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- backend/src/langchain/langchain.module.ts (1 hunks)
Files skipped from review as they are similar to previous changes (1)
- backend/src/langchain/langchain.module.ts
Thank you for your contribution.
* Add .gitignore into root project directory (#279)
* chore: just combine front, back ignore file - remove .gitignore in each folder
* chore: fix legacy file & separate OD part in gitignore
* Add ollama llm for yorkie intelligence #255 - add docker image in docker compose file - change yorkie intelligence env var - add lib related to ollama
* apply formatting
* Add ollama model option #255 - fix docker compose file (user can change ollama container port) - fix readme docs (add --env-file option) - add usable model
* feat: add modelList type, change port to baseurl #255 - apply github code review
* fix: apply npm format
* fix: refactor by github review
What this PR does / why we need it?
This PR adds the Docker image necessary for the Ollama model to the docker-compose file and updates the environment variables required for utilizing the model.
Any background context you want to provide?
The changes in this PR are essential for incorporating the Ollama model into the project. By adding the necessary Docker image and updating the environment variables, the Ollama model can be seamlessly integrated and utilized.
What are the relevant tickets?
Fixes #255
Checklist
Summary by CodeRabbit
New Features
Bug Fixes
Dependencies