draft: Added Model Options #99
base: main
Conversation
}: {
  query: string;
  relevant: string;
  // model: string;
This is the correct place to define this for `askQuestion()`; you can then pass in the model ID where you have `model: selectedModel`. For now, you will have to use only the cloud models here, since we are not handling the download and usage of any local model.
Using the local models requires the modelProgressDownloadController file (you can see an example of it here) to monitor their download status so they can actually be passed into your question logic.
For now I would stick to the cloud models:
- OpenAI models (4 models)
- Palm2 models (2 models)
- Gemini models (1 model)
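The suggestion above could be sketched roughly as follows. This is a minimal sketch, not the project's actual code: the model IDs listed and the `isCloudModel` helper are assumptions for illustration, and the `askQuestion()` signature follows the diff fragment with the commented-out `model` field enabled.

```typescript
// Hypothetical cloud-model ID list; the real IDs come from the project's
// model registry (4 OpenAI, 2 Palm2, 1 Gemini model per the review comment).
const CLOUD_MODEL_IDS: string[] = [
  "gpt-4",
  "gpt-3.5-turbo",
  "chat-bison-001",
  "gemini-pro",
];

// True when the given model ID is one of the supported cloud models.
function isCloudModel(model: string): boolean {
  return CLOUD_MODEL_IDS.includes(model);
}

// askQuestion() extended with a model parameter, as suggested above.
async function askQuestion({
  query,
  relevant,
  model,
}: {
  query: string;
  relevant: string;
  model: string;
}): Promise<string> {
  // Reject local models for now: they would need the
  // modelProgressDownloadController to track download progress first.
  if (!isCloudModel(model)) {
    throw new Error(`"${model}" is not a supported cloud model`);
  }
  // ...call the selected cloud model with `query` and the `relevant` context...
  return `answer to "${query}" from ${model}`;
}
```

The guard keeps the cloud-only restriction in one place, so local models can be allowed later by extending `isCloudModel` once download tracking is wired up.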
Hey @Arindam200, could I get an update on this PR as well? Don't want them sitting here for too long.
Sorry for the delay! Will update the changes shortly.
Thank you! Just want to ensure that this project stays active 👍
Description
In this PR I have fetched the available LLM models and added a dropdown to switch between them.
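A minimal sketch of how the fetched model list could be mapped to dropdown options, consistent with the cloud-only restriction from the review. The `ModelInfo` shape, provider names, and `toDropdownOptions` helper are assumptions for illustration, not the PR's actual types:

```typescript
// Hypothetical model descriptor; the real shape depends on the
// project's models API response.
interface ModelInfo {
  id: string;
  name: string;
  provider: "openai" | "palm2" | "gemini" | "local";
}

// Build the dropdown options from the fetched model list,
// keeping only cloud providers (local models are excluded for now).
function toDropdownOptions(
  models: ModelInfo[],
): { value: string; label: string }[] {
  return models
    .filter((m) => m.provider !== "local")
    .map((m) => ({ value: m.id, label: m.name }));
}
```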
To Do:
Screenshot:
This PR fixes #89