Support for openai baseurl/models #56

Open
babybirdprd opened this issue May 26, 2024 · 6 comments · May be fixed by #143
Labels: enhancement (New feature or request)

Comments

@babybirdprd

Is your feature request related to a problem? Please describe.
No. It's an addition to the OpenAI extension to allow the base URL and model to be set via environment variables.

Describe the solution you'd like
A way to extend the plugin to OpenAI-compatible endpoints.

Describe alternatives you've considered
Making it its own extension, since many models are not multimodal, or at least don't have all of OpenAI's features. That said, there are several projects that use the OpenAI API to cover all of those features: TTS, STT, image generation, etc.

Additional context
This would give people who don't have access to Claude, Gemini, etc. the ability to run any LLM through OpenRouter, DeepInfra, and similar services.
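
For illustration, the official OpenAI Node SDK already accepts a `baseURL` override, so a rough sketch of the requested behavior could look like the following (the environment variable names, endpoint URL, and model name here are assumptions, not part of any existing plugin):

```ts
import OpenAI from 'openai';

// Point the official OpenAI SDK at an OpenAI-compatible provider.
// OPENAI_BASE_URL / OPENAI_MODEL are illustrative variable names.
const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: process.env.OPENAI_BASE_URL ?? 'https://openrouter.ai/api/v1',
});

const completion = await client.chat.completions.create({
  model: process.env.OPENAI_MODEL ?? 'meta-llama/llama-3-70b-instruct',
  messages: [{ role: 'user', content: 'Hello!' }],
});

console.log(completion.choices[0].message.content);
```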

@babybirdprd babybirdprd added the enhancement New feature or request label May 26, 2024
@Dabolus Dabolus self-assigned this May 28, 2024
@Dabolus
Collaborator

Dabolus commented May 28, 2024

Thanks for opening the issue. I'm not familiar with those services, but after a brief look I think adding support should be quite easy. I'll look into it 👍🏻

@babybirdprd
Author

Thank you! Greatly appreciate it. And just as an afterthought: if you look at https://github.com/matatonic , they have a few different OpenAI-compatible APIs for vision, STT, TTS, and image gen. But that is more of a personal want; I'm not sure if those integrations are as straightforward.

@ehelbig1

Any updates on this feature?

@ehelbig1

It looks like this is already supported for Groq.

@MagdielCAS MagdielCAS linked a pull request Sep 6, 2024 that will close this issue
@MagdielCAS

Hi, I was hoping for the same feature for OpenAI + Genkit. I had started working on my own plugin when I found this one, so instead of redoing everything I opened PR #143, which improves the config so we can add custom models and set the base URL as well. It would be a really nice feature, since there are a lot of services with OpenAI API support (DeepInfra, OpenRouter, LiteLLM, etc.).
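
As a purely hypothetical sketch of how such a config could be wired up (the `baseUrl` and `models` option names are assumptions for illustration, not the actual API proposed in #143):

```ts
// Hypothetical config shape; the baseUrl/models option names are illustrative only.
import { openAI } from 'genkitx-openai';

const plugin = openAI({
  apiKey: process.env.DEEPINFRA_API_KEY,
  // Assumed option: point the plugin at an OpenAI-compatible endpoint.
  baseUrl: 'https://api.deepinfra.com/v1/openai',
  // Assumed option: register custom models not in the built-in list.
  models: ['meta-llama/Meta-Llama-3-70B-Instruct'],
});
```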

@alexastrum
Contributor

I proposed a fix for baseUrl support in #168
