
Add support for Github models (models.inference.ai.azure.com) API #814

Open
aenold opened this issue Dec 18, 2024 · 0 comments

aenold commented Dec 18, 2024

Is your feature request related to a problem? Please describe:

Currently, bolt.diy does not directly support the GitHub Models API available at https://github.com/marketplace/models/ via the endpoint https://models.inference.ai.azure.com. As a result, models hosted on that platform, such as o1-preview and the other GitHub Models offerings, cannot be used from the application. Supporting this API would extend the range of models bolt.diy has access to.

Describe the solution you'd like:

I would like bolt.diy to include a new provider configuration and associated logic to interact with the Azure AI Inference API at https://models.inference.ai.azure.com. This would involve:

  • Adding a new provider called "GitHub Models" or similar.
  • Supporting authentication with API keys.
  • Adding the available models to the MODEL_LIST.
  • Adding the https://models.inference.ai.azure.com endpoint to the list of valid providers.
  • Passing the selected model name through to the API via client.chat.completions.create (see the sketch below).
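
Below is a minimal sketch of what such a provider call might look like, assuming the endpoint accepts OpenAI-compatible chat completion requests. The environment variable name GITHUB_MODELS_API_KEY and the example model id are placeholders, not existing bolt.diy configuration:

```ts
import OpenAI from 'openai';

// Hypothetical provider wiring: point an OpenAI-compatible client at the
// GitHub Models endpoint and authenticate with an API key / token.
// GITHUB_MODELS_API_KEY is an assumed variable name, not an existing bolt.diy setting.
const client = new OpenAI({
  baseURL: 'https://models.inference.ai.azure.com',
  apiKey: process.env.GITHUB_MODELS_API_KEY,
});

async function main() {
  // The model name selected from MODEL_LIST would be passed straight through here.
  const completion = await client.chat.completions.create({
    model: 'gpt-4o', // example id; o1-preview, Llama-3.3-70B-Instruct, etc. would be passed the same way
    messages: [{ role: 'user', content: 'Hello from bolt.diy' }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```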

Describe alternatives you've considered:

As a workaround, we could attempt to use the "OpenAILike" provider pointed at this endpoint. However, this is not ideal: it might not fully support features specific to the Azure AI Inference API, and it would not provide a smooth experience for users. A rough configuration sketch of that workaround is shown below.
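
For reference, the workaround would roughly amount to the following environment settings, assuming the OpenAILike provider reads a base URL and API key from environment variables (the variable names here are assumptions and should be checked against bolt.diy's .env.example):

```
# Hypothetical OpenAILike workaround pointing at the GitHub Models endpoint.
# Variable names are assumed; verify against .env.example before relying on them.
OPENAI_LIKE_API_BASE_URL=https://models.inference.ai.azure.com
OPENAI_LIKE_API_KEY=<your GitHub Models token>
```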

Additional context:

The GitHub Models API at https://models.inference.ai.azure.com offers access to models such as o1-preview and Llama-3.3-70B-Instruct. Direct support would broaden the set of models available through bolt.diy and improve the experience for anyone who wants to use models hosted there.
