[bug/feat] Azure OpenAI support for proxy endpoint #86

Open
CalebCourier opened this issue Nov 21, 2024 · 0 comments

We currently support OpenAI chat completions via the /guards/{guardName}/openai/v1/chat/completions endpoint. At face value, it looks like we should also be able to support Azure's OpenAI hosting, but as discovered by one of our users in guardrails-ai/guardrails#1159, the endpoint constructed by the AzureOpenAI client differs from the one constructed by the OpenAI client.

We should test this thoroughly to identify the differences (endpoint, payload, headers, etc.) and, if feasible, handle both clients' requests with the same route logic by adding additional path parameters or wildcards. Otherwise we will need to stand up a new endpoint specifically to support Azure's implementation.
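As a rough illustration of the single-route approach, here is a minimal sketch of a path pattern that could match both clients' request shapes. It assumes the OpenAI client produces /guards/{guardName}/openai/v1/chat/completions (as above) and that the AzureOpenAI client produces a path of the form /guards/{guardName}/openai/deployments/{deployment}/chat/completions; the exact Azure path (and its api-version query parameter) would need to be confirmed by the testing described here, and the helper name is hypothetical.

```python
import re

# One pattern covering both assumed path shapes:
#   OpenAI client:      /guards/{guardName}/openai/v1/chat/completions
#   AzureOpenAI client: /guards/{guardName}/openai/deployments/{deployment}/chat/completions
ROUTE = re.compile(
    r"^/guards/(?P<guard>[^/]+)/openai"
    r"(?:/v1|/deployments/(?P<deployment>[^/]+))"
    r"/chat/completions$"
)

def match_route(path: str):
    """Return captured path parameters if the path matches either client's shape, else None."""
    m = ROUTE.match(path)
    return m.groupdict() if m else None

# 'deployment' is None for plain OpenAI requests, set for Azure-style requests.
print(match_route("/guards/my-guard/openai/v1/chat/completions"))
print(match_route("/guards/my-guard/openai/deployments/gpt-4o/chat/completions"))
```

In a real FastAPI route this would translate to either two path declarations sharing one handler, or a wildcard path parameter that is parsed similarly; payload and header differences (e.g. api-key vs. Authorization) would still need separate handling.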
