We currently support OpenAI chat completions via the /guards/{guardName}/openai/v1/chat/completions endpoint. At face value, it looks like we should also be able to support Azure's OpenAI hosting, but as one of our users discovered in guardrails-ai/guardrails#1159, the endpoint constructed by the AzureOpenAI client differs from the one constructed by the OpenAI client.
We should test this thoroughly to identify the differences (endpoint path, payload, headers, etc.) and, if feasible, handle both clients' requests with the same route logic by adding some additional path parameters or wildcards. Otherwise we will need to stand up a new endpoint specifically to support Azure's implementation.
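As a starting point for the shared-route approach, here is a minimal sketch of a path matcher that accepts both URL shapes. It assumes the Azure client inserts a /deployments/{deployment-id} segment (and omits /v1) while the OpenAI client uses the plain /v1/chat/completions form; these path shapes are assumptions based on the two clients' URL construction and should be verified against real requests before relying on them:

```python
import re

# Hypothetical route matcher covering both clients' assumed path shapes.
# The guard prefix comes from our existing endpoint; the optional
# /deployments/{id} segment and optional /v1 segment are assumptions
# about how AzureOpenAI vs. OpenAI build their request URLs.
CHAT_COMPLETIONS_RE = re.compile(
    r"^/guards/(?P<guard>[^/]+)/openai"          # our existing guard prefix
    r"(?:/deployments/(?P<deployment>[^/]+))?"   # Azure-style deployment segment
    r"(?:/v1)?"                                  # OpenAI client includes /v1
    r"/chat/completions$"
)

def match_chat_completions(path: str):
    """Return (guard, deployment_or_None) if the path looks like a
    chat-completions request from either client, else None."""
    m = CHAT_COMPLETIONS_RE.match(path)
    if m is None:
        return None
    return m.group("guard"), m.group("deployment")
```

If this holds up in testing, a single wildcard-style route could dispatch on whether a deployment segment is present (Azure also sends an api-version query parameter and an api-key header rather than a Bearer token, which the handler would need to account for).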