[bug] error with openai proxy and lang chain #1052
Comments
@dtam I think they originally were using
Source: https://python.langchain.com/v0.1/docs/expression_language/interface/#async-stream-events-beta This would explain why it worked when telling the model to activate streaming (that should send the
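For context, a minimal sketch of the async stream-events interface linked above, assuming `chain` is any LCEL runnable (e.g. a prompt piped into a chat model); the input and event handling are illustrative only:

```python
import asyncio

async def stream_with_events(chain, topic: str) -> None:
    # Iterate LangChain's (beta) astream_events API referenced in the linked doc;
    # `chain` is assumed to be an LCEL runnable defined elsewhere.
    async for event in chain.astream_events({"topic": topic}, version="v1"):
        if event["event"] == "on_chat_model_stream":
            # Each streamed chunk is a message chunk; print its text as it arrives.
            print(event["data"]["chunk"].content, end="", flush=True)

# asyncio.run(stream_with_events(chain, "bears"))  # `chain` assumed defined elsewhere
```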
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 14 days.
This issue was closed because it has been stalled for 14 days with no activity.
I tested this again today to see where we were on being able to support this. The call actually goes through and validates appropriately, but since we are not responding with the entire message structure, including the role, LangChain's OpenAI pydantic model blows up. Also, the streaming implementation does not work with
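To illustrate the point about the message structure, here is a hedged sketch of the response shape LangChain's OpenAI integration expects versus the kind of payload that triggers the error; both payloads are illustrative, not captured from the Guardrails server:

```python
# Illustrative chat-completion payloads (not captured from the Guardrails server).
# LangChain's OpenAI integration expects each choice to carry a complete
# message object, including "role".
expected = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello!"},  # role is required
            "finish_reason": "stop",
        }
    ],
}

# A proxied response like this, with the role missing or None, is what
# blows up the pydantic validation described above.
broken = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "choices": [
        {
            "index": 0,
            "message": {"role": None, "content": "Hello!"},
            "finish_reason": "stop",
        }
    ],
}
```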
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 14 days.
Describe the bug
You should be able to call chain.invoke via the OpenAI callable in LangChain with a base URL pointing at a Guardrails server, but it errors on a role of type None.
To Reproduce
Steps to reproduce the behavior:
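The original report does not list the steps, so the following is a minimal sketch of the setup described above, assuming a locally running Guardrails server that exposes an OpenAI-compatible endpoint; the guard name, URL, API key, and prompt are placeholders:

```python
# Minimal sketch of the reported setup (not the reporter's exact code).
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:8000/guards/my-guard/openai/v1",  # assumed Guardrails server proxy URL
    api_key="sk-...",  # forwarded to the upstream OpenAI API
)

prompt = ChatPromptTemplate.from_template("Tell me a joke about {topic}")
chain = prompt | llm

# Fails in the reported version: the proxied response omits the message `role`,
# which LangChain's OpenAI response model rejects.
print(chain.invoke({"topic": "bears"}))
```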
Expected behavior
The call should complete and the response should be validated by the Guardrails server.
Library version:
0.5.0
Additional context