
Commit

workaround broken vlm endpoints, they do not accept stream_options parameter
mattf committed Sep 17, 2024
1 parent fe75fd9 commit 48b1b5e
Showing 1 changed file with 8 additions and 0 deletions.
@@ -323,6 +323,14 @@ def _stream(
             stream_options={"include_usage": True},
             **kwargs,
         )
+        # todo: get vlm endpoints fixed and remove this
+        # vlm endpoints do not accept standard stream_options parameter
+        if (
+            self._client.model
+            and self._client.model.model_type
+            and self._client.model.model_type == "vlm"
+        ):
+            payload.pop("stream_options")
         for response in self._client.get_req_stream(payload=payload):
             self._set_callback_out(response, run_manager)
             parsed_response = self._custom_postprocess(response, streaming=True)
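For context, a minimal standalone sketch of what the added lines do. The helper name, the bare payload dict, and the model_type argument are placeholders for illustration; in the actual change the check runs inside _stream and reads the type from self._client.model.model_type.

from typing import Optional


def strip_stream_options(payload: dict, model_type: Optional[str]) -> dict:
    # vlm endpoints reject the stream_options parameter, so drop it before sending
    if model_type == "vlm":
        payload.pop("stream_options", None)
    return payload


# usage: a streaming payload normally requests usage stats via stream_options
payload = {"stream": True, "stream_options": {"include_usage": True}}
payload = strip_stream_options(payload, model_type="vlm")
assert "stream_options" not in payload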
