add tolerance for empty choices #4311
base: main
Conversation
Signed-off-by: Mohammad Mazraeh <[email protected]>
Signed-off-by: Mohammad Mazraeh <[email protected]>
@@ -566,6 +567,7 @@ async def create_stream(
    json_output (Optional[bool], optional): If True, the output will be in JSON format. Defaults to None.
    extra_create_args (Mapping[str, Any], optional): Additional arguments for the creation process. Defaults to `{}`.
    cancellation_token (Optional[CancellationToken], optional): A token to cancel the operation. Defaults to None.
    max_consequative_empty_chunk_tolerance (int, optional): The maximum number of consecutive empty chunks to tolerate before stopping the stream. This only seems to be needed when using `AzureOpenAIChatCompletionClient`. Defaults to 0.
What does `none` mean?
my bad, removed
python/packages/autogen-ext/src/autogen_ext/models/_openai/_openai_client.py
Outdated
Show resolved
Hide resolved
python/packages/autogen-ext/src/autogen_ext/models/_openai/_openai_client.py
Outdated
Show resolved
Hide resolved
while True:
    try:
        chunk_future = asyncio.ensure_future(anext(stream))
        if cancellation_token is not None:
            cancellation_token.link_future(chunk_future)
        chunk = await chunk_future

        # This is to address a bug in AzureOpenAIChatCompletionClient. OpenAIChatCompletionClient works fine.
        # https://github.com/microsoft/autogen/issues/4213
        if len(chunk.choices) == 0:
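As a standalone sketch of the tolerance idea (a simplified model under my own assumptions, not the PR's exact implementation; `Chunk`, `consume_stream`, and `fake_stream` are illustrative names), a counter tracks consecutive empty chunks and raises once the configured limit is exceeded, resetting whenever a non-empty chunk arrives:

```python
import asyncio
from dataclasses import dataclass, field
from typing import Any, AsyncIterator, List


@dataclass
class Chunk:
    """Minimal stand-in for an OpenAI streaming chunk."""
    choices: List[Any] = field(default_factory=list)


async def consume_stream(
    stream: AsyncIterator[Chunk],
    max_consecutive_empty_chunk_tolerance: int = 0,
) -> List[Chunk]:
    """Collect non-empty chunks, tolerating up to N consecutive empty ones."""
    empty_chunk_count = 0
    results: List[Chunk] = []
    async for chunk in stream:
        if len(chunk.choices) == 0:
            empty_chunk_count += 1
            if empty_chunk_count > max_consecutive_empty_chunk_tolerance:
                raise ValueError("Received too many consecutive empty chunks")
            continue
        empty_chunk_count = 0  # reset on any non-empty chunk
        results.append(chunk)
    return results


async def fake_stream() -> AsyncIterator[Chunk]:
    # Mimics the reported Azure behavior: 1-2 empty chunks arrive first (issue #4213).
    for chunk in [Chunk(), Chunk(), Chunk(choices=["delta"]), Chunk(choices=["delta"])]:
        yield chunk


chunks = asyncio.run(consume_stream(fake_stream(), max_consecutive_empty_chunk_tolerance=2))
print(len(chunks))  # the two non-empty chunks survive
```

With the default tolerance of 0, the same stream would raise on the first empty chunk, which matches the docstring's claim that the setting is only needed for the Azure client.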
Do the chunks with no choices have other data set on other fields of the delta?
I can't see anything in the `chunk.__dict__`. In the subsequent chunks, the choices get populated properly and have all the expected fields.
{'id': '', 'choices': [], 'created': 0, 'model': '', 'object': '', 'service_tier': None, 'system_fingerprint': None, 'usage': None}
This has only been happening for the first 1-2 chunks for me; not sure if it's reproducible across all models. I've tried gpt-4o.
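For reference, the signature of the reported bad chunk is simply an empty `choices` list with default values everywhere else; a minimal check (a hypothetical helper for illustration, not part of the PR) could be:

```python
# Reproduces the all-default chunk dict reported in the comment above.
empty_chunk = {
    'id': '', 'choices': [], 'created': 0, 'model': '', 'object': '',
    'service_tier': None, 'system_fingerprint': None, 'usage': None,
}


def is_empty_chunk(chunk_dict: dict) -> bool:
    # Hypothetical helper: the bug signature is just an empty choices list.
    return len(chunk_dict.get("choices", [])) == 0


print(is_empty_chunk(empty_chunk))  # True
```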
…enai_client.py Co-authored-by: Jack Gerrits <[email protected]>
…enai_client.py Co-authored-by: Jack Gerrits <[email protected]>
Signed-off-by: Mohammad Mazraeh <[email protected]>
Why are these changes needed?
I encountered this issue only in `AzureOpenAIChatCompletionClient`, so this adds some tolerance for empty `choices`. Not a super clean fix, but it works to unblock updating the streaming samples that use `AzureOpenAIChatCompletionClient`.
Related issue number
closes #4213
Checks