add tolerance for empty choices #4311

Status: Open. Wants to merge 9 commits into base: main.
Conversation

@MohMaz (Contributor) commented Nov 22, 2024

Why are these changes needed?

I encountered this issue only in AzureOpenAIChatCompletionClient. Adding some tolerance for empty choices is not a super clean fix, but it works to unblock updating the streaming samples that use AzureOpenAIChatCompletionClient.

Related issue number

closes #4213

Checks

Signed-off-by: Mohammad Mazraeh <[email protected]>
@MohMaz MohMaz self-assigned this Nov 22, 2024
@MohMaz MohMaz marked this pull request as ready for review November 22, 2024 07:54
@MohMaz MohMaz requested a review from rysweet November 22, 2024 18:17
@@ -566,6 +567,7 @@ async def create_stream(
json_output (Optional[bool], optional): If True, the output will be in JSON format. Defaults to None.
extra_create_args (Mapping[str, Any], optional): Additional arguments for the creation process. Defaults to `{}`.
cancellation_token (Optional[CancellationToken], optional): A token to cancel the operation. Defaults to None.
max_consequative_empty_chunk_tolerance (int, optional): The maximum number of consecutive empty chunks to tolerate before stopping the stream. This appears to be needed only when using `AzureOpenAIChatCompletionClient`. Defaults to 0.
Member commented:
What does none mean?

Contributor (author) replied:
My bad, removed.

while True:
    try:
        chunk_future = asyncio.ensure_future(anext(stream))
        if cancellation_token is not None:
            cancellation_token.link_future(chunk_future)
        chunk = await chunk_future

        # This is to address a bug in AzureOpenAIChatCompletionClient. OpenAIChatCompletionClient works fine.
        # https://github.com/microsoft/autogen/issues/4213
        if len(chunk.choices) == 0:
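The hunk above is truncated at the empty-choices check, so the handling itself is not shown in this excerpt. Below is a minimal, self-contained sketch of the tolerance pattern the PR describes: skip up to N consecutive empty-choice chunks, resetting the counter on any non-empty chunk. The `Chunk`/`Choice` stand-in types and the `consume_stream` helper are hypothetical illustrations, not the PR's actual code.

```python
import asyncio
from dataclasses import dataclass, field

# Hypothetical stand-ins for the OpenAI SDK's streaming chunk types,
# used only to illustrate the tolerance pattern.
@dataclass
class Choice:
    delta: str = ""

@dataclass
class Chunk:
    choices: list = field(default_factory=list)

async def consume_stream(stream, max_consecutive_empty_chunk_tolerance: int = 0):
    """Collect deltas, tolerating up to N consecutive empty-choice chunks."""
    empty_chunk_count = 0
    results = []
    async for chunk in stream:
        if len(chunk.choices) == 0:
            empty_chunk_count += 1
            if empty_chunk_count > max_consecutive_empty_chunk_tolerance:
                raise ValueError("Exceeded empty chunk tolerance")
            continue  # skip the empty chunk
        empty_chunk_count = 0  # a non-empty chunk resets the counter
        results.append(chunk.choices[0].delta)
    return results

async def demo():
    async def fake_stream():
        # Mimics the reported Azure behavior: first chunks arrive empty.
        for c in [Chunk(), Chunk(), Chunk([Choice("hello")]), Chunk([Choice(" world")])]:
            yield c
    return await consume_stream(fake_stream(), max_consecutive_empty_chunk_tolerance=2)

print(asyncio.run(demo()))
```

Because the counter resets on every non-empty chunk, only runs of consecutive empty chunks count toward the tolerance, which matches the parameter name in the diff.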
Member commented:
Do the chunks with no choices have other data set on other fields of the delta?

Contributor (author) replied on Nov 23, 2024:
I can't see anything in `chunk.__dict__`. In the subsequent chunks, `choices` gets populated properly and has all the expected fields.

{'id': '', 'choices': [], 'created': 0, 'model': '', 'object': '', 'service_tier': None, 'system_fingerprint': None, 'usage': None}

This has only been happening for the first 1-2 chunks for me; I'm not sure if it's reproducible across all models. I've tried gpt-4o.

Successfully merging this pull request may close these issues.

finish_reason not set in AzureOpenAIChatCompletionClient.create_stream