Duplicate first chunk in playground streaming chat #5338

Open
yangxikun opened this issue Dec 18, 2024 · 1 comment
Labels: bug (Something isn't working)

Comments

@yangxikun

Bug Description

[Screenshots attached: Clipboard_Screenshot_1734538600, Clipboard_Screenshot_1734538722]

The first chunk's content is 今天 ("today"). Langflow sends two events for it, add_message and token, both carrying the same content 今天, so the first chunk is rendered twice.

Reproduction

This bug can be reproduced whenever the first message chunk of the LLM response is non-empty. A sketch of the mechanism follows below.
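
For illustration, here is a minimal, self-contained sketch of how the duplication could show up on the playground side, assuming the frontend seeds the message text from the add_message event and then appends every token event to it. The event names come from this report; the render function and the event payload shapes are hypothetical.

def render(events: list[dict]) -> str:
    """Hypothetical playground-side accumulator."""
    text = ""
    for event in events:
        if event["event"] == "add_message":
            # The initial message already contains the first chunk's text.
            text = event["text"]
        elif event["event"] == "token":
            # Every chunk, including the first one, also arrives as a token.
            text += event["chunk"]
    return text

# Events observed for a first chunk of 今天 (per the bug description):
events = [
    {"event": "add_message", "text": "今天"},
    {"event": "token", "chunk": "今天"},
    {"event": "token", "chunk": "是2024年12月19日,"},
]
print(render(events))  # -> 今天今天是2024年12月19日, : the first chunk is duplicated

If the first chunk were empty, the add_message event would carry no text and the duplication would not be visible, which matches the reproduction condition above.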

Expected behavior

Expected output, with the first chunk appearing only once: 今天是2024年12月19日,星期四,农历是二零二四年十一月十九。 (Today is December 19, 2024, a Thursday; the lunar date is the nineteenth day of the eleventh month of 2024.)

Who can help?

@anovazzi1

Operating System

mac

Langflow Version

v1.1.1

Python Version

3.12

Screenshot

No response

Flow File

No response

yangxikun added the bug label on Dec 18, 2024

dosubot bot commented Dec 18, 2024

The duplication happens in the _process_chunk method because complete_message is extended with the chunk content before the initial message event is sent for the first chunk. To fix it, make sure complete_message is only extended after the initial message has been sent. Here is the corrected code:

async def _process_chunk(
    self, chunk: str, complete_message: str, message_id: str, message: Message, *, first_chunk: bool = False
) -> str:
    if self._event_manager:
        if first_chunk:
            # Send the initial message only on the first chunk
            msg_copy = message.model_copy()
            msg_copy.text = complete_message + chunk
            await self._send_message_event(msg_copy, id_=message_id)
        # Emit the token event for every chunk without blocking the event loop
        await asyncio.to_thread(
            self._event_manager.on_token,
            data={
                "chunk": chunk,
                "id": str(message_id),
            },
        )
    # Extend the accumulated text only after the events have been emitted
    complete_message += chunk
    return complete_message

In this corrected version, complete_message is updated with the chunk only after the initial message event has been sent, which prevents the duplication of the first chunk [1].
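
As a sanity check, here is a self-contained sketch of the ordering the fix relies on: the add_message event goes out first (already including the first chunk), the token event follows, and only then is the accumulator extended. Every name below (process_chunk as a free function, the stub event handlers, the chunk list) is invented for illustration and is not Langflow's actual API.

import asyncio

events: list[tuple[str, str]] = []

def on_token(data: dict) -> None:
    events.append(("token", data["chunk"]))

async def send_message_event(text: str) -> None:
    events.append(("add_message", text))

async def process_chunk(chunk: str, complete_message: str, *, first_chunk: bool) -> str:
    if first_chunk:
        # Initial message carries the text accumulated so far plus the first chunk.
        await send_message_event(complete_message + chunk)
    await asyncio.to_thread(on_token, {"chunk": chunk})
    # The accumulator is extended only after the events have been emitted.
    return complete_message + chunk

async def main() -> None:
    complete_message = ""
    for i, chunk in enumerate(["今天", "是2024年12月19日,", "星期四"]):
        complete_message = await process_chunk(chunk, complete_message, first_chunk=(i == 0))
    print(complete_message)  # 今天是2024年12月19日,星期四
    print(events)            # one add_message for the first chunk, one token per chunk

asyncio.run(main())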

