
Jupyternaut currently chats with a generate-all-at-once output style; why not use a streaming output style? #835

Closed
johnsGuo opened this issue Jun 15, 2024 · 1 comment
Labels: duplicate (This issue or pull request already exists), enhancement (New feature or request)

Comments

@johnsGuo

Problem

Jupyter AI's Jupyternaut currently uses a generate-all-at-once output style for chat. I would like to know why it does not use a streaming output style, since streaming provides a better visual interaction experience.

Proposed Solution

Could a setting be provided so users can choose between the two output styles?

Additional context

When I ask Jupyter AI relatively complex questions, the LLM sometimes produces quite a lot of content, and the current output style makes the chat feel laggy.
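For illustration, the difference between the two styles can be sketched with a minimal, hypothetical Python generator. This is not Jupyter AI's actual implementation; the function names and token list are made up for the example:

```python
def generate_full(tokens):
    """Non-streaming: assemble the entire response before showing anything."""
    return " ".join(tokens)


def generate_streaming(tokens):
    """Streaming: yield each token as soon as it is produced."""
    for tok in tokens:
        yield tok


tokens = ["Streaming", "output", "appears", "token", "by", "token."]

# Non-streaming: the user sees nothing until the full reply is ready.
print(generate_full(tokens))

# Streaming: partial output is visible immediately, which feels more responsive.
for tok in generate_streaming(tokens):
    print(tok, end=" ", flush=True)
print()
```

With a long response, the non-streaming path shows a blank chat bubble for the whole generation time, while the streaming path paints text incrementally, which is the "lag" difference described above.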

@johnsGuo johnsGuo added the enhancement New feature or request label Jun 15, 2024
@JasonWeill
Collaborator

@johnsGuo Thank you for opening this issue! Let's keep the conversation going in #228, which covers this feature request.

@JasonWeill JasonWeill added the duplicate This issue or pull request already exists label Jun 17, 2024