Support interrupting the LLM mid-reply
#289
Replies: 3 comments 1 reply
-
Please leave an email address; this is a very good question.
-
Interrupting the output directly is feasible, but whether the model can then pick up the new topic smoothly is the bigger question.
-
(Reply) This has been marked as a future goal, and some solutions will be explored later on.
-
(Original post) After a question is asked, the LLM may produce a fairly long reply. If the user wants to cut the current reply short, could a new request be sent that interrupts the LLM's output? Just as in human conversation, people sometimes need to interrupt or end a topic early and start a new one.
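A minimal sketch of one way this could work, assuming a Python streaming setup that is not part of this project: a newer request sets a cancellation flag, the streaming loop stops, and the truncated partial reply is kept in the conversation history so the next turn still has context. The names here (`fake_model_stream`, `ChatSession`) are hypothetical stand-ins, not the project's actual API.

```python
import time
from dataclasses import dataclass, field
from threading import Event, Thread


def fake_model_stream(prompt: str):
    """Hypothetical stand-in for a real model: yields a reply token by token."""
    for word in ("This", "is", "a", "rather", "long", "answer", "about", prompt, "..."):
        time.sleep(0.1)  # simulate per-token generation latency
        yield word + " "


@dataclass
class ChatSession:
    history: list = field(default_factory=list)

    def ask(self, user_msg: str, cancel: Event) -> str:
        """Stream a reply, stopping early if `cancel` is set by a newer request."""
        self.history.append({"role": "user", "content": user_msg})
        partial = ""
        for token in fake_model_stream(user_msg):
            if cancel.is_set():
                # Keep the truncated text, marked as interrupted, so the next
                # turn still sees what was already said.
                partial += "[interrupted by user]"
                break
            partial += token
        self.history.append({"role": "assistant", "content": partial})
        return partial


if __name__ == "__main__":
    session = ChatSession()
    cancel = Event()

    # The first question streams in the background; the "new request"
    # arrives a moment later and sets the cancel flag, cutting it short.
    t = Thread(target=session.ask, args=("quantum computing", cancel))
    t.start()
    time.sleep(0.35)
    cancel.set()
    t.join()

    # The new topic starts with a fresh flag; the truncated reply stays
    # in history so the model has context for the hand-off.
    session.ask("tomorrow's weather", Event())
    for turn in session.history:
        print(f'{turn["role"]}: {turn["content"]}')
```

Keeping the interrupted partial reply in the history, with an explicit marker, is one possible way to help the model pick up the new topic, which is the concern raised in the second comment above.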