I don't know whether yack trims the conversation to fit within the token limits of certain models; however, after a long continuous discussion (think 7+ long questions and/or answers), ChatGPT returns an error stating the token limit is exceeded.
So it doesn't strip earlier conversations when sending them for "context".
It'd probably be helpful to restrict the context to the last 10-15 chat messages when sending a new message, so that token limits don't get exhausted.
Probably just have to do an array.slice on the message history when sending the conversation to the stream; something like the sketch below.
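
A minimal sketch of that idea, not Yack's actual code: the message shape, constant, and function names here are assumptions; only the slice-the-history approach comes from this thread.

```ts
// Hypothetical sketch, not Yack's real implementation.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

const MAX_CONTEXT_MESSAGES = 12; // somewhere in the suggested 10-15 range

function buildRequestMessages(
  history: ChatMessage[],
  newMessage: ChatMessage
): ChatMessage[] {
  // Keep any leading system prompt, then only the most recent messages,
  // so the request sent to the model stays under its token limit.
  const systemPrompt =
    history.length > 0 && history[0].role === "system" ? [history[0]] : [];
  const rest = history.filter((m) => m.role !== "system");
  const recent = rest.slice(-MAX_CONTEXT_MESSAGES);
  return [...systemPrompt, ...recent, newMessage];
}
```

A fixed message count is only a rough proxy for token usage; a more precise approach would count tokens per message and drop the oldest ones until the request fits, but the slice keeps the change small.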