forked from mlc-ai/mlc-llm
Issues: octoml/mlc-llm
- [Bug] Nondeterministic Batch Formation (bug: Something isn't working), #208, opened Feb 14, 2024 by sunggg
- [Feature Request] Remove the confusion about the application of Chat Template, #187, opened Feb 1, 2024 by sunggg
- [Bug] Recovery logic for a long-evicted request is broken (bug), #163, opened Jan 17, 2024 by masahi
- [Bug] Parallel sampling: the new token after prefill is duplicated across all generations (bug), #161, opened Jan 16, 2024 by masahi
- [Bug] An empty token can appear at the beginning of a generated sequence (bug), #140, opened Jan 3, 2024 by masahi
- Remove repeated concatenation of prompt and decode tokens in detokenization (bug), #139, opened Jan 3, 2024 by masahi