Issues: octoml/mlc-llm



Issues list

[Tracking] PT model support follow up (8 tasks)
#217 opened Feb 21, 2024 by masahi

[Bug] Undeterministic Batch Formation (label: bug)
#208 opened Feb 14, 2024 by sunggg

[Tracking] Sampler optimization (1 of 2 tasks)
#199 opened Feb 8, 2024 by masahi

[Tracking] InferenceEngine Remake (8 tasks)
#193 opened Feb 6, 2024 by yelite

[Feature Request] Support system prompt
#188 opened Feb 1, 2024 by sunggg

[Bug] User side cancellation (label: bug)
#179 opened Jan 30, 2024 by sunggg

[Bug] Recovering logic of a long evicted request is broken (label: bug)
#163 opened Jan 17, 2024 by masahi

Remaining pieces for upstreaming
#153 opened Jan 9, 2024 by sunggg

Remove repeated concat of prompt and decode tokens in detokenization (label: bug)
#139 opened Jan 3, 2024 by masahi