Add support for LCM lora #322
As mentioned elsewhere, this would benefit from n_iter support. A 4090 could likely do a 4-step image in a sub-second time frame (<0.7 seconds according to the paper), but initializing a comfy pipeline takes longer than that. Workers would gain efficiency by batching these jobs, and users would get more images for less kudos.
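The efficiency argument above can be sketched with back-of-the-envelope arithmetic. All numbers here are illustrative assumptions, not benchmarks: the pipeline init overhead and per-image generation time are hypothetical placeholders, not measured values from the worker.

```python
def per_image_seconds(batch_size: int, init_s: float = 2.0, gen_s: float = 0.7) -> float:
    """Amortized wall-clock cost per image when a worker batches jobs.

    init_s: hypothetical one-time pipeline initialization overhead.
    gen_s:  hypothetical per-image generation time (LCM 4-step, per the paper).
    """
    return (init_s + batch_size * gen_s) / batch_size

single = per_image_seconds(1)   # pay the full pipeline init for one image
batched = per_image_seconds(8)  # init cost amortized across 8 images

# With these assumed numbers, batching cuts the per-image cost well
# below the unbatched case, even though generation itself is unchanged.
assert batched < single
```

The larger the batch, the closer the per-image cost approaches the raw generation time, which is why sub-second LCM images make batching especially attractive.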
Correct. Batching is very much on my priority list. I'm hoping to get the NLNet funding to work on it soon.
As of Haidra-Org/hordelib@261b259, the packaged version of ComfyUI supports LCM. hordelib now needs the relevant changes to accept it, accounting for the fact that LCM requires a specific sampler to work (see here for more info).
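A minimal sketch of the kind of payload guard this implies: requests that load an LCM lora must also use an LCM-compatible sampler. The field names, the `"lcm"` sampler identifier, and the fix-up behavior here are assumptions for illustration, not hordelib's actual API.

```python
# Hypothetical check: LCM loras only produce usable images with the LCM
# sampler, so mismatched requests are rewritten before dispatch.
LCM_COMPATIBLE_SAMPLERS = {"lcm"}

def check_lcm_payload(payload: dict) -> dict:
    """Return a copy of the payload with an LCM-compatible sampler forced
    whenever an LCM lora is requested."""
    wants_lcm = any("lcm" in name.lower() for name in payload.get("loras", []))
    if wants_lcm and payload.get("sampler_name") not in LCM_COMPATIBLE_SAMPLERS:
        payload = {**payload, "sampler_name": "lcm"}
    return payload

fixed = check_lcm_payload({"loras": ["LCM-LoRA"], "sampler_name": "k_euler"})
assert fixed["sampler_name"] == "lcm"
```

Rewriting the sampler silently (rather than rejecting the job) is one design choice; a worker could equally return an error so the requester learns about the constraint.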
This was resolved as of Haidra-Org/hordelib#132 and #337.
https://huggingface.co/blog/lcm_lora