Issues: chflame163/ComfyUI_OmniGen_Wrapper
Phi3Transformer does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. Please request the support for this architecture: https://github.com/huggingface/transformers/issues/28005. If you believe this error is a bug, please open an issue in the Transformers GitHub repository and meanwhile load your model with the argument attn_implementation="eager". Example: model = AutoModel.from_pretrained("openai/whisper-tiny", attn_implementation="eager")
#11 opened Nov 22, 2024 by flyricci
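The error text above suggests its own workaround: pass attn_implementation="eager" when loading the model. A minimal sketch of that workaround, building the loading kwargs separately so the actual from_pretrained call (which downloads model weights) is only shown in a comment:

```python
# Workaround sketch for the Phi3Transformer attention error above.
# The attn_implementation="eager" flag comes straight from the error
# message; everything else here is illustrative.

def eager_load_kwargs(extra=None):
    """Build keyword arguments that force eager attention.

    Merge the eager-attention flag with any other kwargs the caller
    wants to pass to from_pretrained.
    """
    kwargs = {"attn_implementation": "eager"}
    if extra:
        kwargs.update(extra)
    return kwargs

# Intended usage (requires the transformers package; not executed here):
#   from transformers import AutoModel
#   model = AutoModel.from_pretrained("openai/whisper-tiny", **eager_load_kwargs())
```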
total images must be the same as the number of image tags, got 1 image tags and 2 images
#4 opened Nov 7, 2024 by cardenluo
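This error indicates that the number of image placeholders in the prompt must equal the number of input images. A small validation sketch, assuming a placeholder syntax like `<img><|image_1|></img>` (the exact tag format used by OmniGen is an assumption here):

```python
import re

# Assumed placeholder syntax; adjust the pattern to the actual tag
# format your prompt uses.
IMAGE_TAG = re.compile(r"<img><\|image_\d+\|></img>")

def check_image_count(prompt, images):
    """Raise the same kind of error as issue #4 if counts mismatch."""
    n_tags = len(IMAGE_TAG.findall(prompt))
    if n_tags != len(images):
        raise ValueError(
            f"total images must be the same as the number of image tags, "
            f"got {n_tags} image tags and {len(images)} images"
        )
```

Running this check before invoking the pipeline surfaces the mismatch early, e.g. a prompt with one tag and two images fails with the message quoted in the issue title.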
OmniGen.from_pretrained() got an unexpected keyword argument 'quantize'
#2 opened Nov 6, 2024 by TAYLENHE
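"Unexpected keyword argument" errors like this typically mean the installed library version does not accept that parameter. One generic way to diagnose or guard against this, independent of OmniGen's actual API, is to inspect the callable's signature before passing kwargs; the `load` function below is a hypothetical stand-in:

```python
import inspect

def supported_kwargs(func):
    """Return the names of keyword parameters a callable accepts."""
    sig = inspect.signature(func)
    return {
        name for name, p in sig.parameters.items()
        if p.kind in (p.POSITIONAL_OR_KEYWORD, p.KEYWORD_ONLY)
    }

def filter_kwargs(func, kwargs):
    """Drop keyword arguments the callable does not accept.

    Note: if func takes **kwargs, this filter is overly strict;
    it is meant for plain signatures like the one in issue #2.
    """
    allowed = supported_kwargs(func)
    return {k: v for k, v in kwargs.items() if k in allowed}

# Hypothetical stand-in for a loader whose installed version
# lacks a 'quantize' parameter:
def load(model_path, device="cpu"):
    return (model_path, device)
```

With this in place, `filter_kwargs(load, {"device": "cuda", "quantize": True})` silently drops the unsupported `quantize` key instead of raising a TypeError.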