When starting `finetune_qlora.sh` with `transformers==4.34.0`, it crashes with:

```
TypeError: forward() got an unexpected keyword argument 'padding_mask'
```

Environment: bitsandbytes 0.41.0, peft 0.4.0, torch 2.0.1, torchvision 0.15.2, transformers 4.34.0.
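One stopgap for this class of crash is to filter out keyword arguments a `forward()` does not accept before they reach it. A minimal sketch of the idea in plain Python follows; the `forward` below is a stand-in for the model's attention forward, not the actual repo code:

```python
import inspect


def drop_unexpected_kwargs(fn):
    """Wrap fn so keyword arguments its signature does not accept
    are silently dropped instead of raising TypeError."""
    accepted = set(inspect.signature(fn).parameters)

    def wrapper(*args, **kwargs):
        filtered = {k: v for k, v in kwargs.items() if k in accepted}
        return fn(*args, **filtered)

    return wrapper


# Stand-in for a custom attention forward() written before newer
# transformers releases started passing a padding_mask kwarg.
def forward(hidden_states, attention_mask=None):
    return hidden_states


safe_forward = drop_unexpected_kwargs(forward)
# padding_mask is dropped rather than crashing with TypeError.
out = safe_forward([1, 2, 3], attention_mask=None, padding_mask="extra")
print(out)  # → [1, 2, 3]
```

Applied to the real model, the wrapper would go around the offending attention module's `forward`; whether that is safe depends on whether the dropped `padding_mask` actually carried information the module needs.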
Updating to `transformers==4.35.0` produces a new error:

```
ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom'
```
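`_expand_mask` is a private helper that later transformers releases moved or removed, so code importing it breaks on upgrade. The usual sidestep is to pin transformers to the release the training scripts were written against rather than chase newer versions. Assuming the repo targets the 4.31 line (check its own requirements before copying this), the pin would look like:

```
transformers==4.31.0
```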
Updating to `transformers==4.36.0` or `4.37.0` raises yet another error:

```
ValueError: 'llava' is already used by a Transformers config, pick another name.
```
Changing the name used in the `AutoConfig` registration brings back the same `padding_mask` error as before.
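For completeness, the renaming step looks roughly like this. `MyLlavaConfig` and the `"my_llava"` model type are hypothetical names; the key constraint is that the class's `model_type` must match the string passed to `AutoConfig.register` and must not collide with a type transformers already ships (`"llava"` is taken as of 4.36):

```python
from transformers import AutoConfig, PretrainedConfig


class MyLlavaConfig(PretrainedConfig):
    # model_type must match the name passed to AutoConfig.register,
    # and must differ from any built-in type such as "llava".
    model_type = "my_llava"


# Register the custom config under the new, non-colliding name.
AutoConfig.register("my_llava", MyLlavaConfig)

# The registered type can now be resolved through AutoConfig.
cfg = AutoConfig.for_model("my_llava")
print(cfg.model_type)  # → my_llava
```

A matching `register` call on the relevant auto-model class (e.g. `AutoModelForCausalLM.register(MyLlavaConfig, MyLlavaModel)`) is typically needed as well so the model class resolves under the new type; renaming alone only clears the registry collision, not the underlying version mismatch.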
Does anyone know how to fix this problem?
I have the same problem, have you solved it? Thank you!