[2023-10-28 15:58:46,969] [INFO] [real_accelerator.py:110:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
  File "/home/artur/anaconda3/envs/llava/lib/python3.10/runpy.py", line 187, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/home/artur/anaconda3/envs/llava/lib/python3.10/runpy.py", line 110, in _get_module_details
    __import__(pkg_name)
  File "/T8/ai_soft/BakLLaVA/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "/T8/ai_soft/BakLLaVA/llava/model/__init__.py", line 3, in <module>
    from .language_model.llava_mpt import LlavaMPTForCausalLM, LlavaMPTConfig
  File "/T8/ai_soft/BakLLaVA/llava/model/language_model/llava_mpt.py", line 26, in <module>
    from .mpt.modeling_mpt import MPTConfig, MPTForCausalLM, MPTModel
  File "/T8/ai_soft/BakLLaVA/llava/model/language_model/mpt/modeling_mpt.py", line 19, in <module>
    from .hf_prefixlm_converter import add_bidirectional_mask_if_missing, convert_hf_causal_lm_to_prefix_lm
  File "/T8/ai_soft/BakLLaVA/llava/model/language_model/mpt/hf_prefixlm_converter.py", line 15, in <module>
    from transformers.models.bloom.modeling_bloom import _expand_mask as _expand_mask_bloom
ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' (/T8/ai_soft/BakLLaVA/.venv/lib/python3.10/site-packages/transformers/models/bloom/modeling_bloom.py)
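This first failure comes from the vendored MPT code: `_expand_mask` is a private helper that newer transformers releases removed from the per-model modules, so the import in `hf_prefixlm_converter.py` no longer resolves. A minimal sketch of a compatibility guard, assuming the prefix-LM conversion can be skipped when the helper is missing (a hypothetical workaround, not the upstream fix):

```python
# Hypothetical guard for hf_prefixlm_converter.py: older transformers still exposes
# the private _expand_mask helper, newer releases no longer do.
try:
    from transformers.models.bloom.modeling_bloom import _expand_mask as _expand_mask_bloom
except ImportError:
    # Helper was removed upstream; any code path that relies on it
    # would need to be disabled or reimplemented against the new API.
    _expand_mask_bloom = None
```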
[2023-10-28 16:22:15,589] [INFO] [real_accelerator.py:110:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
  File "/home/artur/anaconda3/envs/llava/lib/python3.10/runpy.py", line 187, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/home/artur/anaconda3/envs/llava/lib/python3.10/runpy.py", line 110, in _get_module_details
    __import__(pkg_name)
  File "/T8/ai_soft/BakLLaVA/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "/T8/ai_soft/BakLLaVA/llava/model/__init__.py", line 2, in <module>
    from .language_model.llava_mistral import LlavaMistralForCausalLM, LlavaConfig
  File "/T8/ai_soft/BakLLaVA/llava/model/language_model/llava_mistral.py", line 22, in <module>
    from transformers import AutoConfig, AutoModelForCausalLM, \
ImportError: cannot import name 'MistralConfig' from 'transformers' (/T8/ai_soft/BakLLaVA/.venv/lib/python3.10/site-packages/transformers/__init__.py)
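The second failure points the other way: `MistralConfig` only exists in transformers releases that include Mistral support (4.34.0 or newer, as far as I can tell), so the copy installed in the .venv looks too old for `llava_mistral.py`. A quick check from the same interpreter shows which case applies:

```python
# Sanity check of the installed transformers version
# (assumption: MistralConfig first shipped with transformers 4.34.0).
import transformers

print("transformers version:", transformers.__version__)
try:
    from transformers import MistralConfig  # noqa: F401 -- imported only to test availability
    print("MistralConfig is available")
except ImportError:
    print("MistralConfig is missing -> this transformers release predates Mistral support")
```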
The two errors above come up when I try to run BakLLaVA; the llava package fails to import.
I think they're related to these changes:
huggingface/transformers@ac58937
Steps:
Ubuntu 22.10 VM on Proxmox