
Cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' #7

Open
ArturFormella opened this issue Oct 28, 2023 · 2 comments


ArturFormella commented Oct 28, 2023

I have the following error:

[2023-10-28 15:58:46,969] [INFO] [real_accelerator.py:110:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
  File "/home/artur/anaconda3/envs/llava/lib/python3.10/runpy.py", line 187, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/home/artur/anaconda3/envs/llava/lib/python3.10/runpy.py", line 110, in _get_module_details
    __import__(pkg_name)
  File "/T8/ai_soft/BakLLaVA/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "/T8/ai_soft/BakLLaVA/llava/model/__init__.py", line 3, in <module>
    from .language_model.llava_mpt import LlavaMPTForCausalLM, LlavaMPTConfig
  File "/T8/ai_soft/BakLLaVA/llava/model/language_model/llava_mpt.py", line 26, in <module>
    from .mpt.modeling_mpt import MPTConfig, MPTForCausalLM, MPTModel
  File "/T8/ai_soft/BakLLaVA/llava/model/language_model/mpt/modeling_mpt.py", line 19, in <module>
    from .hf_prefixlm_converter import add_bidirectional_mask_if_missing, convert_hf_causal_lm_to_prefix_lm
  File "/T8/ai_soft/BakLLaVA/llava/model/language_model/mpt/hf_prefixlm_converter.py", line 15, in <module>
    from transformers.models.bloom.modeling_bloom import _expand_mask as _expand_mask_bloom
ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' (/T8/ai_soft/BakLLaVA/.venv/lib/python3.10/site-packages/transformers/models/bloom/modeling_bloom.py)

I think it's related to these changes:
huggingface/transformers@ac58937
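That commit removes private helpers that BakLLaVA's vendored `hf_prefixlm_converter.py` imports directly. One defensive pattern for code that depends on private library internals is to try each known location in turn instead of a single hard import. A minimal sketch of that fallback lookup (hypothetical helper, not BakLLaVA's actual code; the stdlib modules in the usage line are stand-ins for the transformers paths in the traceback):

```python
import importlib

def resolve_first_attr(candidates):
    """Return the first (module_name, attr_name) pair that resolves.

    Useful when a private helper such as `_expand_mask` moves between
    library releases: try the old location first, then the new one.
    """
    for module_name, attr_name in candidates:
        try:
            module = importlib.import_module(module_name)
        except ImportError:
            continue  # module gone in this release; try the next candidate
        attr = getattr(module, attr_name, None)
        if attr is not None:
            return attr
    raise ImportError(f"none of {candidates!r} could be resolved")

# Illustrative usage with stdlib stand-ins: the first candidate is
# missing (like the removed `_expand_mask`), so the fallback is used.
fn = resolve_first_attr([("math", "no_such_helper"), ("math", "sqrt")])
```

This only papers over the breakage, of course; pinning the transformers version (see below in the thread) is the simpler fix.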

Steps:

conda create -n llava python=3.10 -y
conda activate llava
pip install --upgrade pip # enable PEP 660 support
pip install -e .
pip uninstall transformers
pip install git+https://github.com/huggingface/transformers
ls .venv/lib/python3.10/site-packages/transformers/models/bloom/modeling_bloom.py
.venv/lib/python3.10/site-packages/transformers/models/bloom/modeling_bloom.py

python -m llava.serve.model_worker --host 0.0.0.0 --controller http://localhost:10000 --port 40001 --worker http://localhost:40001 --model-path /T8/ai_files/gpt_models/SkunkworksAI_BakLLaVA-1

nvidia-smi -L
GPU 0: NVIDIA GeForce RTX 3090 (UUID: GPU-c632cd8d-fdb2-5295-e950-ddb18636c332)

Ubuntu 22.10 VM on Proxmox


ArturFormella commented Oct 28, 2023

> pip install transformers==4.31.0
> python -m llava.serve.model_worker --host 0.0.0.0 --controller http://localhost:10000 --port 40001 --worker http://localhost:40001 --model-path /T8/ai_files/gpt_models/SkunkworksAI_BakLLaVA-1
[2023-10-28 16:22:15,589] [INFO] [real_accelerator.py:110:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
  File "/home/artur/anaconda3/envs/llava/lib/python3.10/runpy.py", line 187, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/home/artur/anaconda3/envs/llava/lib/python3.10/runpy.py", line 110, in _get_module_details
    __import__(pkg_name)
  File "/T8/ai_soft/BakLLaVA/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "/T8/ai_soft/BakLLaVA/llava/model/__init__.py", line 2, in <module>
    from .language_model.llava_mistral import LlavaMistralForCausalLM, LlavaConfig
  File "/T8/ai_soft/BakLLaVA/llava/model/language_model/llava_mistral.py", line 22, in <module>
    from transformers import AutoConfig, AutoModelForCausalLM, \
ImportError: cannot import name 'MistralConfig' from 'transformers' (/T8/ai_soft/BakLLaVA/.venv/lib/python3.10/site-packages/transformers/__init__.py)

I have no idea what to do.

ArturFormella commented

It looks like 4.34.0 is the latest transformers release that works with BakLLaVA:

pip install transformers==4.34.0
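To fail fast with a readable message instead of a deep `ImportError`, a project can check the installed version at startup. A minimal sketch (the guard function is hypothetical, not part of BakLLaVA), assuming plain `major.minor.patch` version strings:

```python
def version_tuple(v: str) -> tuple:
    """Parse a plain 'major.minor.patch' string into a tuple of ints."""
    return tuple(int(part) for part in v.split(".")[:3])

def check_transformers(installed: str, required: str = "4.34.0") -> None:
    """Raise early with an actionable message if the pin is not met."""
    if version_tuple(installed) != version_tuple(required):
        raise RuntimeError(
            f"BakLLaVA needs transformers=={required}, found {installed}; "
            f"run: pip install transformers=={required}"
        )

# In practice you would pass transformers.__version__ here:
check_transformers("4.34.0")  # matching pin, passes silently
```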
