python -m llava.serve.model_worker --host 0.0.0.0 --controller http://localhost:10000 --port 40000 --worker http://localhost:40000 --model-path liuhaotian/llava-v1.5-13b

[2023-11-23 15:10:56,220] [INFO] [real_accelerator.py:110:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 187, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/usr/lib/python3.10/runpy.py", line 110, in _get_module_details
    __import__(pkg_name)
  File "/workspace/BakLLaVA/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "/workspace/BakLLaVA/llava/model/__init__.py", line 2, in <module>
    from .language_model.llava_mistral import LlavaMistralForCausalLM, LlavaConfig
  File "/workspace/BakLLaVA/llava/model/language_model/llava_mistral.py", line 22, in <module>
    from transformers import AutoConfig, AutoModelForCausalLM, \
ImportError: cannot import name 'MistralConfig' from 'transformers' (/usr/local/lib/python3.10/dist-packages/transformers/__init__.py)
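For reference, MistralConfig only exists in newer transformers releases, so a quick way to confirm the installed version is the problem is to print it directly (a minimal diagnostic one-liner, assuming the same Python environment the worker runs in):

python -c "import transformers; print(transformers.__version__)"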
I had this same issue. Just run pip install transformers==4.34.0 and it should work:
pip install transformers==4.34.0
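After pinning the version, you can verify the import the worker needs before relaunching it (a small verification sketch, not part of the repo):

python -c "from transformers import MistralConfig; print('MistralConfig OK')"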
Thanks, will try.