Answered by fanll, Apr 17, 2024
Replies: 4 comments
- I'm running into the same problem. Have you managed to solve it?
- Hello, I have the same problem as well. Has it been solved?
- Problem solved: reinstalling transformers fixed it.
- To reply to everyone at once: I later got the model to load correctly with the approach below. transformers is the version pinned in requirements.

from transformers import AutoTokenizer, AutoModel
from peft import PeftModel

# Paths to the base checkpoint, tokenizer, and LoRA adapter.
MODEL_PATH = "path/to/base_model"  # placeholder; this variable was undefined in the original snippet
TOKENIZER_PATH = "path/to/tokenizer"
LORA_PATH = "path/to/lora"
device = "cuda"

tokenizer = AutoTokenizer.from_pretrained(TOKENIZER_PATH, trust_remote_code=True)
# Load the base model first, then attach the LoRA adapter on top of it.
model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True, device_map=device).eval()
model = PeftModel.from_pretrained(model, LORA_PATH)
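For a quick sanity check after attaching the adapter, a minimal sketch, assuming the base checkpoint supports the standard generate() API (ChatGLM-style models loaded with trust_remote_code generally do); the prompt and max_new_tokens below are illustrative:

inputs = tokenizer("你好", return_tensors="pt").to(device)
# Generation goes through the PeftModel wrapper, so the LoRA weights are applied.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))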