SLES 15, RTX 2070 + 3060, CUDA 11.8. The error occurs as follows: User: hello ChatGLM3: Traceback (most recent call last):
Replies: 1 comment
Resolved.
The cause was that the automatically installed transformers (4.40.0) and torch (2.2.2) were too new, so FlashAttention was invoked automatically. Downgrading them to 4.31.0 and 2.0.1 respectively solved the problem.
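For reference, a minimal sketch of how one might pin and verify the version combination reported to work here (transformers 4.31.0, torch 2.0.1); the exact pip invocation and the assertion thresholds are assumptions based on this reply, not part of the ChatGLM3 docs:

```python
# Sketch: confirm the environment matches the working combination.
# Downgrade first if needed, e.g.:
#   pip install transformers==4.31.0 torch==2.0.1
import torch
import transformers

# torch.__version__ may carry a CUDA suffix such as "2.0.1+cu118",
# so only the prefix is checked.
assert transformers.__version__ == "4.31.0", transformers.__version__
assert torch.__version__.startswith("2.0.1"), torch.__version__
print("Version check passed:", transformers.__version__, torch.__version__)
```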