
What version of flash-attn does Telechat require? #64

Open
yishengduwu opened this issue Aug 14, 2024 · 3 comments

Comments

@yishengduwu

Running pip install -r requirements.txt gets stuck at "Building wheel for flash-attn (setup.py) ..." and still has not finished after 2 hours.
How can I resolve this?
Also, if I install flash-attn separately, which version does Telechat require?
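Before retrying the long wheel build, it can help to check whether a usable flash-attn is already present in the environment. A minimal sketch (the package name `flash-attn` / module name `flash_attn` are the upstream conventions; nothing here is Telechat-specific):

```python
import importlib.util
from importlib import metadata

def flash_attn_version():
    """Return the installed flash-attn version string, or None if absent."""
    if importlib.util.find_spec("flash_attn") is None:
        return None
    try:
        return metadata.version("flash-attn")
    except metadata.PackageNotFoundError:
        return None

print(flash_attn_version())
```

If nothing is installed, the flash-attn README recommends `pip install flash-attn --no-build-isolation`, and setting `MAX_JOBS` to a small number when the build machine has limited RAM; prebuilt wheels on the flash-attn GitHub releases page avoid compiling entirely.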

@ge-xing

ge-xing commented Aug 19, 2024

You can try installing the latest version of flash-attn directly on its own.

@huyuan-cn

Just turn it off. Copy https://github.com/lilongxian/telechat-chat-openai/blob/main/telechat/modeling_telechat.py over your local file as a replacement, then edit the config file to disable flash-attn.
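The "edit the config file" step above can be scripted. A minimal sketch, assuming the switch lives in a Hugging Face-style `config.json` under a key such as `flash_attn` (the exact key name in Telechat's config is an assumption; check your checkout's config file):

```python
import json

def disable_flash_attn(config_path, key="flash_attn"):
    """Flip the (assumed) flash-attn switch in a JSON config to False."""
    with open(config_path, encoding="utf-8") as f:
        cfg = json.load(f)
    cfg[key] = False  # key name is an assumption, not confirmed for Telechat
    with open(config_path, "w", encoding="utf-8") as f:
        json.dump(cfg, f, indent=2, ensure_ascii=False)
    return cfg

# demo on a throwaway config
with open("demo_config.json", "w", encoding="utf-8") as f:
    json.dump({"model_type": "telechat", "flash_attn": True}, f)
print(disable_flash_attn("demo_config.json")["flash_attn"])  # False
```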

@AIlvxiaobu

> Just turn it off. Copy https://github.com/lilongxian/telechat-chat-openai/blob/main/telechat/modeling_telechat.py over your local file as a replacement, then edit the config file to disable flash-attn.

But the contents of modeling_telechat.py get automatically overwritten every time I run it.
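The overwriting typically happens because `trust_remote_code=True` re-fetches the modeling file into the cache on each load; loading from a fixed local directory is the cleaner fix. As a blunt stopgap, you can also strip the write bits so a later copy cannot silently replace your patched file. A sketch (demonstrated on a placeholder file, not the real checkpoint):

```python
import os
import stat

def make_read_only(path):
    """Remove all write permission bits so a later download/copy fails
    instead of silently overwriting the patched file."""
    mode = os.stat(path).st_mode
    os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))

# demo on a placeholder standing in for modeling_telechat.py
with open("modeling_placeholder.py", "w") as f:
    f.write("# patched modeling file\n")
make_read_only("modeling_placeholder.py")
print(bool(os.stat("modeling_placeholder.py").st_mode & stat.S_IWUSR))  # False
```

Remember to restore write permission (`chmod u+w`) before applying intentional updates.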

4 participants