
Error when low-VRAM mode is enabled #7

Open
BecarefulW opened this issue Nov 10, 2024 · 4 comments

Comments

@BecarefulW

torch=2.4.1
cuda=cu124
xformers=0.0.28.post1

torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 2.25 GiB. GPU 0 has a total capacity of 11.00 GiB of which 0 bytes is free. Of the allocated memory 9.81 GiB is allocated by PyTorch, and 97.56 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)

The error occurs whether low-VRAM mode is on or off; it seems the node cannot be used with 11 GB of VRAM.
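As the OOM message itself suggests, one workaround worth trying is enabling expandable segments in the CUDA caching allocator before launching. A minimal shell sketch; the launch command is illustrative, not from this repo:

```shell
# Reduce allocator fragmentation, as suggested by the OOM error message.
export PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True

# Illustrative launch command; adjust to your own ComfyUI setup.
python main.py --lowvram
```

This does not increase total VRAM, but it can help when a large chunk is "reserved by PyTorch but unallocated" due to fragmentation.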

@chflame163
Owner

11 GB of VRAM: are you using a 2080 card? It has not been tested on 20-series GPUs.

@BecarefulW
Author

> 11 GB of VRAM: are you using a 2080 card? It has not been tested on 20-series GPUs.

Yes, a 2080 Ti. After enabling shared memory, I saw it use 50 GB of shared memory plus 10.5 GB of VRAM, more than 60 GB in total. That is probably caused by some bug; I hope you can fix it.

@chflame163
Owner

It may be related to the card not supporting bf16 precision. I don't have the hardware to debug this.
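For context on the bf16 theory: PyTorch gates native bf16 support on compute capability 8.0 (Ampere) and newer, and `torch.cuda.is_bf16_supported()` reports this at runtime. A minimal pure-Python sketch of that gating rule; the helper name is hypothetical, not from this repo:

```python
# Sketch of PyTorch's bf16 gating rule: native bf16 requires compute
# capability >= 8.0 (Ampere). Helper name is illustrative only.
def supports_bf16(compute_capability):
    major, _minor = compute_capability
    return major >= 8

print(supports_bf16((7, 5)))  # RTX 2080 Ti (Turing, SM 7.5) -> False
print(supports_bf16((8, 6)))  # RTX 3090 (Ampere, SM 8.6) -> True
```

If the node hard-codes bf16 tensors, a 2080 Ti (Turing, SM 7.5) would have to fall back to fp16 or fp32, which fits the symptoms described above.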

@liutianping

My 2080 Ti 22G also frequently runs out of VRAM. And when it does, system RAM usage is very high as well: with 64 GB of RAM, it can reach 98%.
