GPU memory display issue #7
@archlitchi Is the “context memory required to manage the vGPU” reserved memory? Is there any way to hide the 112M in the display? It is a bit misleading for users.
To silence the warning, set the environment variable LIBCUDA_LOG_LEVEL=0. The 112M shown will not be changed for now, though: managing the vGPU really does require that much device memory, so reporting it as 0 would not be accurate.
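Since the suggestion above is just an environment variable, here is a minimal sketch of one way to apply it, assuming the variable only has to be present in the process environment before the CUDA stack loads; `./my_cuda_app` is a hypothetical placeholder for the actual workload, and in Kubernetes the variable would normally be set in the container's env section of the pod spec instead:

```python
import os
import subprocess

# Sketch: launch a workload with the vGPU log level lowered so the
# "context memory" warning is suppressed. "./my_cuda_app" is only a
# placeholder for whatever program runs inside the container.
env = dict(os.environ, LIBCUDA_LOG_LEVEL="0")
subprocess.run(["./my_cuda_app"], env=env, check=True)
```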
From a user's point of view, the 112M is confusing: 112M of device memory is consumed even though nothing has been run yet.
@qifengz Let's take this discussion to Slack.
@qifengz Just add me on WeChat: xuanzong4493
Running nvidia-smi inside the container returns the following:
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 440.64.00    Driver Version: 440.64.00    CUDA Version: 10.2     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  Tesla V100-SXM2...  On   | 00000000:00:0A.0 Off |                    0 |
| N/A   36C    P0    42W / 300W |    112MiB / 16160MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+
Memory-Usage: 112MiB / 16160MiB
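For completeness, here is a minimal sketch of reading the same counters programmatically with the pynvml bindings (an assumption of this sketch; the thread itself only uses nvidia-smi). If the vGPU layer exposes memory through NVML the same way nvidia-smi sees it, the ~112MiB context overhead should appear in `info.used` even with no processes running:

```python
import pynvml  # assumes the pynvml package is installed in the container

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
# NVML reports bytes; convert to MiB to compare with the nvidia-smi table.
print(f"used:  {info.used  / 1024**2:.0f} MiB")
print(f"total: {info.total / 1024**2:.0f} MiB")
pynvml.nvmlShutdown()
```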