Why did you remove the defaults? I only see the maximum value, the minimum value, the step, and the current value. Where are the default values for VID, CORE, and so on?
To get the defaults, I have to reset the adapter to factory settings and then read back the current settings.
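Here is roughly the workaround I mean; a minimal sketch, where the two helper functions are placeholders for the real ADLX reset and read calls (the actual interfaces and signatures may differ):

```cpp
#include <cstdio>

struct TuningValues { int coreFreqMHz = 0; int coreVID_mV = 0; int memFreqMHz = 0; };

// Stubs standing in for the real ADLX calls; replace with the actual API.
bool adapterResetToFactory(int /*adapterIndex*/) { return true; }
bool adapterReadCurrent(int /*adapterIndex*/, TuningValues* out)
{ *out = TuningValues{}; return true; }

// Workaround for the missing defaults: reset the adapter to factory
// settings, then read the *current* values back -- those are the defaults.
// Side effect: any user tuning applied before the reset is lost.
bool readDefaults(int adapterIndex, TuningValues* defaults)
{
    if (!adapterResetToFactory(adapterIndex))
        return false;
    return adapterReadCurrent(adapterIndex, defaults);
}

int main()
{
    TuningValues d;
    if (readDefaults(0, &d))
        std::printf("defaults: core %d MHz, VID %d mV, mem %d MHz\n",
                    d.coreFreqMHz, d.coreVID_mV, d.memFreqMHz);
}
```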
To get the VRAM usage as a percentage, I first have to call the function that returns the adapter's static information, including its total memory capacity, and only after that can I poll the current memory usage and compute the percentage myself.
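So one percentage costs two separate calls. A sketch of that flow; the two getters below are placeholders for the static-info call and the per-sample metrics call (e.g. GetCurrentGPUMetrics), not the real ADLX signatures:

```cpp
#include <cstdio>

// Stubs standing in for the real ADLX calls; replace with the actual API.
bool getTotalVRAM_MB(int /*adapterIndex*/, unsigned* totalMB) { *totalMB = 0; return true; }
bool getCurrentVRAM_MB(int /*adapterIndex*/, unsigned* usedMB) { *usedMB = 0; return true; }

// Two calls for one number: static capacity once, then the current
// usage on every polling tick.
double vramUsagePercent(int adapterIndex)
{
    unsigned totalMB = 0, usedMB = 0;
    if (!getTotalVRAM_MB(adapterIndex, &totalMB) || totalMB == 0)
        return -1.0;
    if (!getCurrentVRAM_MB(adapterIndex, &usedMB))
        return -1.0;
    return 100.0 * usedMB / totalMB;
}

int main()
{
    // In real code, cache the total capacity and only poll the usage.
    std::printf("VRAM usage: %.1f%%\n", vramUsagePercent(0));
}
```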
The timing settings are controlled through a mapping. Have you even looked at how many VRAM timing options there are? RDNA2 / RDNA3 have 2; earlier generations have 3. Why a mapping for that?
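For two or three discrete values per generation, a plain enum would have been enough. A sketch of what I mean; the level names here are my own placeholders, not the actual ADLX identifiers:

```cpp
// Placeholder names, not the actual ADLX identifiers: the point is that
// each generation exposes only 2-3 discrete VRAM timing levels.
enum class VramTiming { Default, Fast, Level2 };

// RDNA2 / RDNA3: 2 options.
constexpr VramTiming kRdna2Plus[] = { VramTiming::Default, VramTiming::Fast };

// Earlier generations: 3 options.
constexpr VramTiming kPreRdna2[] = { VramTiming::Default, VramTiming::Fast,
                                     VramTiming::Level2 };
```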
Execution speed: polling all sensors through ADL takes under 1 ms; through ADLX it takes over 50 ms (GetCurrentGPUMetrics alone is >50 ms).
Setting core frequency, core VID, and memory frequency takes over 580 ms. Instead of 16 ms?
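Anyone can reproduce these numbers with a trivial timer around the calls. A sketch of how I measured; pollAllSensors is a stand-in for either the ADL sensor loop or a single ADLX GetCurrentGPUMetrics call:

```cpp
#include <chrono>
#include <cstdio>

// Stand-in for the call being measured: the full ADL sensor loop, or a
// single ADLX GetCurrentGPUMetrics call.
void pollAllSensors() { /* replace with the real polling code */ }

int main()
{
    using clock = std::chrono::steady_clock;

    // Average over many iterations so a single sample isn't noise.
    constexpr int kIters = 100;
    const auto t0 = clock::now();
    for (int i = 0; i < kIters; ++i)
        pollAllSensors();
    const auto t1 = clock::now();

    const double ms =
        std::chrono::duration<double, std::milli>(t1 - t0).count() / kIters;
    std::printf("average poll time: %.3f ms\n", ms);  // ADL: <1 ms, ADLX: >50 ms
}
```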
Was this code written by people who have never seen a video card?
Do you think this is the "new generation" of ADL? It's a shot in the foot.