This is more of a suggestion, but given the limited number of GPUs one can use at a time, I would be curious to see whether we still need GPUs for the heating simulations.
If not, we could limit GPU usage to the acoustic simulations, so that once a job's acoustic simulations are done and its pipeline progresses to heating, the next job can start its acoustic simulations on the freed GPU. That way, more than two heating simulations could run concurrently.
But this is purely speculative; I am not yet sure whether it is possible.
I assume that the heating sims also benefit from the GPU. If that is not the case, we should indeed alter the code. We can already run it as you describe above: the acoustic sims can be submitted with a GPU, and once those are done, the heating sims can be submitted without one.
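As a rough sketch of that staged submission, assuming a SLURM cluster: the acoustic job requests a GPU, while the heating job is queued CPU-only with a dependency on the acoustic job, so the GPU is released for the next job's acoustic stage. The script names and helper functions here are hypothetical; `--gres` and `--dependency` are standard `sbatch` options.

```python
def acoustic_cmd(subject: str) -> list[str]:
    # GPU-bound acoustic stage. `--parsable` makes sbatch print the job ID,
    # which we need for the dependency below. The script name is made up.
    return ["sbatch", "--gres=gpu:1", "--parsable", f"run_acoustic_{subject}.sh"]

def heating_cmd(subject: str, acoustic_job_id: int) -> list[str]:
    # CPU-only heating stage: no GPU request, started only after the
    # acoustic job finishes successfully (afterok).
    return ["sbatch", f"--dependency=afterok:{acoustic_job_id}",
            f"run_heating_{subject}.sh"]
```

With this split, the scheduler can hand the GPU to the next subject's acoustic job as soon as the current acoustic job exits, without waiting for its heating stage.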
Currently, the heating sims can require a large amount of GPU memory when a pulsed repetition scheme is specified. It would make sense to also consider optimizing that computation (we may be creating large 4D matrices somewhere and might be able to drop the time dimension).
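To illustrate the "drop the time dimension" idea with a toy model (not the actual heating code in this repo): an explicit time-stepping scheme does not need the full (t, x, y, z) temperature history unless we want to inspect it afterwards. Keeping only the current 3D field makes memory independent of the number of time steps. The diffusion-plus-source update below is a deliberately simplified stand-in for the real bioheat computation.

```python
import numpy as np

def laplacian(T):
    # 6-point finite-difference Laplacian with periodic wrap (toy boundary).
    return sum(np.roll(T, s, axis=a) for a in range(3) for s in (1, -1)) - 6 * T

def heating_full_history(q, n_steps, dt=0.01, alpha=0.1):
    # Stores the full 4D (t, x, y, z) history: memory grows with n_steps.
    T = np.zeros((n_steps + 1,) + q.shape)
    for t in range(n_steps):
        T[t + 1] = T[t] + dt * (alpha * laplacian(T[t]) + q)
    return T[-1]

def heating_rolling(q, n_steps, dt=0.01, alpha=0.1):
    # Keeps only the current 3D field: constant memory in n_steps,
    # same final temperature as the full-history version.
    T = np.zeros(q.shape)
    for _ in range(n_steps):
        T = T + dt * (alpha * laplacian(T) + q)
    return T
```

For a long pulsed repetition scheme, the difference is the factor `n_steps` in the size of the temperature array, which is exactly where a 4D allocation would blow up GPU memory.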