I just stumbled upon the following problem. When creating a tensor from a scalar or a sequence with `requiresGrad=true`, the value of the argument is lost. The following code exposes the problem:
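The original snippet did not survive in this copy of the issue; the following is only a minimal sketch of the kind of code that would reproduce the report, assuming a Storch-style `Tensor` factory that accepts a `requiresGrad` argument (the exact factory name and signature are assumptions, not taken from the source):

```scala
import torch.*

@main def repro(): Unit =
  // Create a tensor from a scalar, requesting gradient tracking at creation.
  val x = Tensor(2.0f, requiresGrad = true)
  // Reported bug: the flag passed to the factory is lost,
  // so this prints false instead of the expected true.
  println(x.requiresGrad)

  // The same loss of the flag is reported when creating from a sequence.
  val y = Tensor(Seq(1.0f, 2.0f, 3.0f), requiresGrad = true)
  println(y.requiresGrad)
```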
Setting the parameter explicitly with `x.requiresGrad(true)` works.
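A hedged sketch of that workaround, again assuming a Storch-style API where `requiresGrad(Boolean)` mutates the flag on an existing tensor (the method name is taken from the issue text, the rest is illustrative):

```scala
import torch.*

@main def workaround(): Unit =
  // The flag passed at creation time is lost per the report,
  // so create the tensor first...
  val x = Tensor(2.0f)
  // ...and then enable gradient tracking explicitly, which works.
  x.requiresGrad(true)
  println(x.requiresGrad)
```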
I looked into the problem but could not spot anything wrong. The flag seems to get lost in this line.
The problem persists when using the latest PyTorch version (2.5.1). Does anybody know what the problem could be?
marcelluethi changed the title from "requiresGrad lost when creating Tensors from Scalars of Sequences" to "requiresGrad lost when creating tensors from scalars or sequences" on Dec 22, 2024.