Hi,

To be able to use `torch.func.*` (for example, `torch.func.grad`) with torchkbnufft, the `torch.autograd.Function`s in `torchkbnufft/_autograd/interp.py` would need an update to the newer calling signature. See https://pytorch.org/docs/stable/notes/extending.func.html.

In particular, `forward` should no longer take a `ctx` argument. Instead, it should return the output and all tensors etc. that need to be saved to `ctx`, and a third function, `setup_context`, should save them to `ctx`. Also, it would be nice to make use of `vmap` possible.

As this would break compatibility with PyTorch versions <2.0 (released before March 2023), this would require either a new torchkbnufft version with a bump in the required torch version, or an import-time switch depending on the PyTorch version.
I would be willing to prepare a PR for either option.
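For reference, a minimal sketch of the new-style calling signature described above, assuming torch >= 2.0. `MySquare` is a toy example for illustration, not part of torchkbnufft; the real change would apply this layout to the `Function`s in `torchkbnufft/_autograd/interp.py`:

```python
import torch

class MySquare(torch.autograd.Function):
    # Opting in to an auto-generated vmap rule makes torch.vmap work
    # without writing a custom vmap staticmethod.
    generate_vmap_rule = True

    @staticmethod
    def forward(x):
        # New-style forward: no ctx argument, only compute the output.
        return x * x

    @staticmethod
    def setup_context(ctx, inputs, output):
        # Saving to ctx moves into this separate staticmethod.
        (x,) = inputs
        ctx.save_for_backward(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return 2.0 * x * grad_output

# With this layout, torch.func transforms compose with the custom Function:
g = torch.func.grad(MySquare.apply)(torch.tensor(3.0))    # d(x^2)/dx at x=3 -> 6
v = torch.vmap(MySquare.apply)(torch.tensor([1.0, 2.0]))  # batched -> [1., 4.]
```

With the old signature (`forward(ctx, x)`), the `torch.func.grad` call above raises an error instead.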