Obtaining circuit gradient with Torch #180

Answered by jcmgray
vincentmr asked this question in Q&A

I haven't had time to run through your example properly, but:

  1. If you are using the Model infrastructure, I don't think you need to set requires_grad yourself; that is all handled by setting the attributes as parameters (see the sketch after the snippet below).
  2. Alternatively, you could set requires_grad yourself, simply contract the tensor network, and call x.backward() directly; no need for torch.nn.Module or any abstractions at all:
import torch
import quimb.tensor as qtn

# a small random tensor network: 3 tensors, each of degree 2, bond dimension 2
tn = qtn.TN_rand_reg(3, 2, 2)

for i in range(3):
    if i == 1:
        # or with get_params/set_params to handle PTensors
        tn[f'I{i}'].modify(apply=lambda x: torch.tensor(x, requires_grad=True))
    else:
        tn[f'I{i}'].modify(apply=lambda x: torch.tensor(x))
        
# contract the whole network (a scalar here) and backpropagate
x = tn.contract()
x.backward()
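
After x.backward(), the gradient should then sit on the torch leaf created above. This is a hedged addition, not part of the original reply; it assumes tn['I1'].data still refers to the tensor that was built with requires_grad=True:

# gradient of the contracted value with respect to the trainable tensor
g = tn['I1'].data.grad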

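For the first point, here is a minimal sketch of the torch.nn.Module pattern being referred to, assuming quimb's TensorNetwork.get_params/set_params interface (mentioned in the comment above). The class name TNModel, the optimizer choice and the single training step are illustrative placeholders, not code from this thread:

import torch
import quimb.tensor as qtn


class TNModel(torch.nn.Module):
    """Wrap a quimb tensor network so its arrays become torch Parameters."""

    def __init__(self, tn):
        super().__init__()
        self.tn = tn.copy()
        # registering the arrays as Parameters is what makes requires_grad
        # (and optimizer bookkeeping) automatic
        self.torch_params = torch.nn.ParameterDict({
            str(tid): torch.nn.Parameter(torch.tensor(data))
            for tid, data in self.tn.get_params().items()
        })

    def forward(self):
        # write the current parameters back into the network, then contract
        self.tn.set_params({
            int(tid): p for tid, p in self.torch_params.items()
        })
        return self.tn.contract()


model = TNModel(qtn.TN_rand_reg(3, 2, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

loss = model()     # contract the network to a scalar
loss.backward()    # gradients land on the registered Parameters
optimizer.step()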