RuntimeError: One of the differentiated Variables appears to not have been used in the graph #3
Comments
Hi @dragen1860 @gitabcworld, how did you solve this?
Hi, I didn't solve this problem. Have you found any solution?
I changed metaLearner.py, line 174, as follows. It runs, but the gradients for lstm and lstm2 print None. Not sure about it yet.
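The comment above doesn't show the actual edit, but the symptom it describes (gradients printing `None`) matches passing `allow_unused=True` to `torch.autograd.grad`, which returns `None` for any input parameter the loss does not reach. A minimal sketch of that workaround (the variable names here are illustrative, not from metaLearner.py):

```python
import torch

# Two parameters; only `a` participates in the loss.
a = torch.randn(3, requires_grad=True)
b = torch.randn(3, requires_grad=True)
loss = (a * 2).sum()

# With allow_unused=True the call no longer raises; the gradient slot
# for the unused parameter comes back as None instead.
grads = torch.autograd.grad(loss, [a, b], allow_unused=True)
print(grads[0])  # gradient of loss w.r.t. a: a tensor of 2.0s
print(grads[1])  # None, because b never entered the graph
```

Note that this only silences the error; a `None` gradient usually means the parameter is genuinely disconnected from the loss, which may itself be the bug worth fixing.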
@elviswf Does it work now? What is your final precision?
@dragen1860 I just got the script to run, and I think you can too. The final precision is as described above. Some params may need to be changed. I will try it next week, since I'm working on another project. If I find something new I will update this comment.
@elviswf What is your latest progress?
Hi! I am sorry I could not work on this code for long. Like @elviswf, I am really busy with other projects and have not been able to dedicate more time to this one, so any help will be appreciated. I will try the changes @elviswf proposes as soon as possible and see whether they solve the backprop problem.
Has anyone solved this problem ("One of the differentiated Variables appears to not have been used in the graph")?
Have you encountered this problem? |
|
```
Learner nParams: 32901
Traceback (most recent call last):
  File "main.py", line 38, in <module>
    results = importlib.import_module(opt['metaLearner']).run(opt, data)
  File "/home/i/meta/FewShotLearning/model/lstm/train-lstm.py", line 121, in run
    opt['batchSize'][opt['nTrainShot']])
  File "/home/i/conda/envs/py27/lib/python2.7/site-packages/torch/nn/modules/module.py", line 325, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/i/meta/FewShotLearning/model/lstm/metaLearner.py", line 174, in forward
    torch.autograd.grad(loss, self.lstm2.parameters())
  File "/home/i/conda/envs/py27/lib/python2.7/site-packages/torch/autograd/__init__.py", line 158, in grad
    inputs, only_inputs, allow_unused)
RuntimeError: One of the differentiated Variables appears to not have been used in the graph
```
I ran your code in a py27 env, but this error occurs and I don't know what went wrong.
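For context on what triggers this error: `torch.autograd.grad` raises it whenever one of the requested inputs is not part of the computation graph that produced the loss, which is what the traceback shows happening with `self.lstm2.parameters()`. A minimal, self-contained reproduction (illustrative names, not from the repo):

```python
import torch

# One parameter enters the loss; the other is never touched.
w_used = torch.randn(2, requires_grad=True)
w_unused = torch.randn(2, requires_grad=True)  # plays no role below

loss = w_used.pow(2).sum()

try:
    # Asking for gradients w.r.t. an unconnected parameter raises
    # the same RuntimeError reported in this issue.
    torch.autograd.grad(loss, [w_used, w_unused])
except RuntimeError as err:
    print("grad failed:", err)
```

So the real question in metaLearner.py is why some of `self.lstm2`'s parameters never contribute to `loss` in the forward pass.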