Great toolbox! Thanks for providing it. I am interested in using `GradParams`. So a question is: are the optimization steps for the coefficients taken care of by the toolbox as soon as we use `GradParams`? Any guidance would be highly appreciated. Many thanks!
Happy to hear you're finding it useful!

Indeed it is that simple: `GradParams` creates a PyTorch `ParameterList` (see lines 43-45 in `constraint.py`), so the coefficients get added to the model parameters and are returned when you call `model.parameters()`. This means the coefficients and the network share the same optimizer (and settings like learning rate), so if you want separate optimizers you'd need to manually specify which of the `model.parameters()` each optimizer should handle.

Good point about a demo, I'll add it to the examples!
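To make the parameter-splitting part concrete, here is a minimal sketch (not the toolbox's actual code; `ConstrainedNet` and its attribute names are hypothetical) of how coefficients stored in an `nn.ParameterList` show up in `model.parameters()` and how you could give them their own optimizer:

```python
import torch
import torch.nn as nn

# Hypothetical module mimicking the described behavior: constraint
# coefficients stored in an nn.ParameterList are registered alongside
# the network's own weights.
class ConstrainedNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(4, 1)
        # Each entry becomes a regular parameter of the module,
        # so it appears in model.parameters().
        self.coeffs = nn.ParameterList(
            [nn.Parameter(torch.zeros(1)) for _ in range(3)]
        )

model = ConstrainedNet()

# Single optimizer: coefficients and network weights share all settings.
opt_all = torch.optim.Adam(model.parameters(), lr=1e-3)

# Separate optimizers: split the parameters manually.
net_params = list(model.net.parameters())
coeff_params = list(model.coeffs.parameters())
opt_net = torch.optim.Adam(net_params, lr=1e-3)
opt_coeffs = torch.optim.SGD(coeff_params, lr=1e-2)
```

With this split, you call `opt_net.step()` and `opt_coeffs.step()` separately after `backward()`, each updating only its own parameter group.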