
GradParams feature as constraint #62

Answered by GJBoth
goyalpike asked this question in Q&A

Happy to hear you're finding it useful!
Indeed it is that simple: GradParams creates a PyTorch ParameterList (see lines 43-45 in constraint.py), so the coefficients get added to the model's parameters and are returned when you call model.parameters(). This means the coefficients and the network share the same optimizer (and settings like the learning rate); if you want separate optimizers, you'd need to manually specify which of the model.parameters() each optimizer should update.
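
To make that concrete, here is a minimal sketch (the class and attribute names here are illustrative, not the actual constraint.py code) of how registering coefficients in an nn.ParameterList exposes them through model.parameters(), and how you could manually split the parameters to give the coefficients their own optimizer:

```python
import torch

class GradParamsSketch(torch.nn.Module):
    """Illustrative stand-in for the GradParams constraint."""

    def __init__(self, n_terms: int, n_outputs: int):
        super().__init__()
        # Registering the coefficients in an nn.ParameterList makes them
        # show up in model.parameters() alongside the network weights.
        self.coeff_vectors = torch.nn.ParameterList(
            [torch.nn.Parameter(torch.randn(n_terms, 1)) for _ in range(n_outputs)]
        )

class Model(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.network = torch.nn.Linear(2, 1)  # stand-in for the function approximator
        self.constraint = GradParamsSketch(n_terms=5, n_outputs=1)

model = Model()

# Default setup: one optimizer over everything, so the coefficients and
# the network share the same learning rate and other settings.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Separate optimizers: split model.parameters() by submodule so the
# coefficients can get their own learning rate.
net_opt = torch.optim.Adam(model.network.parameters(), lr=1e-3)
coeff_opt = torch.optim.Adam(model.constraint.parameters(), lr=1e-2)
```

With the split version you'd call zero_grad() and step() on both optimizers after a single backward pass; each one only updates its own parameter group.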

Good point about a demo, I'll add it to the examples!
