An interp that supports gradient #194

Open: rainwoodman wants to merge 6 commits into master
Conversation

rainwoodman (Contributor) commented on Mar 3, 2017

I am not sure if this is the right approach. It is impossible (at least when I fiddled with it) to use numpy.interp directly as the implementation of its own vjp, because the width of the windows must be fixed. I have therefore put up a reimplementation of interp as a matrix product. I am not using a sparse matrix because I don't know whether it is appropriate to pull scipy.sparse into a numpy gradient.
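
The PR's code itself isn't shown in this thread, but the matrix-product idea can be sketched like this (my illustration, not the PR's code; `interp_matrix` is a made-up name, and it assumes a 1-D query array and a strictly increasing `xp` with `period=None`):

```python
import numpy as np

def interp_matrix(x, xp):
    # Dense weight matrix W such that np.interp(x, xp, yp) == W @ yp.
    x = np.atleast_1d(np.asarray(x, dtype=float))
    xp = np.asarray(xp, dtype=float)
    # Left knot of the interval containing each query point.
    j = np.clip(np.searchsorted(xp, x) - 1, 0, len(xp) - 2)
    # Fractional position within the interval; clipping to [0, 1] reproduces
    # np.interp's clamping to yp[0] / yp[-1] outside the range of xp.
    w = np.clip((x - xp[j]) / (xp[j + 1] - xp[j]), 0.0, 1.0)
    W = np.zeros((len(x), len(xp)))
    rows = np.arange(len(x))
    W[rows, j] = 1.0 - w
    W[rows, j + 1] = w
    return W
```

With this, interp(x, xp, yp) becomes interp_matrix(x, xp) @ yp: the output is an ordinary matrix product, so it is differentiable in yp by construction.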

The current version only propagates the gradient through the yp argument. Adding the others is trivial: just use the derivative of W instead of W itself. Adding support for the period != None mode should be possible too.
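
Concretely, since the output is linear in yp and W does not depend on yp, the vjp with respect to yp is just multiplication by the transpose (again a sketch, built on the hypothetical `interp_matrix` above):

```python
def interp_vjp_yp(g, x, xp):
    # y = W @ yp, with W constant in yp, so the cotangent g maps back as W.T @ g.
    return interp_matrix(x, xp).T @ g
```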

But we need to convince ourselves that this is the right approach for supporting non-trivial gradients of numpy functions. I think numpy.bincount, at least, will be similar to this; there may be more on the horizon.
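
To illustrate the analogy (my example, not code from the PR): np.bincount(idx, weights) is also linear in its weights argument, via an implicit 0/1 scatter matrix, so its vjp with respect to weights is a plain gather:

```python
def bincount_vjp_weights(g, idx):
    # bincount(idx, weights)[k] sums weights[i] over all i with idx[i] == k,
    # so d(output)/d(weights[i]) picks out g at bin idx[i].
    return g[idx]
```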

This code could also be expanded to handle other real-space, convolution-based operators (with finite support), I think.

Commits (messages from the timeline):

- It shall give identical results to numpy's version, but this computes the matrix, and is thus differentiable.
- import as done in numpy.numpy_grads.
rainwoodman (Contributor, Author) commented:

@duvenaud: opinions?

duvenaud (Contributor) commented on Mar 6, 2017

This is a reasonable approach, but as written the PR replaces numpy's interp() with a slightly incomplete implementation. I would suggest keeping your implementation of interp but hiding it, and using its gradient to define the gradient of numpy's interp.
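
In today's autograd, that suggestion would look roughly like the sketch below. It uses the modern autograd.extend API, which postdates this thread, plus the hypothetical `interp_matrix` helper from above; it is an illustration of the suggestion, not the PR's code:

```python
import numpy as np
from autograd.extend import primitive, defvjp

@primitive
def interp(x, xp, yp):
    # The forward pass delegates to numpy's complete, fast implementation...
    return np.interp(x, xp, yp)

# ...while the hidden matrix-product form supplies the gradient:
# interp(x, xp, yp) == interp_matrix(x, xp) @ yp, so the vjp w.r.t. yp is W.T @ g.
defvjp(interp,
       None,  # gradient w.r.t. x not defined in this sketch
       None,  # gradient w.r.t. xp not defined in this sketch
       lambda ans, x, xp, yp: lambda g: interp_matrix(x, xp).T @ g)
```

This keeps numpy's interp semantics (including the corner cases a reimplementation might miss) on the forward pass, and uses the matrix form only where differentiability matters.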

rainwoodman (Contributor, Author) commented on Mar 6, 2017

Yes. I was playing with the idea too.

Last time I tried, I could not import vector_product_jacobian from autograd because of a circular-import-like issue: the gradient definition would live in anp, but the wrapper module imports anp too. I could localize the import, but it looked ugly.
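
For what it's worth, "localizing the import" means deferring it to call time, something like the sketch below (make_vjp is autograd's public helper today; `_interp_impl` stands for the hidden matrix-product reimplementation and reuses the hypothetical `interp_matrix` from above):

```python
import autograd.numpy as anp

def _interp_impl(x, xp, yp):
    # Hidden reimplementation: a plain matrix product, differentiable in yp.
    return anp.dot(interp_matrix(x, xp), yp)

def grad_interp_yp(ans, x, xp, yp):
    # Importing inside the function sidesteps the circular dependency:
    # by the time a gradient is actually requested, the autograd package
    # is fully initialized, even though it was still mid-import when this
    # module was first loaded.
    from autograd import make_vjp
    vjp, _ = make_vjp(_interp_impl, 2)(x, xp, yp)  # vjp w.r.t. yp (argnum 2)
    return vjp
```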

  • Is there another function I can use to pull out the gradient of an internal function?

  • The convenience wrapper module does look heavier than I imagined. I wonder whether there is a consensus to move the non-numpy wrappers into a new module, e.g. autograd.api, and whether that is possible at all.
