
Automatic differentiation #16

Open

prcastro opened this issue Sep 7, 2015 · 1 comment


prcastro commented Sep 7, 2015

This is what Theano does when used in Deep Learning with Python. We could use packages from JuliaDiff to compute the derivatives instead of maintaining a derivative dictionary. While this could slow down differentiation somewhat (and import time, since we would depend on an extra package), the cost shouldn't be large, and it would make it much easier to extend the library with new cost/activation functions.
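To illustrate the difference between the two approaches: a derivative dictionary requires a hand-written derivative for every activation function, while automatic differentiation derives it from the function definition itself. JuliaDiff's forward-mode tools are built on dual numbers; below is a minimal, hypothetical sketch of that idea in Python (the project itself is Julia, and the `Dual` class and `derivative` helper here are illustrative, not part of any JuliaDiff package):

```python
import math

class Dual:
    """Minimal forward-mode AD value: a real part and a derivative part."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __neg__(self):
        return Dual(-self.val, -self.der)

    def __rtruediv__(self, other):
        # d(a / u) = -a * u' / u^2, for a plain number a
        return Dual(other / self.val, -other * self.der / self.val ** 2)

    def exp(self):
        e = math.exp(self.val)
        return Dual(e, e * self.der)

def sigmoid(x):
    # Written once, with no hand-coded derivative anywhere.
    return 1.0 / (1.0 + (-x).exp())

def derivative(f, x):
    # Seed the derivative part with 1.0 and read it back out.
    return f(Dual(x, 1.0)).der

# sigmoid(0) = 0.5, sigmoid'(0) = 0.25 -- obtained without a lookup table
print(derivative(sigmoid, 0.0))  # -> 0.25
```

Adding a new activation function then only requires defining the function itself; its derivative comes for free, which is exactly the extensibility benefit mentioned above.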


prcastro commented Sep 7, 2015

This would probably mean that activation functions would have to be defined as functions of vectors, not of FFNNLayers.
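Concretely, the refactoring suggested above would turn the activation into a plain function of the pre-activation vector, with the layer reduced to data. A hypothetical sketch in Python (function and parameter names are illustrative; `FFNNLayer` is the project's own Julia type):

```python
def relu(z):
    # Activation defined on a plain vector, not on a layer object,
    # so an AD tool can differentiate it directly.
    return [max(v, 0.0) for v in z]

def layer_forward(weights, bias, x, activation):
    # The layer is just (weights, bias); the activation is applied
    # to the pre-activation vector z = W x + b.
    z = [sum(w * xi for w, xi in zip(row, x)) + b_i
         for row, b_i in zip(weights, bias)]
    return activation(z)

W = [[1.0, -1.0], [0.5, 0.5]]
b = [0.0, 0.0]
print(layer_forward(W, b, [1.0, 2.0], relu))  # -> [0.0, 1.5]
```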

@prcastro prcastro added this to the Version 1.0 milestone Sep 9, 2015