Porting to Cython #7
Is it possible for you to port this into Cython and release it as a PyPI package? It would be easy for existing DL users (TensorFlow and PyTorch users) to use it natively in their code.
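For a sense of what such a port would involve, here is a minimal packaging sketch. The package name slide_cython, the source file core.pyx, and the PyPI name slide-cython are all placeholders assumed for illustration; none of them exist in this repository.

```python
# setup.py -- minimal packaging sketch for a hypothetical Cython port.
# Module name and source file are placeholders, not part of this repo.
from setuptools import setup, Extension
from Cython.Build import cythonize
import numpy as np

extensions = [
    Extension(
        "slide_cython.core",              # hypothetical extension module
        ["slide_cython/core.pyx"],        # hypothetical Cython source
        include_dirs=[np.get_include()],  # only needed if the .pyx uses the NumPy C API
        extra_compile_args=["-O3"],
    )
]

setup(
    name="slide-cython",                  # placeholder PyPI name
    version="0.0.1",
    packages=["slide_cython"],
    ext_modules=cythonize(extensions, language_level=3),
)
```

Built in place with `python setup.py build_ext --inplace` and released to PyPI through the usual sdist/wheel flow.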
Comments

Thanks for your suggestion. Let us make this a priority! We'll @ you when it is done.

Thank you @keroro824!

Sort of related, but I've been building R bindings.

@wrathematics Thanks for contributing 👍

Hi, are there any updates on this?
I'm also interested in implementing such a thing, but it seems to me the way to do this would be to implement custom layers rather than the built-in ones, and that work could be merged into the main codebase once it is tested, rather than shipped as a separate package. For example, in PyTorch you would first subclass 'torch.autograd.Function' to implement forward and backward operations that perform the hashing and take it into account during forward and back propagation. Cython might not be needed: you might be able to use numba and get better performance more easily.

@keroro824 I've actually started doing what I described. It would be very interesting to me to implement this. I have a question:
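The custom-layer approach described in that comment can be sketched as follows. This is a minimal illustration, not code from this repository: the LSH selection itself is left abstract, and active_idx (the hash-selected output neurons, a LongTensor of indices) is assumed to be computed elsewhere.

```python
import torch

class HashedLinearFunction(torch.autograd.Function):
    """Sketch of a linear layer that computes only a hash-selected subset of outputs."""

    @staticmethod
    def forward(ctx, input, weight, bias, active_idx):
        # active_idx: LongTensor of output-neuron indices chosen by LSH
        # (how they are chosen is the hypothetical part of this sketch).
        w = weight[active_idx]            # (k, in_features)
        b = bias[active_idx]              # (k,)
        ctx.save_for_backward(input, w, active_idx)
        ctx.out_features = weight.shape[0]
        return input @ w.t() + b          # (batch, k)

    @staticmethod
    def backward(ctx, grad_output):
        input, w, active_idx = ctx.saved_tensors
        grad_input = grad_output @ w                # (batch, in_features)
        grad_w_active = grad_output.t() @ input     # (k, in_features)
        # Scatter the sparse gradients back into full-size tensors so
        # autograd sees gradients with the original parameter shapes.
        grad_weight = torch.zeros(ctx.out_features, input.shape[1],
                                  dtype=input.dtype, device=input.device)
        grad_weight[active_idx] = grad_w_active
        grad_bias = torch.zeros(ctx.out_features, dtype=input.dtype,
                                device=input.device)
        grad_bias[active_idx] = grad_output.sum(0)
        return grad_input, grad_weight, grad_bias, None
```

Calling HashedLinearFunction.apply(x, weight, bias, active_idx) then computes, and backpropagates through, only the selected rows of weight, which is the point of the hashing trick.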
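The numba alternative mentioned in the same comment could look roughly like this for the hashing kernel itself. The function srp_hash and the signed-random-projection scheme are illustrative assumptions, not code from this repository.

```python
import numpy as np
from numba import njit

@njit(cache=True)
def srp_hash(x, projections):
    # Signed random projection (SimHash) bucket id for one input vector.
    # projections: (n_bits, dim) matrix of random hyperplanes; each sign
    # bit is packed into a single integer bucket id.
    bucket = 0
    for i in range(projections.shape[0]):
        dot = 0.0
        for j in range(x.shape[0]):
            dot += projections[i, j] * x[j]
        if dot >= 0.0:
            bucket |= 1 << i
    return bucket

# Example: hash a 64-dim vector into one of 2**16 buckets.
# bucket = srp_hash(np.random.randn(64), np.random.randn(16, 64))
```

Because njit compiles the explicit loops to machine code, a kernel like this can approach C speed without a build step, which is the trade-off the comment raises against a Cython port.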