softmax activation in GRU #2

fgvbrt opened this issue Mar 8, 2017 · 2 comments
fgvbrt commented Mar 8, 2017

Hi, I noticed that you put a softmax activation inside the GRU cell; as I understand it, in that case the sum of activations at each timestep won't equal 1. Here is a link for the GRU cell, and the same situation holds for the terminal GRU: https://github.com/HIPS/molecule-autoencoder/blob/master/autoencoder/train_autoencoder.py#L225

I also checked with your version of Keras that it does not sum to 1; here is a link to a gist: https://gist.github.com/fgvbrt/1f2e1828c6d8c0eb88614f14c60874ad
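
To see why: the softmax is applied to the GRU's candidate state, but the returned hidden state is the gated mixture h_t = z_t * h_{t-1} + (1 - z_t) * softmax(...), which need not sum to 1. A minimal sketch of the same check using the stock tf.keras GRU (an assumption on my part; the gist uses this repo's custom Keras fork, and the shapes here are made up):

```python
import numpy as np
import tensorflow as tf

# A GRU whose activation is softmax does NOT yield per-timestep outputs
# summing to 1, because the update gate mixes the softmax'd candidate
# with the previous hidden state.
gru = tf.keras.layers.GRU(8, activation="softmax", return_sequences=True)
x = np.random.randn(2, 5, 3).astype("float32")  # (batch, time, features)
h = gru(x)
print(tf.reduce_sum(h, axis=-1))  # generally != 1 at each timestep
```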

Was this done on purpose, or was it a mistake?
Thanks in advance.


fgvbrt commented Mar 16, 2017

I also want to add that this softmax in the GRU would be valid if the initial state were initialized to represent a probability distribution (i.e., the initial state sums to one), but in the code it is initialized with zeros.
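
To spell out the arithmetic for the zero-initialized case, here is a small sketch of the first step (hypothetical gate values, using Keras's convention h_t = z * h_{t-1} + (1 - z) * candidate):

```python
import numpy as np

# First GRU step with zero initial state:
# h_1 = z * h_0 + (1 - z) * p, where p = softmax(candidate) sums to 1.
# With h_0 = 0 this collapses to (1 - z) * p, whose sum is < 1 whenever
# any gate value z_i > 0, so the output never starts as a distribution.
rng = np.random.default_rng(0)
z = rng.uniform(size=8)                    # update gate values in (0, 1)
logits = rng.normal(size=8)
p = np.exp(logits) / np.exp(logits).sum()  # softmax candidate, sums to 1
h1 = z * np.zeros(8) + (1 - z) * p
print(h1.sum())                            # strictly less than 1
```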

duvenaud (Contributor) commented

Thanks for catching this! I'll bring it up with Rafa G-B next time I talk to him.
