
Add a safe wrapper for cross entropy. #142

Open

IanTayler wants to merge 1 commit into master from safe-wrapper-for-softmax
Conversation

IanTayler (Contributor) commented:

This fixes an error we were getting when tf.nn.softmax_cross_entropy_with_logits received empty tensors. The error was quite elusive, as it only occurred on GPUs and not on CPUs. This is the issue as reported over at the TF repo.

@IanTayler requested a review from vierja on December 13, 2017 at 17:31
@IanTayler force-pushed the safe-wrapper-for-softmax branch from 3e81f1f to 8ff1ba9 on December 13, 2017 at 17:46
Commit message: This fixes an error we were getting when softmax_cross_entropy_with_logits received empty tensors.
@IanTayler force-pushed the safe-wrapper-for-softmax branch from 8ff1ba9 to a664f4d on December 13, 2017 at 19:08