issues with bitFlip-tensor fault type #13

Open
nniranjhana opened this issue Oct 31, 2019 · 0 comments
Labels
bug Something isn't working

Comments

@nniranjhana
Contributor

Configuring the bitFlip-tensor fault type over all operators in mnist_nn.py returns an accuracy of 10000+, when the value should be between 0 and 1.

For the other fault types, the returned accuracy stays within that range. We will need to debug in depth to figure out where the issue is and what causes this inflation of the accuracy. We should also add a sanity check on the accuracy output.
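For the sanity check, a minimal sketch along these lines could work (the names `check_accuracy`, `accuracy_op`, and the feed dict are illustrative, not taken from mnist_nn.py):

```python
def check_accuracy(accuracy):
    """Raise if the reported accuracy is outside the valid [0, 1] range."""
    if not (0.0 <= accuracy <= 1.0):
        raise ValueError(
            "Accuracy %r is outside [0, 1]; the injection run has likely "
            "corrupted the evaluation itself." % accuracy)
    return accuracy

# Example usage: wrap whatever scalar the evaluation step returns
# before reporting it.
# acc = sess.run(accuracy_op, feed_dict={x: test_images, y: test_labels})
# print("Accuracy:", check_accuracy(acc))
```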

nniranjhana added the bug label Oct 31, 2019