Configuring tensors with the bitFlip-tensor fault type over all operators in mnist_nn.py returns an accuracy of 10000+, when the value should be between 0 and 1.
The accuracy returned stays within the expected range for the other fault types. We will need to debug in depth to figure out where the issue is and what causes this inflation of the accuracy. We should also add a sanity check on the accuracy output.
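A minimal sketch of what such a sanity check could look like (the function and variable names here are hypothetical, not taken from mnist_nn.py): validate the accuracy after each injection run, before it is logged or reported.

```python
def check_accuracy(accuracy):
    """Raise if the reported accuracy falls outside the valid [0, 1] range."""
    if not 0.0 <= accuracy <= 1.0:
        raise ValueError(
            "Accuracy %r is outside [0, 1]; the result is likely corrupted "
            "(e.g., by a bit flip propagating into the accuracy computation)."
            % (accuracy,)
        )
    return accuracy


# Hypothetical usage after an injection run:
# accuracy = sess.run(accuracy_op, feed_dict=...)
# check_accuracy(accuracy)
```

This would at least turn silent out-of-range results into an explicit failure while we track down the root cause.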