
Gap between influence and real loss difference #18

Open
HAoYifei996 opened this issue Aug 19, 2020 · 1 comment

@HAoYifei996

Hi, I am trying to run a leave-one-out retraining experiment: I compare the influence computed by calc_influence_single here against the actual loss difference, which I get by computing the loss (via calc_loss here) before and after retraining without the chosen training point. I also scale the influence by len(trainset). Surprisingly, the computed influence and the real loss difference after retraining are far apart. For a random pick test_idx=10, train_idx_to_remove=609, I get the following result:

actual_loss_diffs:  tensor(0.7848, device='cuda:0', grad_fn=<SubBackward0>)
predict_loss_diffs:  -0.003923072204589844

The two values don't even agree in sign, so the predicted influence doesn't seem related to the actual loss difference at all.
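For reference, here is how the comparison is supposed to behave on a toy problem where everything can be computed exactly. This is a self-contained 1D least-squares sketch of the first-order approximation (actual LOO loss change ≈ (1/n) · ∇ℓ(z_test)ᵀ H⁻¹ ∇ℓ(z_i)), not the repo's code; the data and indices are made up:

```python
# Toy check of the influence approximation on 1D least squares,
# where the leave-one-out retrain has a closed form.
# Per-point loss: l(w; x, y) = (w*x - y)**2 / 2
# (Hypothetical data; not the example net from this repo.)

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 1.9, 3.2, 3.9, 5.3]
n = len(xs)
x_t, y_t = 2.5, 2.4   # test point
i = 2                 # training point to remove

def loss(w, x, y):
    return (w * x - y) ** 2 / 2

# Exact empirical-risk minimizer, with and without point i.
sxy = sum(x * y for x, y in zip(xs, ys))
sxx = sum(x * x for x in xs)
w = sxy / sxx
w_loo = (sxy - xs[i] * ys[i]) / (sxx - xs[i] ** 2)

actual_diff = loss(w_loo, x_t, y_t) - loss(w, x_t, y_t)

# First-order influence prediction:
# delta_loss ≈ (1/n) * grad_test * H^{-1} * grad_i,
# evaluated at the minimizer w.
H = sxx / n                          # Hessian of the mean loss
g_test = (w * x_t - y_t) * x_t       # test-point loss gradient
g_i = (w * xs[i] - ys[i]) * xs[i]    # removed-point loss gradient
predicted_diff = g_test * g_i / (H * n)

print(actual_diff, predicted_diff)   # close in value, same sign
```

At an exact minimizer the two numbers agree to first order; a large sign-flipping gap like the one above suggests either a sign/scaling convention mismatch or a model that is far from a minimum.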

Thanks in advance for any kind suggestions!

@HAoYifei996 (Author)

So far I suspect it may have something to do with the net in the example being trained for only 10 epochs, so it does not get close to a global optimum, but I'm not sure.
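One way to sanity-check this hypothesis on a toy problem: fit a 1D least-squares model by full-batch gradient descent, stop it early versus running it to convergence, and measure the gap between the influence prediction and an actual leave-one-out retrain in each case. The data, learning rate, and step counts below are all made up for illustration; the influence formula assumes the parameters minimize the empirical risk, so the gap should shrink as training converges:

```python
# Sketch: does undertraining explain the gap?
# Compare influence prediction vs. actual LOO retrain at an
# early-stopped and a converged solution. (Hypothetical toy data.)

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 1.9, 3.2, 3.9, 5.3]
x_t, y_t = 2.5, 2.4   # test point
i = 2                 # training point to remove

def fit(xs, ys, steps, lr=0.05):
    """Full-batch gradient descent on the mean squared loss, from w=0."""
    w = 0.0
    for _ in range(steps):
        grad = sum((w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    return w

def loss(w, x, y):
    return (w * x - y) ** 2 / 2

def gap(steps):
    """|influence prediction - actual LOO loss change| after `steps`."""
    n = len(xs)
    xs_loo = xs[:i] + xs[i + 1:]
    ys_loo = ys[:i] + ys[i + 1:]
    w = fit(xs, ys, steps)
    w_loo = fit(xs_loo, ys_loo, steps)
    actual = loss(w_loo, x_t, y_t) - loss(w, x_t, y_t)
    H = sum(x * x for x in xs) / n
    g_test = (w * x_t - y_t) * x_t
    g_i = (w * xs[i] - ys[i]) * xs[i]
    predicted = g_test * g_i / (H * n)
    return abs(predicted - actual)

print(gap(5), gap(500))  # gap shrinks as training converges
```

On this toy example the early-stopped gap is several times larger than the converged one, which is consistent with the 10-epochs hypothesis, though it obviously doesn't prove that's what is happening with the example net here.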
