```python
# TODO: verify if computation really needs to be done
# on the CPU or if GPU would work, too
###################################
torch.sum(k * j).data.cpu().numpy()
for k, j in zip(grad_z_vecs[i], e_s_test)
###################################
# Originally with [i] because each grad_z contained
# a list of tensors as long as e_s_test list
# There is one grad_z per training data sample
###################################
]) / train_dataset_size
```
In the final step, the code here actually calculates -1/n · I_up,loss(z, z_test). However, in equation (2) of the original paper (https://arxiv.org/abs/1703.04730), the term I_up,loss(z, z_test) already carries a minus sign. Two negatives make a positive, so the calculation here should yield +1/n · I_up,loss(z, z_test). Or have I misunderstood some part of the code or the paper?
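To make the sign bookkeeping concrete, here is a minimal sketch in plain Python with toy numbers (not the repo's actual API), assuming `s_test` approximates H⁻¹ ∇L(z_test) with no built-in sign flip:

```python
# Toy sign check with hypothetical numbers.
# Paper eq. (2): I_up,loss(z, z_test) = -grad_test^T H^{-1} grad_z
# The snippet computes: influence = -dot(grad_z, s_test) / n,
# where s_test is assumed to be H^{-1} grad_test (inverse-HVP, no minus).

n = 5                         # training set size
grad_z = [1.0, -2.0, 0.5]     # toy per-sample training gradient
s_test = [0.2, 0.1, -0.4]     # toy H^{-1} grad_test

# grad_z^T s_test, mirroring sum(torch.sum(k * j) for k, j in zip(...))
dot = sum(k * j for k, j in zip(grad_z, s_test))

i_up_loss = -dot       # eq. (2) under the assumed s_test convention
influence = -dot / n   # what the snippet returns

# Under this convention the snippet's single minus IS the minus from
# eq. (2), so the result is +(1/n) * I_up,loss, not -(1/n) * I_up,loss.
assert influence == i_up_loss / n
```

Whether the signs actually cancel therefore hinges on whether the repo's `s_test` routine already negates the inverse Hessian-vector product; that is the convention to check in the code.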
pytorch_influence_functions/pytorch_influence_functions/calc_influence_function.py, lines 258 to 271 at commit 4df5d2e