
GradSampleModuleFastGradientClipping ignores strict and force_functorch params #673

Closed
anhnami opened this issue Sep 13, 2024 · 4 comments

anhnami commented Sep 13, 2024

Per-sample gradient clipping has recently been reported to be useful for speech processing [1][2][3]. Implementing per-sample gradient clipping from scratch is complicated, so I just want to use Opacus to do the job. However, since Opacus is privacy-focused, it does not support several layer types. Furthermore, it seems we can't turn off "strict" mode in GradSampleModuleFastGradientClipping: the strict and force_functorch parameters are ignored. It would be nice to support this non-privacy use case.

[1] https://arxiv.org/pdf/2406.02004
[2] https://arxiv.org/pdf/2310.11739
[3] https://arxiv.org/pdf/2408.16204
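For context, per-sample gradient clipping rescales each example's gradient so its L2 norm is at most a bound C before averaging. A minimal numpy sketch of the technique (illustrative only, not Opacus's implementation; the function name clip_per_sample is made up):

```python
import numpy as np

def clip_per_sample(per_sample_grads, max_norm):
    """Clip each row (one sample's flattened gradient) to L2 norm <= max_norm,
    then average the clipped gradients into a single update direction."""
    norms = np.linalg.norm(per_sample_grads, axis=1)            # ||g_i|| per sample
    scale = np.minimum(1.0, max_norm / np.maximum(norms, 1e-12))
    clipped = per_sample_grads * scale[:, None]                 # g_i * min(1, C / ||g_i||)
    return clipped.mean(axis=0)

grads = np.array([[3.0, 4.0],   # norm 5.0 -> rescaled down to norm 1.0
                  [0.3, 0.4]])  # norm 0.5 -> left unchanged
avg = clip_per_sample(grads, max_norm=1.0)
# avg is the mean of [0.6, 0.8] and [0.3, 0.4], i.e. [0.45, 0.6]
```

The key point is that clipping happens per sample, before the batch reduction, which is why the wrapper needs access to per-sample gradients in the first place.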

@HuanyuZhang (Contributor) commented

Good catch on the "strict" part; we will make a patch to fix it.

Could you elaborate on "it does not support several layers"? I believe the current implementation supports all the layers that were previously supported by Opacus's GradSampleModule.


anhnami commented Sep 14, 2024

It's BatchNorm and custom layers with buffers. I'm hoping the strict option will let me use them, since my use case is not privacy-related.
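As background on why BatchNorm is the canonical problem case: its normalization uses statistics computed over the whole batch, so one sample's output (and hence its gradient) depends on every other sample, which is exactly what strict mode guards against. A small numpy sketch of the forward pass showing the cross-sample dependence (illustrative, not Opacus or PyTorch code):

```python
import numpy as np

def batch_norm_forward(x, eps=1e-5):
    """Normalize each feature using mean/variance computed over the batch axis."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)

batch_a = np.array([[1.0], [2.0], [3.0]])
batch_b = np.array([[1.0], [2.0], [30.0]])  # only the third sample differs

out_a = batch_norm_forward(batch_a)
out_b = batch_norm_forward(batch_b)

# The first sample's output changes even though its own input did not,
# so a "per-sample" gradient for it is not well-defined.
first_sample_changed = not np.allclose(out_a[0], out_b[0])
```

This is why privacy-strict wrappers reject BatchNorm outright, while a non-privacy use case may be happy to accept the cross-sample coupling.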

@HuanyuZhang (Contributor) commented

I see. I think you can unblock your usage by setting strict = False. I have never tested this myself, so you should run a quick test to make sure the gradient norms are consistent (https://github.com/pytorch/opacus/blob/main/opacus/tests/gradient_accumulation_test.py).
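A quick sanity check in the spirit of the suggestion above, written in plain numpy rather than against the linked Opacus test (purely illustrative; the clipping code here is a stand-in for whatever the wrapper produces): verify that no per-sample gradient exceeds the bound after clipping.

```python
import numpy as np

max_norm = 1.0
rng = np.random.default_rng(0)
per_sample_grads = rng.normal(size=(8, 5)) * 3.0   # fake per-sample gradients

# Clip each sample's gradient to L2 norm <= max_norm.
norms = np.linalg.norm(per_sample_grads, axis=1)
scale = np.minimum(1.0, max_norm / np.maximum(norms, 1e-12))
clipped = per_sample_grads * scale[:, None]

# Consistency check: every clipped norm must respect the bound.
clipped_norms = np.linalg.norm(clipped, axis=1)
all_bounded = bool(np.all(clipped_norms <= max_norm + 1e-9))
```

Running the analogous check against gradients coming out of the wrapper with strict = False would catch the most obvious breakage.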

@HuanyuZhang (Contributor) commented

Closing the issue, as the fix has landed (#675).
