Sampling random generator #2309
Conversation
Looks great! Just some minor comments on consistency and simplicity.
We should probably do the same for all datamodules, but let's do that in a separate PR.
What is the thing with codecov failing? When I check, the dashboard states 100%.
I'm trying to figure out the codecov thing now. I'm 99.9% sure it's a glitch, not your fault. Let's rerun the tests just to make sure.
This is actually a bit tricky. For many uses we would need `if generator is None: generator = torch.default_generator`, or something like that.
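The fallback pattern discussed above could be sketched like this; `sample_indices` is a hypothetical helper (not part of the PR), and `torch.default_generator` is PyTorch's module-level global generator (an attribute, not a function call):

```python
import torch

def sample_indices(n, k, generator=None):
    # Hypothetical helper illustrating the fallback discussed above:
    # when no generator is supplied, fall back to PyTorch's global one.
    # Note: torch.default_generator is an attribute, not a callable.
    if generator is None:
        generator = torch.default_generator
    # Draw k distinct indices from range(n) using the chosen generator.
    return torch.randperm(n, generator=generator)[:k]
```

Callers who want reproducible sampling pass their own seeded `torch.Generator`; everyone else keeps the current global-RNG behavior.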
Do you want to open a PR to harmonize the previous examples where we used
Following #2307, this adds the possibility to set the seed of the random samplers by supplying a `torch.Generator`. As far as I could tell, this is the only place where no random generator was used yet, apart from the Kornia transforms (would we want that? They do not support a generator argument).
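As a usage sketch (the toy dataset and helper name here are illustrative, not this project's actual API), a seeded `torch.Generator` passed to `torch.utils.data.RandomSampler` makes the sampling order reproducible across runs:

```python
import torch
from torch.utils.data import DataLoader, RandomSampler, TensorDataset

# Toy dataset; real code would use the datamodule's dataset.
dataset = TensorDataset(torch.arange(8))

def make_loader(seed):
    # A fresh generator per loader keeps runs independent and reproducible.
    gen = torch.Generator().manual_seed(seed)
    sampler = RandomSampler(dataset, generator=gen)
    return DataLoader(dataset, sampler=sampler, batch_size=4)

# Two loaders built with the same seed yield batches in the same order.
order_a = [batch[0].tolist() for batch in make_loader(0)]
order_b = [batch[0].tolist() for batch in make_loader(0)]
assert order_a == order_b
```

Without the `generator` argument, the sampler falls back to the global RNG, so ordering depends on whatever other code has consumed random state.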