
ReturnnTrainingJob info, more robust LR log loading #549

Merged 1 commit into main from albert-returnn-lr-np on Oct 17, 2024

Conversation

albertz (Member) commented Oct 16, 2024

In my case, I had some np.float64(...) entries in the LR log (for the effective LR).
I am not sure how I got that (maybe due to a newer PyTorch/NumPy), and I already pushed a RETURNN change to avoid this in the future, but I think we can just handle it here as well.

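The robustness fix described above can be sketched as a small normalization pass. This is a hypothetical helper (not the actual PR code; the function name is made up) that converts stray NumPy scalars such as np.float64 back into plain Python values:

```python
import numpy as np


def normalize_numpy_scalars(obj):
    """Recursively replace NumPy scalar types (e.g. np.float64) with
    plain Python scalars, so a parsed LR log behaves like regular data."""
    if isinstance(obj, np.generic):
        return obj.item()  # e.g. np.float64(1e-4) -> Python float 0.0001
    if isinstance(obj, dict):
        return {k: normalize_numpy_scalars(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return type(obj)(normalize_numpy_scalars(v) for v in obj)
    return obj
```

For example, `normalize_numpy_scalars({"effective_lr": np.float64(1e-4)})` returns a dict whose value is a plain Python float.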
@@ -287,6 +287,8 @@ def _get_run_cmd(self):
         return run_cmd

     def info(self):
+        import numpy as np
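Presumably the local import is needed because the LR log is parsed as Python source, so `np` must be resolvable when entries like `np.float64(...)` appear in it. A minimal illustrative sketch, with the function name and parsing details being assumptions rather than the job's actual code:

```python
import numpy as np


def load_lr_log(text):
    # Evaluate the log content with "np" in scope, so literal entries
    # such as "np.float64(0.0001)" can be reconstructed. Builtins are
    # stripped to keep the evaluation surface small.
    return eval(text, {"np": np, "__builtins__": {}})
```

For example, `load_lr_log("{1: np.float64(1e-4)}")` yields a dict mapping epoch 1 to the effective LR.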
Contributor:
What was the reasoning behind doing a local import vs. a module-level one?

albertz (Member, Author):

  • I like to keep things local so you can see where they are used. That is good practice in general, for all kinds of things in the structure of code (e.g. local variables, etc.).
  • It is not good practice to put too many things into the global namespace.
  • A module-level import would increase the module load time when NumPy has not been imported yet.
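The local-import style argued for above looks like the following. This is a generic sketch, not the job's actual method body; the function name and logic are made up for illustration:

```python
def summarize_lr(value):
    # Local import: NumPy is only needed inside this function, so the
    # enclosing module loads quickly even when NumPy is never used,
    # and the dependency is visible exactly where it matters.
    import numpy as np

    if isinstance(value, np.generic):
        value = value.item()  # unwrap NumPy scalars like np.float64
    return float(value)
```

With this pattern, importing the enclosing module does not pay NumPy's import cost unless the function is actually called.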

albertz (Member, Author) commented Oct 17, 2024:

Note, the only (somewhat valid) arguments to put the import at module level:

  • You see all the dependencies of the module more easily. (But I'm not sure this is so important here.)
  • Convention.
  • It saves some lines of code when the import is used often. (This is very valid when most of the code in the module actually depends on it, which is often the case for tensorflow or torch.)

Contributor:

It is very surprising to me to break with the convention now, but I do not care too much.

“the only”

Definitely not the only one in general, maybe the only one here. Once you use the import more often, you save lines of code, which is definitely a benefit.

@albertz albertz merged commit 7e067e8 into main Oct 17, 2024
4 checks passed
@albertz albertz deleted the albert-returnn-lr-np branch October 17, 2024 08:46