Replies: 1 comment
-
Augmentation should never be done offline; that just wastes disk space. You can do it on the fly during training, so the trainer must be aware of it and implement it. The configuration of the data augmentation can depend on the plans and is thus part of the rule-based system nnU-Net is built on. Best, Fabian
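To make the "on the fly" idea concrete, here is a minimal, self-contained sketch of training-time augmentation applied per batch. This is not nnU-Net's actual pipeline (nnU-Net uses the batchgenerators library internally); the function name, array shapes, and the random-mirroring choice are illustrative assumptions only:

```python
import numpy as np

def augment_batch(images, labels, rng, p_flip=0.5):
    """Illustrative on-the-fly augmentation: random spatial mirroring.

    Hypothetical shapes (not nnU-Net's real data layout):
      images: (B, C, X, Y) float array
      labels: (B, X, Y) integer segmentation array
    Image and label are flipped together so they stay aligned.
    """
    images = images.copy()  # never mutate the caller's batch
    labels = labels.copy()
    for b in range(images.shape[0]):
        for axis in (0, 1):  # the two spatial axes
            if rng.random() < p_flip:
                # axis + 1 skips the channel dimension of the image
                images[b] = np.flip(images[b], axis=axis + 1).copy()
                labels[b] = np.flip(labels[b], axis=axis).copy()
    return images, labels
```

Because this runs inside the training loop, each epoch sees differently augmented samples and nothing extra is written to disk, which is exactly why offline augmentation is unnecessary.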
-
Looking for general advice on adding (some fairly complex) data augmentations to nnUNet.
If there is already documentation for doing this, please point me to it, but I only found one line in documentation/extending_nnunet.md, which said to look in nnunet.training.network_training.data_augmentation.*. That points me to the network trainers, as there is no data_augmentation/ folder or file (at least not anymore). My assumption was that our augmentations of the training data must happen before nnUNet is even aware of our network trainer. Clearly, I'm very confused.
For some clarification, I am working on the kits21 data, and we already have a baseline nnUNet working with nnUNetTrainerV2. So I think I should modify the nnUNetTrainerV2 file?
Some questions: does the augmentation happen during nnUNet_plan_and_preprocess? Is nnUNet_plan_and_preprocess even aware of which network trainer we use?
Even if you can't answer those questions, just giving me a direction to go would be a big help, thanks!