3D FullRes strange results help #1474
Unanswered · stefanoTruzzi asked this question in Q&A
I built a dataset of patients.
The shape of my data is (pat, slice_pos_z, x_shape, y_shape).
I split my data into two sets, train and test, and then trained an nnUNet 3d_fullres model using the "all" fold option.
With the training data I ran the following command:
nnUNetv2_plan_and_preprocess with the following dataset.json configuration:
{
    "channel_names": {"0": "rgb_to_0_1"},
    "labels": {"background": 0, "TUMOR": 1},
    "numTraining": 36,
    "file_ending": ".nii.gz"
}
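As a side note on the configuration above: in nnU-Net v2 the channel name selects the intensity normalization scheme, and "rgb_to_0_1" rescales 0-255 intensities into [0, 1]. A minimal sketch of what that normalization amounts to (an illustration, not the library's actual code):

```python
import numpy as np

def rgb_to_0_1(image: np.ndarray) -> np.ndarray:
    """Sketch of the rgb_to_0_1 scheme: map 0..255 intensities to [0, 1]."""
    return image.astype(np.float32) / 255.0

# Toy uint8 slice standing in for one image channel
slice_u8 = np.array([[0, 128, 255]], dtype=np.uint8)
print(rgb_to_0_1(slice_u8))  # values in [0, 1]
```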
nnUNetv2_train 201 3d_fullres all --npz (250 epochs)
The output of this seems very good; I paste the last epochs below:
2023-05-23 03:08:33.607406: Epoch 245
2023-05-23 03:08:33.607506: Current learning rate: 0.0003
2023-05-23 03:09:28.132529: train_loss -0.8097
2023-05-23 03:09:28.132849: val_loss -0.8842
2023-05-23 03:09:28.133035: Pseudo dice [0.9145]
2023-05-23 03:09:28.133226: Epoch time: 54.53 s
2023-05-23 03:09:28.133382: Yayy! New best EMA pseudo Dice: 0.9043
2023-05-23 03:09:31.874618:
2023-05-23 03:09:31.874882: Epoch 246
2023-05-23 03:09:31.874975: Current learning rate: 0.00024
2023-05-23 03:10:31.197363: train_loss -0.809
2023-05-23 03:10:31.197754: val_loss -0.8717
2023-05-23 03:10:31.197931: Pseudo dice [0.9161]
2023-05-23 03:10:31.198091: Epoch time: 59.32 s
2023-05-23 03:10:31.198219: Yayy! New best EMA pseudo Dice: 0.9054
2023-05-23 03:10:35.162158:
2023-05-23 03:10:35.162286: Epoch 247
2023-05-23 03:10:35.162412: Current learning rate: 0.00019
2023-05-23 03:11:35.915755: train_loss -0.8178
2023-05-23 03:11:35.916052: val_loss -0.838
2023-05-23 03:11:35.916196: Pseudo dice [0.9024]
2023-05-23 03:11:35.916365: Epoch time: 60.75 s
2023-05-23 03:11:38.879916:
2023-05-23 03:11:38.880252: Epoch 248
2023-05-23 03:11:38.880400: Current learning rate: 0.00013
2023-05-23 03:12:39.322534: train_loss -0.8065
2023-05-23 03:12:39.333308: val_loss -0.8448
2023-05-23 03:12:39.333480: Pseudo dice [0.9027]
2023-05-23 03:12:39.333625: Epoch time: 60.44 s
2023-05-23 03:12:41.372148:
2023-05-23 03:12:41.372279: Epoch 249
2023-05-23 03:12:41.372395: Current learning rate: 7e-05
2023-05-23 03:13:44.812617: train_loss -0.82
2023-05-23 03:13:44.812854: val_loss -0.8546
2023-05-23 03:13:44.812991: Pseudo dice [0.9033]
2023-05-23 03:13:44.813133: Epoch time: 63.44 s
2023-05-23 03:13:47.940917: Training done.
2023-05-23 03:13:47.958667: predicting PET3DPAT_001
2023-05-23 03:13:54.670786: predicting PET3DPAT_055
.......
2023-05-23 03:13:55.948227: predicting PET3DPAT_075
2023-05-23 03:13:56.129446: predicting PET3DPAT_077
2023-05-23 03:14:00.205794: Validation complete
2023-05-23 03:14:00.205883: Mean Validation Dice: 0.8996978618099529
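The mean validation Dice reported above is the average of per-case Dice scores, 2|A∩B| / (|A| + |B|). To sanity-check the test predictions against ground truth the same metric can be computed by hand; a minimal sketch for binary masks (the arrays here are toy stand-ins for loaded segmentations):

```python
import numpy as np

def dice(pred: np.ndarray, gt: np.ndarray) -> float:
    """Binary Dice coefficient: 2 * |A ∩ B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    denom = pred.sum() + gt.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, gt).sum() / denom

# Toy example: one overlapping voxel out of |pred|=2, |gt|=1
p = np.array([0, 1, 1, 0])
g = np.array([0, 1, 0, 0])
print(dice(p, g))  # 2*1 / (2+1) ≈ 0.667
```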
I then launched nnUNetv2_predict on the test samples.
When I check the results by plotting the test prediction images, I obtain a dataset of shape (2, 128, 128, 56), i.e. (mask channel?, x_shape, y_shape, slices_z).
From this I understand that the first axis indexes the masks:
on the left, mask 0 (the healthy part);
on the right, mask 1 (the TUMOR part).
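If that 2-channel array holds per-class scores or probabilities (e.g. the softmax output exported via --npz), the final label map is the argmax over the first (class) axis rather than either channel on its own. A minimal sketch, with a random array standing in for the loaded prediction:

```python
import numpy as np

# Hypothetical 2-class output with shape (classes, x, y, z),
# matching the (2, 128, 128, 56) array described above
probs = np.random.rand(2, 128, 128, 56).astype(np.float32)
probs /= probs.sum(axis=0, keepdims=True)  # make the two channels sum to 1

seg = np.argmax(probs, axis=0)  # 0 = background/healthy, 1 = TUMOR
print(seg.shape)       # (128, 128, 56)
print(np.unique(seg))  # subset of [0 1]
```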
Then, when I plot the 2D slices cycling over slices_z, I obtain these results (not all of a single patient's results, only a part):
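The slice-by-slice plotting loop described above can be sketched as follows. The volume here is a small fake label map; in practice it would be loaded from a predicted segmentation, e.g. with nibabel.load("PET3DPAT_XXX.nii.gz").get_fdata() (nibabel and the file name are assumptions, not from the post):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so figures are saved, not shown
import matplotlib.pyplot as plt

# Small toy (x, y, z) label volume; a real one would be (128, 128, 56)
seg = (np.random.rand(32, 32, 4) > 0.95).astype(np.uint8)

# Cycle over slices_z and save one image per axial slice
for z in range(seg.shape[-1]):
    plt.imshow(seg[:, :, z], cmap="gray", vmin=0, vmax=1)
    plt.title(f"slice_z = {z}")
    plt.savefig(f"slice_{z:03d}.png")
    plt.close()
```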
I also plotted some validation images and found another strange behaviour.

I have some doubts about these results. Could someone help me understand where I'm going wrong?