Hello @xuebinqin, thank you very much for your outstanding contributions and for sharing DIS. I ran into a problem while training the GT encoder. The trained GT encoder has poor traLoss and traTarLoss, and training was interrupted at epoch 10, producing: GTENCODER-gpu_itr_90000_traLoss_2.5166_traTarLoss_0.568_valLoss_0.3852_valTarLoss_0.0118_maxF1_0.9978_mae_0.0009_time_0.035759.pth. In contrast, the model trained with hypar["interm_sup"] = False (i.e., without intermediate GT-encoder supervision) achieves higher segmentation accuracy. May I ask what the reason for this might be?
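For context, this is the flag I toggled. The snippet below is only a minimal sketch of the assumed hypar layout in the DIS training script (train_valid_inference_main.py, if I recall correctly); apart from "interm_sup" and "mode", names and defaults may not match the actual code exactly:

```python
# Minimal sketch of the relevant hyperparameters (assumed layout of the hypar
# dict in the DIS training script; only the "interm_sup" flag is in question).
hypar = {}

hypar["mode"] = "train"          # training mode
hypar["interm_sup"] = True       # True: train the GT encoder and supervise the
                                 # segmentation model with its intermediate
                                 # features (the GTENCODER-*.pth run above)
# hypar["interm_sup"] = False    # False: plain training without the GT-encoder
                                 # feature supervision (the second run below)
```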
When training without the GT encoder supervision, the resulting model is gpu_itr_47500_traLoss_0.7772_traTarLoss_0.1048_valLoss_0.395_valTarLoss_0.0206_maxF1_0.9952_mae_0.0022_time_0.067231.pth, which surprisingly performs better than the model trained with GT-encoder supervision. The comparison images are below, and a quick metric comparison parsed from the two checkpoint names follows.
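For a quick side-by-side of the numbers encoded in the two checkpoint names, here is a small standard-library sketch; the parse_metrics helper is hypothetical, written just for this comparison of the file names quoted above:

```python
import re

# Hypothetical helper: pull the metrics that the DIS training script embeds
# in the checkpoint file names quoted in this thread.
def parse_metrics(ckpt_name: str) -> dict:
    pattern = (r"itr_(?P<itr>\d+)_traLoss_(?P<traLoss>[\d.]+)_traTarLoss_(?P<traTarLoss>[\d.]+)"
               r"_valLoss_(?P<valLoss>[\d.]+)_valTarLoss_(?P<valTarLoss>[\d.]+)"
               r"_maxF1_(?P<maxF1>[\d.]+)_mae_(?P<mae>[\d.]+)")
    m = re.search(pattern, ckpt_name)
    return {k: float(v) for k, v in m.groupdict().items()} if m else {}

gt_encoder = parse_metrics("GTENCODER-gpu_itr_90000_traLoss_2.5166_traTarLoss_0.568_"
                           "valLoss_0.3852_valTarLoss_0.0118_maxF1_0.9978_mae_0.0009_time_0.035759.pth")
no_interm  = parse_metrics("gpu_itr_47500_traLoss_0.7772_traTarLoss_0.1048_"
                           "valLoss_0.395_valTarLoss_0.0206_maxF1_0.9952_mae_0.0022_time_0.067231.pth")

# Print the two runs side by side.
for key in ("traLoss", "traTarLoss", "valLoss", "valTarLoss", "maxF1", "mae"):
    print(f"{key:>10}: GT encoder {gt_encoder[key]:.4f} | interm_sup=False {no_interm[key]:.4f}")
```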