GT encoder has poor traLoss and traTarLoss #141

Open
Maozhentan2024 opened this issue Sep 16, 2024 · 1 comment

Comments

@Maozhentan2024

Hello @xuebinqin, thank you very much for your outstanding contributions and for sharing DIS. I ran into a problem while studying and training the GT encoder: the trained GT encoder has poor traLoss and traTarLoss, and training was interrupted at epoch 10, producing GTENCODER-gpu_itr_90000_traLoss_2.5166_traTarLoss_0.568_valLoss_0.3852_valTarLoss_0.0118_maxF1_0.9978_mae_0.0009_time_0.035759.pth. In contrast, the model trained with hypar["interm_sup"] = False (i.e., without GT-encoder intermediate supervision) reaches higher segmentation accuracy. What might be the reason for this?
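
For context, the two runs being compared differ only in the hypar["interm_sup"] setting of the training configuration. A minimal sketch of that toggle is below; apart from the "interm_sup" key, which appears in this issue, the dictionary layout and comments are illustrative assumptions rather than the repository's exact script.

```python
# Minimal sketch of the configuration difference discussed in this issue.
# Only the "interm_sup" key comes from the issue text; the surrounding
# layout and comments are assumptions for illustration.

hypar = {}  # training hyperparameter dictionary used by the DIS training script

# Run reported above: GT-encoder / intermediate-supervision training enabled.
hypar["interm_sup"] = True

# Run reported in the follow-up comment: intermediate supervision disabled,
# which in this case gave the better segmentation results.
# hypar["interm_sup"] = False
```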

@Maozhentan2024 (Author)

When training without the GT encoder supervision, the resulting checkpoint is gpu_itr_47500_traLoss_0.7772_traTarLoss_0.1048_valLoss_0.395_valTarLoss_0.0206_maxF1_0.9952_mae_0.0022_time_0.067231.pth, which surprisingly performs better than the GT encoder. The comparison images are below.
[Attached image: WechatIMG2651 (comparison of segmentation results)]
