
Loss issues when training from scratch on one's own dataset #474

Open

DaIGaN2019 opened this issue Oct 25, 2024 · 0 comments

@DaIGaN2019
Hello author, thank you for open-sourcing such outstanding code. I trained from scratch on my own dataset, and with the original vit_L14 config the loss grew higher and higher. After adjusting the hyperparameters, the loss began to decrease, but after a small drop it almost stagnated (13.8 to 12.9, with the dino_global loss unchanged for a long time). Is this normal? Could you briefly say how far your loss decreased during training? Also, the logs show that my KoLeo loss has dropped from a positive value to a negative one. Is that reasonable? Thank you again for your contribution. Looking forward to your reply.
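For reference, here is a minimal sketch of how I understand the KoLeo regularizer from the DINOv2 paper (the function name and details are my own approximation, not the repo's exact implementation); the final `-log` term is what seems to allow the value to go negative:

```python
# Sketch of a KoLeo-style regularizer (differential-entropy estimator from
# the DINOv2 paper). Illustrative approximation, not dinov2's exact class.
import torch
import torch.nn.functional as F

def koleo_loss(features: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """features: (batch, dim) student embeddings."""
    # L2-normalize so all points lie on the unit hypersphere.
    x = F.normalize(features, p=2, dim=-1, eps=eps)
    # Cosine similarities; mask the diagonal so a point is never
    # selected as its own nearest neighbor.
    sims = x @ x.t()
    sims.fill_diagonal_(-1.0)
    # Nearest neighbor = highest cosine similarity.
    nn_sim, _ = sims.max(dim=-1)
    # Convert similarity to Euclidean distance on the sphere: d^2 = 2 - 2*cos.
    nn_dist = torch.sqrt(torch.clamp(2.0 - 2.0 * nn_sim, min=0.0))
    # -log(distance): negative whenever the typical nearest-neighbor
    # distance exceeds 1, which is common on the unit sphere (max d = 2),
    # so a negative KoLeo value would not by itself indicate a bug.
    return -torch.log(nn_dist + eps).mean()
```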
