
Add MSE minimization along with regularization as an objective #26

Open
kanodiaayush opened this issue Apr 22, 2023 · 3 comments

@kanodiaayush (Contributor)

Also allow for Y to be continuous in [0, 1]; in this case the optimization would be through loss minimization and not through variational inference. Note that we still want the sigmoid to be applied as the final head.
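
For concreteness, a minimal sketch of the intended objective, assuming the model's forward pass returns raw utilities (logits); the function name and shapes here are illustrative, not existing code:

```python
import torch
import torch.nn.functional as F

def mse_loss_with_sigmoid_head(utility: torch.Tensor, label: torch.Tensor) -> torch.Tensor:
    """MSE between sigmoid(utility) and a continuous label in [0, 1].

    utility: raw model output (logits), shape (batch_size,).
    label: continuous target in [0, 1], same shape as utility.
    """
    prediction = torch.sigmoid(utility)  # keep the sigmoid as the final head
    return F.mse_loss(prediction, label)
```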

@TianyuDu

@TianyuDu (Collaborator) commented Apr 23, 2023

Can you have a look at this change: e6bef13
Does it work as you expected?
P.S. This is a preliminary solution to illustrate the idea of the potential modifications; I haven't tested it yet.

@kanodiaayush (Contributor, Author)

Oh, this looks great. This also means our forward function is fully modular, and we can use this infrastructure to compare across models. I will test this out. I will also add L2 regularization to the objective. That should be fine, right?
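
For reference, a rough sketch of how the L2 penalty could be folded into that objective; the function name, the coefficient, and the way parameters are gathered are my assumptions, not existing code:

```python
import torch
import torch.nn.functional as F

def mse_plus_l2_objective(model: torch.nn.Module,
                          utility: torch.Tensor,
                          label: torch.Tensor,
                          l2_weight: float = 1e-4) -> torch.Tensor:
    # MSE on the sigmoid head, as described in the issue.
    prediction = torch.sigmoid(utility)
    mse = F.mse_loss(prediction, label)
    # L2 penalty over all trainable parameters; l2_weight is a hypothetical hyperparameter.
    l2_penalty = sum(p.pow(2).sum() for p in model.parameters() if p.requires_grad)
    return mse + l2_weight * l2_penalty
```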

@TianyuDu (Collaborator)

Great, as long as you only modify bemb_flex_lightning_mse.py and the mse_loss_on_binary_label() function, it will be fine : ).
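
For reference, a rough sketch of how such a loss might be wired into a LightningModule training step; the class name, batch layout, and optimizer choice below are illustrative assumptions, not the actual bemb_flex_lightning_mse.py code:

```python
import pytorch_lightning as pl
import torch
import torch.nn.functional as F

class LitMSEWrapper(pl.LightningModule):
    """Illustrative Lightning wrapper around a model that outputs raw utilities."""

    def __init__(self, model: torch.nn.Module, l2_weight: float = 1e-4, lr: float = 1e-3):
        super().__init__()
        self.model = model
        self.l2_weight = l2_weight
        self.lr = lr

    def training_step(self, batch, batch_idx):
        features, label = batch                      # assumed (features, label) batch layout
        utility = self.model(features)               # raw utilities / logits
        prediction = torch.sigmoid(utility)          # sigmoid kept as the final head
        mse = F.mse_loss(prediction, label)
        l2 = sum(p.pow(2).sum() for p in self.model.parameters() if p.requires_grad)
        loss = mse + self.l2_weight * l2
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.lr)
```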
