Why RoBERTa? #21

Open
fatemeh-sh264 opened this issue Jul 22, 2020 · 1 comment

Comments

@fatemeh-sh264

Why did you use RoBERTa instead of BERT or ELMo?

@jongwook
Collaborator

In an ablation study (which we didn't publish), we found that RoBERTa fine-tunes better than BERT or GPT-2 itself. We expect that ELECTRA would work as well.
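
For context, a minimal sketch of how one of these pretrained encoders could be swapped for another in a fine-tuning setup, assuming the Hugging Face `transformers` library; this is not the repository's actual code, and the model names and pooling choice are illustrative.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Swapping the text encoder is just a matter of changing the checkpoint name,
# e.g. "bert-base-uncased" or "google/electra-base-discriminator".
model_name = "roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
encoder = AutoModel.from_pretrained(model_name)

texts = ["a short example sentence", "another example sentence"]
batch = tokenizer(texts, padding=True, return_tensors="pt")

with torch.no_grad():
    outputs = encoder(**batch)

# Use the final hidden state of the first token as a sentence-level embedding;
# the pooling and projection used in the actual model may differ.
embeddings = outputs.last_hidden_state[:, 0, :]
print(embeddings.shape)  # (2, hidden_size)
```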
