Confused by this conv1d operation #9

Open
airkid opened this issue May 18, 2019 · 1 comment
Comments

airkid commented May 18, 2019

Hi, I'm reading this code for study, and it has helped me a lot.
I'm confused by this line:

nn.Conv1d(d_model, d_ff, 1),

In the original BERT paper, I haven't found any description saying that BERT uses a conv1d layer in the Transformer instead of a linear transformation.

And in http://nlp.seas.harvard.edu/2018/04/03/attention.html#position-wise-feed-forward-networks, the position-wise feed-forward network is implemented as an MLP.

Could anyone kindly help me with this?
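
For context, my current understanding (a minimal sketch I put together, not code from this repo) is that a Conv1d with kernel_size=1 applies the same weights independently at every position, so it should compute the same thing as a per-position Linear layer:

```python
import torch
import torch.nn as nn

d_model, d_ff, seq_len, batch = 512, 2048, 10, 2

conv = nn.Conv1d(d_model, d_ff, 1)   # kernel_size=1: no mixing across positions
linear = nn.Linear(d_model, d_ff)

# Copy the conv weights into the linear layer so both compute the same map.
with torch.no_grad():
    linear.weight.copy_(conv.weight.squeeze(-1))  # (d_ff, d_model, 1) -> (d_ff, d_model)
    linear.bias.copy_(conv.bias)

x = torch.randn(batch, seq_len, d_model)             # Linear expects (batch, seq, features)
out_linear = linear(x)                                # (batch, seq, d_ff)
out_conv = conv(x.transpose(1, 2)).transpose(1, 2)    # Conv1d expects (batch, features, seq)

print(torch.allclose(out_linear, out_conv, atol=1e-6))  # prints True
```

If that equivalence is right, the Conv1d here would just be an alternative way of writing the position-wise feed-forward layer, but I'd like to confirm.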

ne7ermore (Owner) commented May 19, 2019 via email
