Hi, I have a question about the "attention_cnn" module.
In general, attention modules use "softmax" rather than "tanh" to obtain the weights, but your code below uses "tanh", so I am a little confused. Is there a particular reason for this choice? Thanks!
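To make the distinction concrete, here is a minimal sketch of the two weighting schemes being compared (function names and the NumPy implementation are illustrative, not taken from the repository):

```python
import numpy as np

def attention_weights_softmax(scores):
    # Standard attention: softmax turns raw scores into a
    # probability distribution (non-negative, sums to 1),
    # so the output is a convex combination of the inputs.
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_weights_tanh(scores):
    # The tanh variant squashes each score into (-1, 1)
    # independently: weights need not sum to 1 and can be
    # negative, acting more like an element-wise gate than
    # a distribution over positions.
    return np.tanh(scores)

scores = np.array([1.0, 2.0, 0.5])
print(attention_weights_softmax(scores).sum())  # sums to 1
print(attention_weights_tanh(scores))           # each value in (-1, 1)
```

So the practical difference is that softmax forces the weights to compete (they must sum to 1 across positions), whereas tanh scores each position independently.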