Dear authors,

First, thank you for sharing the code!

I am interested in the Label-Conditional Text Generation experiment and have a few questions about the losses:

1. loss_lsd corresponds to Equation 16 of the paper, and loss_lsg seems to complement it. Is there a reason loss_lsg is not mentioned in Eq. 16?
2. It looks to me as if loss_encode cancels out loss_lsc when they are added together in loss. Is that correct?
3. What would need to change to handle, for example, 3 classes? Only this? (A rough sketch of what I have in mind follows below.)
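For question 3, here is a minimal sketch of the kind of change I imagine in PyTorch. The names (n_classes, latent_dim, the classifier head) are placeholders of mine, not identifiers from your code:

```python
# Minimal sketch (NOT the repository's actual code) of what a 3-class setup
# typically touches: the classifier head goes from 2 to n_classes outputs,
# and the binary objective is swapped for a categorical cross-entropy.
import torch
import torch.nn as nn

n_classes = 3          # was 2 in the binary setup
latent_dim = 128       # placeholder latent size

# Classifier over the latent code: one logit per class.
classifier = nn.Sequential(
    nn.Linear(latent_dim, latent_dim),
    nn.ReLU(),
    nn.Linear(latent_dim, n_classes),
)

# Categorical cross-entropy replaces the binary (sigmoid/BCE) loss.
criterion = nn.CrossEntropyLoss()

z = torch.randn(16, latent_dim)               # fake batch of latent codes
labels = torch.randint(0, n_classes, (16,))   # integer labels in {0, 1, 2}

logits = classifier(z)
loss_lsc = criterion(logits, labels)
loss_lsc.backward()
```

Is that the only change needed, or do other parts of the pipeline assume a binary label?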
Thanks a lot for your answers!
Thanks for your question; I actually benefited from it a lot.
I am also trying to run the Label-Conditional Text Generation experiment, but unfortunately I could not find the entry point for training: no code seems to call the class "Ctrl_Gen".
If you managed to train it, I would appreciate it if you could guide me. (I sketched the kind of entry point I was expecting below.)
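Just to make the question concrete, this is the sort of minimal driver script I was looking for. The Ctrl_Gen import path, constructor arguments, and method names below are guesses on my part and are left commented out; they are not the repository's actual API:

```python
# Hypothetical training entry point, NOT the repository's actual code.
# The Ctrl_Gen import, constructor arguments, and train_one_epoch() method
# are assumptions, so they are left commented out.
import argparse


def main():
    parser = argparse.ArgumentParser(
        description="Label-Conditional Text Generation training (sketch)")
    parser.add_argument("--epochs", type=int, default=10)
    parser.add_argument("--batch_size", type=int, default=32)
    parser.add_argument("--n_classes", type=int, default=2)
    args = parser.parse_args()

    # from ctrl_gen_model import Ctrl_Gen                    # assumed module name
    # model = Ctrl_Gen(n_classes=args.n_classes)             # assumed signature
    # for epoch in range(args.epochs):
    #     model.train_one_epoch(batch_size=args.batch_size)  # assumed method
    print(f"Would train for {args.epochs} epochs, batch size {args.batch_size}")


if __name__ == "__main__":
    main()
```

If the real class is driven differently, a pointer to the intended script or command would already help a lot.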