I was able to run the LSTM-based pointer generator successfully. However, when running the transformer_encoder branch with use_lstm=False, I encounter this error:
File "training_ptr_gen/train.py", line 400, in <module>
train_processor.trainIters(config.max_iterations, args.model_file_path)
File "training_ptr_gen/train.py", line 341, in trainIters
loss = self.train_one_batch(batch)
File "training_ptr_gen/train.py", line 273, in train_one_batch
encoder_outputs, encoder_feature, encoder_hidden = self.model.encoder(enc_batch, enc_lens, enc_padding_mask)
File "/srv/home/kaur/pointer_summarizer/lstmpg/lib/python3.6/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
result = self.forward(*input, **kwargs)
File "/srv/home/kaur/pointer_summarizer/training_ptr_gen/model.py", line 108, in forward
word_embed_proj = self.tx_proj(embedded)
File "/srv/home/kaur/pointer_summarizer/lstmpg/lib/python3.6/site-packages/torch/nn/modules/module.py", line 779, in __getattr__
type(self).__name__, name))
torch.nn.modules.module.ModuleAttributeError: 'Encoder' object has no attribute 'tx_proj'
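For context, this is PyTorch's standard attribute-lookup failure on nn.Module: forward() is referencing a submodule that __init__ never registered for this configuration. The toy snippet below is only a self-contained reproduction of that failure mode, not the repo's actual Encoder; the class, flag, and dimensions are made up, and only the tx_proj name is taken from the traceback.

import torch
import torch.nn as nn

class ToyEncoder(nn.Module):
    def __init__(self, use_lstm=True):
        super().__init__()
        self.use_lstm = use_lstm
        self.embedding = nn.Embedding(10, 4)
        if use_lstm:
            # Only the LSTM path registers its submodule; nothing ever
            # registers tx_proj, so the non-LSTM path cannot find it.
            self.lstm = nn.LSTM(4, 4, batch_first=True)

    def forward(self, x):
        embedded = self.embedding(x)
        if self.use_lstm:
            return self.lstm(embedded)[0]
        # Raises "'ToyEncoder' object has no attribute 'tx_proj'",
        # the same failure mode as in the traceback above.
        return self.tx_proj(embedded)

enc = ToyEncoder(use_lstm=False)
print(hasattr(enc, "tx_proj"))             # False: never registered in __init__
enc(torch.zeros(1, 3, dtype=torch.long))   # reproduces the AttributeError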
Any help regarding this would be appreciated, thanks.
Thanks for the reply @v-chuqin, but I wanted to use the transformer-based encoder and hence set use_lstm=False. With use_lstm=True, it is working fine.
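In case it helps to pin down the issue: the traceback points at forward() projecting the embeddings through self.tx_proj, while Encoder.__init__ apparently never registers that layer when use_lstm=False. Below is only a rough sketch of the kind of registration the failing line seems to expect; the layer name comes from the traceback, but the config flag, import path, dimensions, and output width are assumptions rather than the branch's actual code.

import torch.nn as nn
from data_util import config  # config module path assumed from the repo layout

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embedding = nn.Embedding(config.vocab_size, config.emb_dim)
        if config.use_lstm:  # flag name assumed; may differ on the branch
            self.lstm = nn.LSTM(config.emb_dim, config.hidden_dim,
                                num_layers=1, batch_first=True, bidirectional=True)
        else:
            # The failing line (word_embed_proj = self.tx_proj(embedded)) implies a
            # projection like this must be registered for the non-LSTM path;
            # the output width here is a guess at what the transformer expects.
            self.tx_proj = nn.Linear(config.emb_dim, config.hidden_dim)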