
RuntimeError: result type Float can't be cast to the desired output type Long #1

Open
odeven opened this issue Mar 11, 2021 · 2 comments


odeven commented Mar 11, 2021

Hi, I was trying to set up the code from this GitHub repo. I installed the dependencies from requirements.txt and followed the entire setup process, but I encountered the error below.
Would it be possible to provide a full requirements.txt with the versions of the dependencies pinned? The current one lists only the package names, not the versions. I appreciate your time, thank you!

RuntimeError: result type Float can't be cast to the desired output type Long

The full error log follows:
Traceback (most recent call last):
File "targeted_flips.py", line 160, in
main()
File "targeted_flips.py", line 32, in main
targeted_flips(samples, args, trainer, generator, embedding_weight, bpe)
File "targeted_flips.py", line 46, in targeted_flips
translations = trainer.task.inference_step(generator, [trainer.get_model()], samples)
File "/data/home/boonghim/adversarial-mt-master/fairseq/tasks/fairseq_task.py", line 265, in inference_step
return generator.generate(models, sample, prefix_tokens=prefix_tokens)
File "/data/home/boonghim/.conda/envs/attacking/lib/python3.6/site-packages/torch/autograd/grad_mode.py", line 26, in decorate_context
return func(*args, **kwargs)
File "/data/home/boonghim/adversarial-mt-master/fairseq/sequence_generator.py", line 113, in generate
return self._generate(model, sample, **kwargs)
File "/data/home/boonghim/.conda/envs/attacking/lib/python3.6/site-packages/torch/autograd/grad_mode.py", line 26, in decorate_context
return func(*args, **kwargs)
File "/data/home/boonghim/adversarial-mt-master/fairseq/sequence_generator.py", line 379, in _generate
scores.view(bsz, beam_size, -1)[:, :, :step],
File "/data/home/boonghim/adversarial-mt-master/fairseq/search.py", line 83, in step
out=self.beams_buf)
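
For context, this looks like a PyTorch version mismatch: in newer torch releases, torch.div performs true (floating-point) division, so its Float result can no longer be written into the Long out= buffer that fairseq's search.py passes. A minimal sketch that appears to reproduce the failure (the tensor values here are made up for illustration):

import torch

# Stand-ins for the Long buffers used in fairseq/search.py.
indices_buf = torch.tensor([5, 7, 9])      # dtype: torch.int64 (Long)
beams_buf = torch.empty_like(indices_buf)  # Long output buffer
vocab_size = 4

# On a torch version where div does true division, this raises:
# RuntimeError: result type Float can't be cast to the desired output type Long
torch.div(indices_buf, vocab_size, out=beams_buf)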


odeven commented Mar 11, 2021

Is it possible to provide the versions of torch and the other dependencies? @Eric-Wallace


JunW15 commented Oct 5, 2021

I encountered the same issue before and fixed it by changing the torch version to 1.2.0
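
If downgrading is not an option, patching the failing line may also work. This is only a sketch, not tested against this repo: it assumes the problem is the torch.div call at fairseq/search.py line 83 from the traceback above, and that your torch version provides torch.floor_divide. Since the beam index is a non-negative integer quotient, an explicit integer floor division keeps the result Long, so it fits the Long out= buffer:

import torch

# Stand-ins for the buffers in fairseq/search.py.
indices_buf = torch.tensor([5, 7, 9])
beams_buf = torch.empty_like(indices_buf)
vocab_size = 4

# Replacement for the failing torch.div(..., out=...) call: floor
# division of Long tensors yields a Long result, so writing into the
# Long buffer succeeds.
torch.floor_divide(indices_buf, vocab_size, out=beams_buf)
print(beams_buf)  # tensor([1, 1, 2])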
