
old_llama_for_causal_lm_forward() got an unexpected keyword argument 'cache_position' #18

Open
Kevinstone-199898 opened this issue Feb 18, 2025 · 0 comments


When I ran a demo following the code in the Quick Start for DuoAttention, the following error occurred:

```
Traceback (most recent call last):
  File "/home/sxy/duo-attention/demo.py", line 57, in <module>
    outputs = model.generate(
  File "/home/sxy/python_env/duo/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/home/sxy/python_env/duo/lib/python3.10/site-packages/transformers/generation/utils.py", line 2047, in generate
    result = self._sample(
  File "/home/sxy/python_env/duo/lib/python3.10/site-packages/transformers/generation/utils.py", line 3007, in _sample
    outputs = self(**model_inputs, return_dict=True)
  File "/home/sxy/python_env/duo/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/sxy/python_env/duo/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
TypeError: old_llama_for_causal_lm_forward() got an unexpected keyword argument 'cache_position'
```

Does anyone know how to solve this problem?
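This looks like a transformers version mismatch: newer transformers releases pass a `cache_position` keyword into the model's `forward`, while the `old_llama_for_causal_lm_forward` that DuoAttention patches in does not accept it. Pinning transformers to the version the repository was developed against is the cleanest fix. As a hedged stopgap (a sketch, not DuoAttention's own API), one can wrap the patched forward so keyword arguments its signature does not accept are dropped before the call:

```python
import inspect
from functools import wraps


def drop_unsupported_kwargs(forward_fn):
    """Wrap a forward function so keyword arguments it does not
    accept (e.g. cache_position) are silently discarded."""
    params = inspect.signature(forward_fn).parameters
    accepted = set(params)
    # If the function already takes **kwargs, no filtering is needed.
    takes_var_kw = any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )

    @wraps(forward_fn)
    def wrapper(*args, **kwargs):
        if not takes_var_kw:
            kwargs = {k: v for k, v in kwargs.items() if k in accepted}
        return forward_fn(*args, **kwargs)

    return wrapper


# Toy stand-in for the patched forward, which lacks cache_position:
def old_forward(input_ids=None, attention_mask=None):
    return {"input_ids": input_ids}


safe_forward = drop_unsupported_kwargs(old_forward)
# Passing cache_position no longer raises a TypeError:
out = safe_forward(input_ids=[1, 2], cache_position=[0, 1])
```

Applying the wrapper to the model's patched forward before calling `model.generate` would suppress this particular `TypeError`, but silently dropping `cache_position` may change generation behavior on newer transformers, so a version pin is safer.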
