[BUG] AMD has updated their FA2 fork quite some time ago #272

Open
4 tasks done
IMbackK opened this issue Jan 13, 2025 · 0 comments
Labels
bug Something isn't working

Comments

IMbackK commented Jan 13, 2025

OS

Windows

GPU Library

CUDA 12.x

Python version

3.12

Describe the bug

"(30 series) or newer. AMD GPUs are not supported."
is in appropriate as a blanket statement and condition, fa2 is up to date and works fine on amd gpus (CDNA only) with exllamav2

Reproduction steps

Upstream https://github.com/Dao-AILab/flash-attention contains AMD support by now.
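
As a quick sanity check (a sketch, not taken from the exllamav2 codebase), the ROCm build of upstream flash-attention can be exercised directly on a CDNA GPU; tensor shapes are arbitrary illustrative values:

```python
# Sketch: verify that the ROCm build of upstream flash-attention runs on a CDNA GPU.
# Shapes are (batch, seqlen, nheads, headdim); values are arbitrary.
import torch
from flash_attn import flash_attn_func

q = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)

out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([1, 128, 8, 64])
```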

Expected behavior

AMD CDNA GPUs should be considered supported just as well as Ampere and newer.
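
A minimal sketch of how the support check could treat CDNA parts as first-class alongside Ampere+, assuming a PyTorch runtime; the function name and the exact list of gfx targets are illustrative assumptions, not taken from the project's code:

```python
import torch

# Illustrative CDNA gfx targets (MI100/MI200/MI300 class); adjust to whatever
# the installed flash-attention ROCm build actually ships kernels for.
_CDNA_ARCHES = ("gfx908", "gfx90a", "gfx940", "gfx941", "gfx942")

def flash_attn_supported(device_index: int = 0) -> bool:
    """Hypothetical check: Ampere+ on CUDA builds, CDNA on ROCm builds."""
    if not torch.cuda.is_available():
        return False
    props = torch.cuda.get_device_properties(device_index)
    if torch.version.hip is not None:
        # ROCm builds of PyTorch expose the arch as gcnArchName, e.g. "gfx90a:sramecc+:xnack-".
        arch = getattr(props, "gcnArchName", "")
        return arch.split(":")[0] in _CDNA_ARCHES
    # CUDA builds: SM 8.0 (Ampere, e.g. RTX 30 series) or newer.
    return props.major >= 8
```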

Logs

No response

Additional context

No response

Acknowledgements

  • I have looked for similar issues before submitting this one.
  • I have read the disclaimer, and this issue is related to a code bug. If I have a question, I will use the Discord server.
  • I understand that the developers have lives and my issue will be answered when possible.
  • I understand the developers of this program are human, and I will ask my questions politely.
IMbackK added the bug label Jan 13, 2025