PyTorch ecosystem: identical packages with different names + differing package versions with same name #11012
Comments
tl;dr my questions are:
Unfortunately, no. There's no support in the Python runtime for having multiple versions of a single package installed. It's not possible. (You might be interested in Armin's multiversion from a decade ago as a guide on what breaks.)
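The single-namespace behavior is easy to see directly. A minimal sketch using only the standard library (`json` stands in for any installed package): Python caches exactly one module object per import name in `sys.modules`, so a second distribution claiming the same name can never be loaded alongside the first.

```python
import importlib
import sys

# Python keys loaded modules by import name in sys.modules, so at most
# one distribution can provide a given name per interpreter process.
import json  # stand-in for any third-party package

print(sys.modules["json"] is json)              # True: cached under its name
print(importlib.import_module("json") is json)  # True: re-import hits the cache
```

Whichever distribution last wrote files to that import path is the one every `import` in the process will see.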
In this case the fa2 and fa3 packages are completely separate and have no overlapping files. Installing them side-by-side is expected, and necessary to use certain tools. Is there any way to force uv to do so? Or to alias names so it doesn't think they're the same package?

Similarly, I also run into this issue where a package isn't available in newer Pythons:
Found #9174 (comment) and #4422 (comment).
Is there anything you can link me to, to understand why you're supposed to install two flash-attention versions at the same time, or even that it's a recommended approach?
I don't think it was necessarily designed that way, but to practically use these packages right now, it is necessary. Basically, a lot of the sources in the flash-attn repo were copied into a separate package built from the hopper/ subdirectory. So this 3.0 beta package just contains new CUDA code for updated GPUs and some basic interface Python code. To effectively use the CUDA code, you need the rest of the supporting library, which is still only published in the flash-attn 2.x packages.

EDIT: related Dao-AILab/flash-attention#1457

As a workaround, I guess the simplest thing would be to fork Dao-AILab/flash-attention, change hopper/setup.py so the package name is unique, then use that new git URL as my dependency.
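The fork-and-rename workaround could look roughly like this in pyproject.toml. This is only a sketch: the fork URL, the branch name, the `flash-attn-3` package name, and the version pin are all hypothetical placeholders, assuming hopper/setup.py in the fork has been edited to use the unique name.

```toml
[project]
name = "my-app"
version = "0.1.0"
dependencies = [
    # Published 2.x package still provides the supporting library.
    "flash-attn>=2.7",
    # Hypothetical fork whose hopper/setup.py was edited to set a unique
    # package name, so uv no longer treats it as the same package as 2.x.
    "flash-attn-3 @ git+https://github.com/yourname/flash-attention.git@rename-fa3#subdirectory=hopper",
]
```

Because the two distributions now have different names, the resolver sees no conflict and both can appear in the lockfile side-by-side.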
Question
Following up on #10694 with some more real-world examples.
triton vs pytorch-triton
Many pytorch ecosystem packages depend on `triton`, so pyproject/uv.lock will often contain `triton == 3.1.0`. However, when switching to pytorch nightly, the equivalent nightly package for triton is named `pytorch-triton`: https://github.com/pytorch/pytorch/blob/main/RELEASE.md#triton-dependency-for-the-release

So the equivalent entry to use nightly is something like `pytorch-triton == 3.2.0+git0d4682f0`. But adding this to your project doesn't remove the transitive `triton == 3.1.0`, so it seems both get installed, and since they both write files to `site-packages/triton`, it's not fully clear which version ends up getting used.

flash-attn v2 vs v3
There are two versions of flash-attn available, both named flash-attn but with differing version numbers. In a regular python pip environment, it seems possible to install them both:
In a uv env, installing one replaces the other:
The replacement behavior is probably what makes the most sense, but currently it is necessary to have both installed to actually use FA3: Dao-AILab/flash-attention#1467
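One way to inspect which installed distribution(s) claim files under a given top-level directory such as `triton/` or `flash_attn/` is to walk the installed metadata. A sketch using only the standard library (the `owners_of` helper is hypothetical, not part of any of these packages); if it prints more than one name, the on-disk files came from whichever distribution was installed last.

```python
import importlib.metadata

def owners_of(top_level: str) -> list[str]:
    """Return installed distribution names shipping files under top_level/."""
    names = []
    for dist in importlib.metadata.distributions():
        # dist.files can be None when the installer recorded no RECORD file.
        for f in dist.files or []:
            if str(f).startswith(f"{top_level}/"):
                names.append(dist.metadata["Name"])
                break
    return names

# e.g. after the scenario above, owners_of("triton") could report both
# "triton" and "pytorch-triton" claiming the same directory.
print(owners_of("triton"))
```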
Platform
Linux amd64
Version
uv 0.5.22