Fix bug where `joint_attention_kwargs` is not passed to the Flux transformer's attention processors
#24489
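A minimal sketch of the bug pattern this PR addresses, using hypothetical class names rather than the actual diffusers code: the model's `forward()` accepts a `joint_attention_kwargs` dict, but the call into the attention processor drops it, so processor-level options (e.g. a LoRA scale) silently have no effect. The fix is to unpack the dict into the processor call.

```python
# Hypothetical, simplified stand-ins for a transformer block and its
# attention processor; not the real diffusers classes.

class AttnProcessor:
    def __call__(self, hidden_states, scale=1.0):
        # A processor-level knob that only takes effect if the caller
        # actually forwards the kwargs down to this call.
        return [h * scale for h in hidden_states]


class TransformerBlock:
    def __init__(self):
        self.processor = AttnProcessor()

    def forward(self, hidden_states, joint_attention_kwargs=None):
        joint_attention_kwargs = joint_attention_kwargs or {}
        # The fix: unpack joint_attention_kwargs into the processor call
        # instead of ignoring it.
        return self.processor(hidden_states, **joint_attention_kwargs)


block = TransformerBlock()
out = block.forward([1.0, 2.0], joint_attention_kwargs={"scale": 0.5})
print(out)  # [0.5, 1.0]
```

Before the fix, `scale` would stay at its default and the output would be unchanged; after forwarding the kwargs, the caller's value reaches the processor.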