Fix the bug where joint_attention_kwargs is not passed to FLUX's transformer attention processors #24489