
Fix the bug where joint_attention_kwargs is not passed to the FLUX transformer's attention processors #10621
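For context, here is a minimal self-contained sketch of the bug pattern the PR title describes: a transformer `forward` accepts `joint_attention_kwargs` but never forwards it to its blocks, so processor-level options (for example a scale consumed by the attention processor) are silently dropped. The classes and parameter names below are illustrative stand-ins, not the actual diffusers `FluxTransformer2DModel` code.

```python
# Simplified illustration of the bug pattern named in the PR title.
# These classes are toy stand-ins, not the real diffusers implementation.
from typing import Any, Dict, Optional

import torch
import torch.nn as nn


class AttnProcessor:
    """Toy attention processor that consumes extra kwargs (e.g. a scale)."""

    def __call__(self, hidden_states: torch.Tensor, scale: float = 1.0) -> torch.Tensor:
        # A real processor would compute attention; here we just apply `scale`
        # so it is observable whether the kwargs reached the processor.
        return hidden_states * scale


class Block(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.processor = AttnProcessor()

    def forward(
        self,
        hidden_states: torch.Tensor,
        joint_attention_kwargs: Optional[Dict[str, Any]] = None,
    ) -> torch.Tensor:
        return self.processor(hidden_states, **(joint_attention_kwargs or {}))


class Transformer(nn.Module):
    def __init__(self, num_blocks: int = 2) -> None:
        super().__init__()
        self.blocks = nn.ModuleList(Block() for _ in range(num_blocks))

    def forward(
        self,
        hidden_states: torch.Tensor,
        joint_attention_kwargs: Optional[Dict[str, Any]] = None,
    ) -> torch.Tensor:
        for block in self.blocks:
            # Buggy version: `joint_attention_kwargs` was accepted by forward()
            # but never forwarded, i.e. `hidden_states = block(hidden_states)`.
            # Fixed version: pass it through so the processors actually see it.
            hidden_states = block(hidden_states, joint_attention_kwargs=joint_attention_kwargs)
        return hidden_states


if __name__ == "__main__":
    model = Transformer()
    x = torch.ones(1, 4)
    # With the fix, scale=0.5 reaches both processors: output is x * 0.25.
    out = model(x, joint_attention_kwargs={"scale": 0.5})
    print(out)  # tensor([[0.2500, 0.2500, 0.2500, 0.2500]])
```

In the buggy version the call above runs without error, which is why the dropped kwargs are easy to miss: the extra options are simply ignored rather than raising.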

Annotations: 1 warning. The logs for this run have expired and are no longer available.