Fix the bug where joint_attention_kwargs is not passed to FLUX's transformer attention processors #15561

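For context on what the fix involves, here is a minimal, hypothetical sketch of the pattern the PR title describes: a transformer block that forwards joint_attention_kwargs down to its attention processor instead of silently dropping it. The class names (ScaledAttnProcessor, ToyTransformerBlock) and the attention_scale kwarg are invented for illustration and are not the actual diffusers FLUX implementation.

```python
# Hypothetical sketch only: illustrates forwarding joint_attention_kwargs from a
# transformer block down to its attention processor, which is the behavior the
# PR title describes. Names below are invented for illustration and are not the
# actual diffusers FLUX classes.
from typing import Optional

import torch
import torch.nn as nn


class ScaledAttnProcessor:
    """Toy attention processor that consumes an extra kwarg (attention_scale)."""

    def __call__(
        self,
        attn: nn.MultiheadAttention,
        hidden_states: torch.Tensor,
        attention_scale: float = 1.0,
    ) -> torch.Tensor:
        out, _ = attn(hidden_states, hidden_states, hidden_states)
        return out * attention_scale


class ToyTransformerBlock(nn.Module):
    def __init__(self, dim: int = 64, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.processor = ScaledAttnProcessor()

    def forward(
        self,
        hidden_states: torch.Tensor,
        joint_attention_kwargs: Optional[dict] = None,
    ) -> torch.Tensor:
        # The bug pattern is calling the processor *without* these kwargs, so any
        # user-supplied extras are silently ignored. The fix is to unpack them:
        joint_attention_kwargs = joint_attention_kwargs or {}
        return self.processor(self.attn, hidden_states, **joint_attention_kwargs)


if __name__ == "__main__":
    block = ToyTransformerBlock()
    x = torch.randn(1, 8, 64)
    # attention_scale now actually reaches the processor.
    print(block(x, joint_attention_kwargs={"attention_scale": 0.5}).shape)
```
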
Re-run triggered: September 27, 2024 11:01
Status: Success
Total duration: 29s

pr_dependency_test.yml

on: pull_request
check_dependencies (14s)

Annotations

1 warning in check_dependencies:
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-python@v4. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/