Commit

apply suggestions from review
a-r-r-o-w committed Jan 15, 2025
1 parent 40fc7a5 · commit 248f103
1 changed file: src/diffusers/hooks/pyramid_attention_broadcast.py (1 addition, 2 deletions)
@@ -221,8 +221,7 @@ def apply_pyramid_attention_broadcast(
             # cannot be applied to this layer. For custom layers, users can extend this functionality and implement
             # their own PAB logic similar to `_apply_pyramid_attention_broadcast_on_attention_class`.
             continue
-        if isinstance(submodule, (Attention, MochiAttention)):
-            _apply_pyramid_attention_broadcast_on_attention_class(name, submodule, config)
+        _apply_pyramid_attention_broadcast_on_attention_class(name, submodule, config)
 
 
 def _apply_pyramid_attention_broadcast_on_attention_class(
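The diff removes a redundant `isinstance` check: earlier in the loop, submodules whose class is not a supported attention class already hit `continue`, so by the time this line runs the submodule is known to be an attention layer. A minimal sketch of that loop pattern (the class names and `_apply_hook` helper here are stand-ins for the real diffusers internals, not the library's API):

```python
# Stand-in classes; the real code checks diffusers' Attention/MochiAttention.
class Attention: ...
class MochiAttention: ...
class FeedForward: ...

_ATTENTION_CLASSES = (Attention, MochiAttention)

applied = []  # record which submodules the hook was applied to

def _apply_hook(name, submodule, config):
    # Stand-in for _apply_pyramid_attention_broadcast_on_attention_class.
    applied.append(name)

def apply_hooks(named_modules, config):
    for name, submodule in named_modules:
        if not isinstance(submodule, _ATTENTION_CLASSES):
            # Unsupported layer class: skip. This early `continue` is what
            # made a second isinstance check before the call redundant.
            continue
        # Here the submodule is guaranteed to be an attention layer,
        # so the hook can be applied unconditionally.
        _apply_hook(name, submodule, config)

modules = [("blocks.0.attn", Attention()), ("blocks.0.ff", FeedForward())]
apply_hooks(modules, config=None)
print(applied)  # ['blocks.0.attn']
```

With the guard inverted into an early `continue`, the second check could never be false, which is why the review suggested dropping it.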
