Commit
remove unused kwargs
huanngzh committed Jan 9, 2025
1 parent 15b5fef commit a44f962
Showing 1 changed file with 0 additions and 2 deletions.
2 changes: 0 additions & 2 deletions src/diffusers/models/attention_processor.py
@@ -2813,7 +2813,6 @@ def __call__(
         encoder_hidden_states: torch.Tensor,
         attention_mask: Optional[torch.Tensor] = None,
         image_rotary_emb: Optional[torch.Tensor] = None,
-        **kwargs,
     ) -> torch.Tensor:
         text_seq_length = encoder_hidden_states.size(1)

Expand Down Expand Up @@ -2885,7 +2884,6 @@ def __call__(
encoder_hidden_states: torch.Tensor,
attention_mask: Optional[torch.Tensor] = None,
image_rotary_emb: Optional[torch.Tensor] = None,
**kwargs,
) -> torch.Tensor:
text_seq_length = encoder_hidden_states.size(1)

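The practical effect of dropping `**kwargs` from these signatures can be sketched with a minimal stand-in (the classes below are hypothetical illustrations, not the diffusers implementation): a lenient signature silently swallows unknown keywords, while the strict one makes a mistyped or unsupported argument fail loudly with a `TypeError` at the call site.

```python
class ProcessorWithKwargs:
    """Stand-in for the old signature: accepts and ignores unknown keywords."""

    def __call__(self, hidden_states, attention_mask=None, **kwargs):
        # A misspelled argument (e.g. attn_mask) lands in kwargs and is
        # silently dropped -- the caller gets no signal anything went wrong.
        return hidden_states


class ProcessorStrict:
    """Stand-in for the new signature: only the supported keywords exist."""

    def __call__(self, hidden_states, attention_mask=None):
        # An unknown keyword now raises TypeError immediately.
        return hidden_states


lenient = ProcessorWithKwargs()
strict = ProcessorStrict()

# Both accept the supported arguments.
lenient([1, 2, 3], attention_mask=None)
strict([1, 2, 3], attention_mask=None)

# A typo slips through the lenient signature unnoticed...
lenient([1, 2, 3], attn_mask=[0, 1, 1])

# ...but is rejected by the strict one.
try:
    strict([1, 2, 3], attn_mask=[0, 1, 1])
except TypeError as exc:
    print("rejected:", exc)
```

The trade-off is the usual one for `**kwargs` catch-alls: they keep old call sites working across API changes, but they also hide caller bugs, which is why removing an unused one tightens the interface at no cost.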
