Fix another small bug in attention_bias redux (#6737)
* fix a bug in the attn_masked redux code when using weight=1.0

* oh shit wait there was another bug
Slickytail authored Feb 7, 2025
1 parent 079eccc commit 832e3f5
Showing 1 changed file with 2 additions and 2 deletions.
nodes.py: 4 changes (2 additions & 2 deletions)
@@ -1065,10 +1065,10 @@ def apply_stylemodel(self, conditioning, style_model, clip_vision_output, strength, strength_type):
             (txt, keys) = t
             keys = keys.copy()
             # even if the strength is 1.0 (i.e, no change), if there's already a mask, we have to add to it
-            if strength_type == "attn_bias" and strength != 1.0 and "attention_mask" not in keys:
+            if "attention_mask" in keys or (strength_type == "attn_bias" and strength != 1.0):
                 # math.log raises an error if the argument is zero
                 # torch.log returns -inf, which is what we want
-                attn_bias = torch.log(torch.Tensor([strength]))
+                attn_bias = torch.log(torch.Tensor([strength if strength_type == "attn_bias" else 1.0]))
                 # get the size of the mask image
                 mask_ref_size = keys.get("attention_mask_img_shape", (1, 1))
                 n_ref = mask_ref_size[0] * mask_ref_size[1]
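
For readers without the full nodes.py context, here is a minimal standalone sketch of the fixed branch. The values of strength, strength_type, and keys are hypothetical stand-ins for what apply_stylemodel receives; the loop and the mask-resizing code that follow in the real function are omitted.

    import torch

    # Hypothetical inputs standing in for what apply_stylemodel receives.
    strength = 0.0                  # a weight of 0.0 should fully mask out the style tokens
    strength_type = "attn_bias"
    keys = {"attention_mask": torch.zeros(1, 8, 8)}  # mask left behind by an earlier node

    # The old condition skipped this branch whenever strength == 1.0, even if an
    # attention_mask already existed, so the existing mask was never extended to
    # cover the style tokens. The new condition also enters when a mask is present.
    if "attention_mask" in keys or (strength_type == "attn_bias" and strength != 1.0):
        # torch.log(torch.Tensor([0.0])) returns tensor([-inf]) (fully masked),
        # whereas math.log(0) would raise ValueError; hence torch.log here.
        attn_bias = torch.log(torch.Tensor([strength if strength_type == "attn_bias" else 1.0]))
        print(attn_bias)  # tensor([-inf]) for strength == 0.0

Note the second fix: when strength_type is not "attn_bias" but a mask is already present, the bias evaluates to torch.log(1.0) == 0.0, so the style tokens are appended to the existing mask without being up- or down-weighted.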
