
added a fix to models/xti_attention_processor.py #5

Open
wants to merge 1 commit into base: main

Conversation


@maxwelljones14 commented Aug 12, 2023

Later versions of diffusers (after 0.14.0) no longer have models.cross_attention; they use the models.attention_processor.Attention class instead. This commit adds compatibility with the new class. Similarly, later versions no longer take a cross_attention_norm argument, so compatibility for that is added as well.
EDIT: cross_attention_norm on line 38, inside the hasattr call, should be a string:

hasattr(attn, "cross_attention_norm")
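
As a rough illustration, the compatibility logic described above could look like the following sketch (the helper name wants_cross_attention_norm is hypothetical, not part of this PR's diff):

    # Sketch of the version-compatibility shim described in this PR.
    try:
        # diffusers > 0.14.0 exposes the attention class here
        from diffusers.models.attention_processor import Attention
    except ImportError:
        # diffusers <= 0.14.0 used models.cross_attention.CrossAttention
        from diffusers.models.cross_attention import CrossAttention as Attention

    def wants_cross_attention_norm(attn: Attention) -> bool:
        # Later diffusers versions dropped the cross_attention_norm argument,
        # so guard the attribute access with hasattr; the attribute name must
        # be passed as a string.
        return hasattr(attn, "cross_attention_norm") and bool(attn.cross_attention_norm)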

…ible with later versions of diffusers that deprecated models.cross_attention and attn.cross_attention_norm
@NeuralTextualInversion (Owner)

Looks good. Would you like to also update the requirements.txt file so new users will use the latest diffusers version?
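
For reference, a minimal sketch of that requirements.txt change, assuming a simple lower-bound pin (the exact minimum version is an assumption, not taken from this PR):

    # requirements.txt: hypothetical pin selecting the diffusers versions
    # that provide models.attention_processor.Attention
    diffusers>=0.15.0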
