
[LoRA] make LoRAs trained with peft loadable when peft isn't installed #6306

Merged

sayakpaul merged 3 commits into main from make-peft-sd-work-non-peft on Dec 27, 2023

Conversation

@sayakpaul (Member)

What does this PR do?

Reported by @apolinario. Internal thread: https://huggingface.slack.com/archives/C05GX934Z98/p1703334768186119.

Currently, our LoRA training scripts use peft, so loading the LoRAs they produce requires peft to be installed. However, we want those LoRAs to be loadable without peft too (or so I'd think?).

This PR fixes that.
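For context, the fix roughly amounts to converting the peft-formatted LoRA state dict into the diffusers-native key format at save time, so the checkpoint can be loaded without peft. A minimal sketch, assuming the `get_peft_model_state_dict` (peft) and `convert_state_dict_to_diffusers` (diffusers) utilities; the helper name and call site below are illustrative, not the exact diff:

```python
from peft.utils import get_peft_model_state_dict
from diffusers import StableDiffusionPipeline
from diffusers.utils import convert_state_dict_to_diffusers


def save_unet_lora(unet, output_dir: str) -> None:
    """Serialize the UNet LoRA layers in the diffusers-native key format.

    `unet` is the peft-wrapped UNet coming out of the training loop.
    """
    # Extract the LoRA weights from the peft model, then remap the keys
    # from peft's naming scheme to the one diffusers uses natively.
    unet_lora_state_dict = convert_state_dict_to_diffusers(
        get_peft_model_state_dict(unet)
    )
    # Because the keys are already diffusers-native, the saved file can be
    # loaded with `pipe.load_lora_weights(...)` even if peft isn't installed.
    StableDiffusionPipeline.save_lora_weights(
        save_directory=output_dir,
        unet_lora_layers=unet_lora_state_dict,
    )
```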

@apolinario does this work for you?

TODO

  • Propagate to the other scripts.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@younesbelkada (Contributor) left a comment

Makes sense, thanks! Do we need the same for other training scripts as well? Can you confirm the slow tests pass on a V100 instance?

@sayakpaul (Member, Author)

Do we need the same for other training scripts as well?

@younesbelkada hence the TODO here: #6306 (comment).

Can you confirm the slow tests pass on a V100 instance?

Which ones? Can you pinpoint?

@sayakpaul (Member, Author)

Can you confirm the slow tests pass on a V100 instance?

Ran

RUN_SLOW=1 pytest tests/lora/test_lora_layers_peft.py

Works.

@sayakpaul (Member, Author)

@pacman100 could you also give it a look?

@sayakpaul requested a review from pacman100 December 25, 2023 10:52
@patrickvonplaten (Contributor)

Makes sense! As a general guideline, for now let's try to always keep the exact same serialization format in the example scripts that we had before using PEFT.
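For reference, keeping the pre-PEFT serialization format means users load these LoRAs through the usual diffusers loader without peft needing to be importable. A minimal usage sketch; the base model ID and LoRA path are placeholders:

```python
import torch
from diffusers import StableDiffusionPipeline

# Placeholder base model and LoRA path, purely for illustration.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# The checkpoint keys are already in the diffusers-native format,
# so this call does not depend on peft being installed.
pipe.load_lora_weights("path/to/trained/lora")

image = pipe("a photo of sks dog", num_inference_steps=25).images[0]
image.save("sample.png")
```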

@sayakpaul merged commit 78b87dc into main Dec 27, 2023
16 checks passed
sayakpaul added a commit that referenced this pull request Dec 28, 2023
* add to dreambooth lora.

* add: t2i lora.

* add: sdxl t2i lora.

* style

* lcm lora sdxl.

* unwrap

* fix: enable_adapters().
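For the SDXL scripts touched by the commit above, the same conversion presumably applies to the text-encoder LoRA layers as well. A hedged sketch of that pattern; the helper name and the two text-encoder parameter names are illustrative:

```python
from peft.utils import get_peft_model_state_dict
from diffusers import StableDiffusionXLPipeline
from diffusers.utils import convert_state_dict_to_diffusers


def save_sdxl_lora(unet, text_encoder_one, text_encoder_two, output_dir: str) -> None:
    """Save SDXL LoRA layers (UNet + both text encoders) with diffusers-native keys."""
    StableDiffusionXLPipeline.save_lora_weights(
        save_directory=output_dir,
        unet_lora_layers=convert_state_dict_to_diffusers(
            get_peft_model_state_dict(unet)
        ),
        text_encoder_lora_layers=convert_state_dict_to_diffusers(
            get_peft_model_state_dict(text_encoder_one)
        ),
        text_encoder_2_lora_layers=convert_state_dict_to_diffusers(
            get_peft_model_state_dict(text_encoder_two)
        ),
    )
```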
donhardman pushed a commit to donhardman/diffusers that referenced this pull request Dec 29, 2023

[LoRA] make LoRAs trained with peft loadable when peft isn't installed (huggingface#6306)

* spit diffusers-native format from the get go.

* rejig the peft_to_diffusers mapping.
@sayakpaul deleted the make-peft-sd-work-non-peft branch March 11, 2024 03:00