sd-webui-adjust_lora_filename is an Extension for the AUTOMATIC1111 web UI. The Extension adjusts the output name in the metadata of a LoRA model file to the currently stored file name.
At the moment I am using the AUTOMATIC1111 web UI [1] together with the Extension TrainTrain [3] for LoRA modelling. The Extension under development here should be compatible with the current AUTOMATIC1111 version. It should work with LoRA models trained with AUTOMATIC1111 and TrainTrain as well as with LoRA models from other sources. The LoRA model file must be in .safetensors format. Apart from incorrect data, the metadata is generally not changed.
As I have already written in other places, there is a problem if the LoRA file name without extension differs from the output name in the metadata of the LoRA file itself. The label in the LoRA calling expression of a given Prompt can then differ from the stored file name without extension, which is irritating while using a LoRA model. The task of this Extension is to eliminate this discrepancy between the two names.
I will explain the statement above about the LoRA behaviour using a real-life example. Let's say we are using the following Prompt:
cat with hat, <lora:FILE_NAME:WEIGHT>
Normally FILE_NAME is the file name without extension of the file stored on the hard disc or on whatever storage medium the file is located. If the output name in the metadata differs from the stored file name, selecting a LoRA in AUTOMATIC1111 instead leads to the following Prompt:
cat with hat, <lora:OUTPUT_NAME:WEIGHT>
If someone is using a lot of LoRA models, this behaviour is really confusing.
Typical metadata in a slightly abbreviated form, pretty printed as JSON data, looks like this:
```json
{
  "ss_base_model_version": "sd_v1",
  "ss_output_name": "LighthouseConceptLora",
  "ss_optimizer": "adamw",
  "ss_network_dim": "16",
  "ss_tag_frequency": {
    "1": {
      "lighthouse": 12
    }
  },
  "ss_lr_warmup_steps": "250",
  "ss_min_snr_gamma": "5.0",
  "ss_lr_scheduler": "cosine",
  "ss_lr_scheduler_power": "1.0",
  "ss_network_alpha": "8",
  "ss_learning_rate": "0.0001",
  "ss_max_train_steps": "1000",
  "ss_lr_scheduler_num_cycles": "1",
  "ss_v2": "False"
}
```
The tag ss_output_name in the JSON data contains the output name, which initially equals the file name.
If one renames LighthouseConceptLora.safetensors to lighthouseconcept.safetensors in the LoRA model subdirectory on the hard disc, exactly the behaviour described above occurs: there is now a mismatch between LighthouseConceptLora and lighthouseconcept.
A .safetensors file consists of a header and a binary part containing the tensors. Most of the time the header also contains metadata, and one tag of this metadata specifies the output name. As long as the file name is not changed, the output name equals the file name.
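This header layout can be read with nothing but the Python standard library: a .safetensors file starts with an 8-byte little-endian integer giving the length of a UTF-8 JSON header, whose optional `__metadata__` key holds string-valued metadata such as ss_output_name. The following is my own minimal sketch, not code from the Extension; the function name `read_safetensors_metadata` is a made-up example:

```python
import json
import struct

def read_safetensors_metadata(path):
    """Return the __metadata__ dict of a .safetensors file (or {} if absent)."""
    with open(path, "rb") as f:
        # First 8 bytes: little-endian uint64 with the length of the JSON header.
        header_len = struct.unpack("<Q", f.read(8))[0]
        # Next header_len bytes: the JSON header itself.
        header = json.loads(f.read(header_len).decode("utf-8"))
    # Training metadata like ss_output_name lives under the __metadata__ key.
    return header.get("__metadata__", {})
```

For example, `read_safetensors_metadata("LighthouseConceptLora.safetensors").get("ss_output_name")` would return the stored output name.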
One can select a LoRA file from a dropdown menu; sorting in alphabetically ascending and descending order is possible. The selected file name without extension is shown in a textbox on the left side, the output name from the metadata in a textbox on the right side. In parallel, the JSON data is shown in a textbox underneath. After clicking the Adjust button, the Extension tries to change the metadata tag which is responsible for the output name. Afterwards one can check by clicking the Update button whether the operation was successful.
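The core of the Adjust step can be sketched without the web UI. The sketch below is my own stdlib-only illustration of the idea, not the Extension's actual code: it reads the JSON header, sets ss_output_name to the stored file name without extension, and rewrites the file with an updated length prefix, leaving the binary tensor data untouched.

```python
import json
import os
import struct

def adjust_output_name(path):
    """Set ss_output_name in a .safetensors header to the file name without extension."""
    with open(path, "rb") as f:
        # 8-byte little-endian length prefix, then the JSON header.
        header_len = struct.unpack("<Q", f.read(8))[0]
        header = json.loads(f.read(header_len).decode("utf-8"))
        tensors = f.read()  # binary tensor data, kept byte-for-byte
    meta = header.setdefault("__metadata__", {})
    # The stored file name without extension becomes the new output name.
    meta["ss_output_name"] = os.path.splitext(os.path.basename(path))[0]
    new_header = json.dumps(header).encode("utf-8")
    with open(path, "wb") as f:
        f.write(struct.pack("<Q", len(new_header)) + new_header + tensors)
```

A real implementation should additionally write to a temporary file first and rename it afterwards, so a crash mid-write cannot corrupt the model file.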
Go to the tab Extensions and then to the tab Install from URL. The installation link is:
https://github.com/zentrocdot/sd-webui-adjust_lora_filename
No known problems yet.
I need an extension with which I can modify the maximal training steps. At the moment I am performing more training steps than I actually use in the final version of the LoRA model, so there is a mismatch between the maximal training steps and the training steps really used.
The Extension was developed and tested on a machine with a Debian based Linux distribution installed, using the web UI AUTOMATIC1111 with the following specification:
- API: v1.10.0
- Python: 3.10.14
- torch: 2.1.2+cu121
- xformers: 0.0.23.post1
- gradio: 3.41.2
AUTOMATIC1111 uses Gradio to programme the web user interface. The Gradio version used is extremely buggy and outdated: my locally installed Gradio version is currently 5.0.1, while AUTOMATIC1111 is using version 3.41.2. According to some forum posts, keeping the outdated Gradio version is given priority over an adaptation or update. This does not really motivate one to programme extensions.
Even in the Python Virtual Environment, which AUTOMATIC1111 is using, the subsequent installation of Python modules is not unproblematic. There is often a mismatch in versions and dependencies. It must be clarified on a case-by-case basis how problematic the corresponding warning or error messages are. So far I have been able to solve every problem that has arisen.
I am still looking for a good documentation on how to integrate custom Extensions into AUTOMATIC1111. So far I have to resort to analysing other extensions and also having a look at the code of AUTOMATIC1111. This is a very unsatisfactory approach.
[1] https://github.com/AUTOMATIC1111/stable-diffusion-webui
[2] https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Extensions
[3] https://github.com/hako-mikan/sd-webui-traintrain
There are various ways to support my work. One option is to purchase some of my extraordinary NFTs 😃. Some of my great collections can be found here:
- https://opensea.io/collection/fantastic-mushroom-collection
- https://opensea.io/collection/cats-with-hats-collection-1
- https://opensea.io/collection/devil-woman-collection
- https://opensea.io/collection/cup-of-ice-no-1
I loved the time when you could also get a hamburger 🍔 for one Euro!
If you like what I present here, or if it helps you, or if it is useful, you are welcome to donate a small contribution or a cup of coffee. Or as you might say: Every TRON counts! Many thanks in advance! 😃
- TQamF8Q3z63sVFWiXgn2pzpWyhkQJhRtW7 (TRON)
- DMh7EXf7XbibFFsqaAetdQQ77Zb5TVCXiX (DOGE)
- 12JsKesep3yuDpmrcXCxXu7EQJkRaAvsc5 (BITCOIN)
- 0x31042e2F3AE241093e0387b41C6910B11d94f7ec (Ethereum)