What happened?

Issue ported from the now-closed bug report forums on Discord. This is a known (and old) but niche bug: sampling during training fails with an IndexError when using any noise scheduler other than DDIM.
What did you expect would happen?

Sampling should not fail when using any scheduler other than DDIM.
Relevant log output

step: 100%|███████████████████████████████████████████| 16/16 [00:46<00:00, 2.93s/it, loss=0.0571, smooth loss=0.0512]
step: 100%|███████████████████████████████████████████| 16/16 [00:44<00:00, 2.76s/it, loss=0.0403, smooth loss=0.0512]
step: 100%|████████████████████████████████████████████| 16/16 [00:43<00:00, 2.70s/it, loss=0.0693, smooth loss=0.052]
step: 100%|████████████████████████████████████████████| 16/16 [00:43<00:00, 2.70s/it, loss=0.0555, smooth loss=0.052]
step: 100%|███████████████████████████████████████████| 16/16 [00:43<00:00, 2.70s/it, loss=0.0577, smooth loss=0.0522]
step: 100%|███████████████████████████████████████████| 16/16 [00:43<00:00, 2.69s/it, loss=0.0605, smooth loss=0.0519]
step: 100%|███████████████████████████████████████████| 16/16 [00:43<00:00, 2.70s/it, loss=0.0496, smooth loss=0.0518]
step: 100%|███████████████████████████████████████████| 16/16 [00:43<00:00, 2.74s/it, loss=0.0612, smooth loss=0.0519]
step: 100%|███████████████████████████████████████████| 16/16 [00:44<00:00, 2.79s/it, loss=0.0532, smooth loss=0.0524]
step: 100%|█████████████████████████████████████████████| 16/16 [00:44<00:00, 2.76s/it, loss=0.05, smooth loss=0.0521]
sampling:  97%|█████████████████████████████████████████████████████████████████████▋ | 30/31 [00:07<00:00, 3.89it/s]
Traceback (most recent call last):
  File "C:\repos\OneTrainer\modules\trainer\GenericTrainer.py", line 254, in __sample_loop
    self.model_sampler.sample(
  File "C:\repos\OneTrainer\modules\modelSampler\StableDiffusionXLSampler.py", line 471, in sample
    image = self.__sample_base(
  File "C:\repos\OneTrainer\venv\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "C:\repos\OneTrainer\modules\modelSampler\StableDiffusionXLSampler.py", line 171, in __sample_base
    latent_image = noise_scheduler.step(
  File "C:\repos\OneTrainer\venv\src\diffusers\src\diffusers\schedulers\scheduling_euler_ancestral_discrete.py", line 415, in step
    sigma_to = self.sigmas[self.step_index + 1]
IndexError: index 31 is out of bounds for dimension 0 with size 31
Error during sampling, proceeding without sampling
sampling: 100%|████████████████████████████████████████████████████████████████████████| 31/31 [00:07<00:00, 4.01it/s]
step: 100%|███████████████████████████████████████████| 16/16 [01:06<00:00, 4.18s/it, loss=0.0376, smooth loss=0.0525]
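For context, here is a minimal sketch of the off-by-one behind the IndexError. It is plain Python, not the actual diffusers code: `TinyAncestralScheduler` is a hypothetical stand-in whose `sigmas` and `step_index` names mirror the traceback. Ancestral-style schedulers look up the *next* sigma on every step, so driving the scheduler for one step more than its sigma table allows reads past the end of the table, which is what the traceback shows.

```python
class TinyAncestralScheduler:
    """Simplified stand-in for an ancestral scheduler (NOT the diffusers class).

    set_timesteps(n) builds n + 1 sigmas (one per timestep plus a final 0),
    and each step() reads sigmas[step_index + 1] -- the *next* noise level.
    """

    def set_timesteps(self, num_inference_steps):
        # n timesteps -> n + 1 sigmas, ending at 0.0
        self.sigmas = [float(num_inference_steps - i)
                       for i in range(num_inference_steps)] + [0.0]
        self.step_index = 0

    def step(self):
        sigma_from = self.sigmas[self.step_index]
        # Fails with IndexError if step() is called once more than there
        # are transitions in the sigma table -- the bug in the log above.
        sigma_to = self.sigmas[self.step_index + 1]
        self.step_index += 1
        return sigma_from, sigma_to


sched = TinyAncestralScheduler()
sched.set_timesteps(31)

for _ in range(31):
    sched.step()          # 31 calls are fine; the last reads the final 0.0

try:
    sched.step()          # one extra call indexes past the end of sigmas
except IndexError as e:
    print("extra step raises:", e)
```

DDIM does not hit this because its step only uses the current timestep, so an extra sampling-loop iteration does not read past any table; ancestral schedulers need the following sigma and crash instead. The exact counts in the real traceback differ from this toy, but the mechanism is the same.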
Output of pip freeze

N/A. Occurs in every single version of OT.