
OutOfMemoryError - is there a --lowmem style option? #64

Open
salamanders opened this issue Feb 17, 2024 · 2 comments

@salamanders

My hardware:

  • torch.cuda.get_device_name(0) = NVIDIA GeForce GTX 1080 Ti
  • torch.cuda.get_device_properties(0).total_memory = 10.91 GB

I get an OutOfMemoryError in the notebook at the line
models_b = core_b.setup_models(extras_b, skip_clip=True)

I'm coming from Automatic1111, which does run XL models (just barely!). Any hints on getting this to fit?
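
For reference, the kind of knob I'm hoping for is something that forces fp16 / partial offload. A rough sketch of what I mean is below; it's untested and assumes the fields of `models_b` are ordinary `torch.nn.Module` instances, which I haven't verified against the actual API:

```python
import torch

# Report how much VRAM the card actually has, in GB.
device = torch.device("cuda:0")
total_gb = torch.cuda.get_device_properties(device).total_memory / 1024**3
print(f"Total VRAM: {total_gb:.2f} GB")

# Hypothetical workaround (untested): build the models, then cast the large
# modules to fp16 before they have to fit on the ~11 GB card.
# `core_b`, `extras_b`, and `models_b` are the objects from the notebook;
# this assumes the attributes of `models_b` are plain torch.nn.Module objects.
#
# models_b = core_b.setup_models(extras_b, skip_clip=True)
# for name, module in vars(models_b).items():
#     if isinstance(module, torch.nn.Module):
#         module.half().to(device)   # fp16 roughly halves the memory footprint
# torch.cuda.empty_cache()
```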

@FurkanGozukara

We added amazing optimizations, check this out: #125

@salamanders
Author

> We added amazing optimizations, check this out: #125

Sorry, this leads to a Patreon subscription page? Are you associated with this project?
