WGPU preferring a low power graphics adapter by default #2810
Comments
Some input on this (my opinion):
Ya, I can see it being useful for rare debugging purposes. I might try upgrading iced to the latest wgpu crate version, if not too many things decide to break during the upgrade.
I agree that this behavior should probably be configurable in some way, for whoever needs it. I'm a little skeptical about the battery life benefits on laptops from my knowledge of computers, but I have mostly only investigated desktops, so I can't really speak on this topic much, aside from pointing out that it would be nice if somebody could benchmark this for us. For now, though, I can propose setting the GPU preference based on whether iced can detect a battery on the device (I will investigate whether that is even possible a little later).
I sadly haven't really gone through those issues, and I would really appreciate it if you could link them. From the way you phrased it, it sounds like the reasons for the poor performance might be unknown, so it might be worthwhile checking whether the poor performance comes down to this exact issue I'm pointing out.
I use it to force my integrated GPU on my laptop hahaha
I can completely turn off my dedicated GPU on my laptop while it's not in use, saving me around 7W of total consumption in comparison to an idle DGPU. Having it on would halve my battery life from ~10h to ~5h.
That's just my own experience; I haven't reported it because I haven't gotten around to testing whether the master branch exhibits the same behavior, or to figuring out why it happens.
@edwloef By the way, I almost forgot: testing software rendering yields better performance on my machine. So I was wondering, have you ever measured the performance and battery life difference between iGPU and CPU rendering on your laptop?
Turns out this was why my system failed to run any iced applications...
(This is using the iGPU of a Ryzen 9 9900X, attempting to run the todos example.) In case others have a similar problem and have a compatible dGPU that iced isn't using, just setting the env variable works. It would be nice to have iced keep trying other adapters to find one that's compatible instead of just failing. Or better yet, not use adapters that aren't compatible at all... Not sure if I should create a new issue for this, though, considering it can be temporarily worked around.
The PR above still prioritizes the env variable above all else if it is set, at least as of now.
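For context, an env-variable-first resolution like the one described can be sketched as follows. This is a self-contained sketch, not iced's actual code: the `PowerPreference` enum is a mock standing in for `wgpu::PowerPreference`, and the accepted values `"low"`/`"high"` are my assumption about what `WGPU_POWER_PREF` expects.

```rust
use std::env;

// Mock of wgpu::PowerPreference so this sketch compiles without the wgpu crate.
#[derive(Debug, PartialEq)]
enum PowerPreference {
    LowPower,
    HighPerformance,
}

// Parse an explicit user preference from an env-var value, if any.
// Value names ("low"/"high") are an assumption about WGPU_POWER_PREF.
fn parse_power_preference(value: Option<&str>) -> Option<PowerPreference> {
    match value {
        Some("low") => Some(PowerPreference::LowPower),
        Some("high") => Some(PowerPreference::HighPerformance),
        _ => None,
    }
}

fn main() {
    // Env variable wins if present; otherwise fall back to a default.
    let pref = parse_power_preference(env::var("WGPU_POWER_PREF").ok().as_deref())
        .unwrap_or(PowerPreference::HighPerformance);
    println!("resolved preference: {:?}", pref);
}
```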
Is your issue REALLY a bug?
Is there an existing issue for this?
Is this issue related to iced?
What happened?
In cases where a user has both a discrete and an integrated GPU, iced currently prefers the integrated GPU. This is a huge problem: not only is it the low-powered option, it is also practically always not the GPU driving the user's output device, which results in severe performance issues.
What is the expected behavior?
Currently wgpu picks the GPU to utilize by first calling
wgpu::util::power_preference_from_env()
(which was removed in the latest version of wgpu). This checks for the user's preference based on their environment variables, which in practice are never set, so it returns None.
The power preference is then decided by checking whether anti-aliasing is enabled. I'm not exactly sure what the logic behind this decision was at the time, but I also don't know much about low-powered devices, so I digress (this becomes relevant below, where I speculate on when the LowPower preference should be set).
Now, intuitively one might think it is checking whether the user has anti-aliasing set as a system default: if it isn't enabled, that most likely means they are using a device that is either low-powered or trying to preserve battery as much as possible, and in both cases the LowPower preference is probably preferred.
In reality, after tracking down where the setting gets initialized, you are led to the initialization of the Program or Daemon, which means it is actually just checking whether the app itself has anti-aliasing enabled, which by default it does not.
Sourced from the iced::daemon function:
Sourced from wgpu Settings definition:
Depending on the simplicity of the app and the power of the integrated GPU, the performance hit might mostly be noticeable only on window resize. In other cases, however, the current behavior might result in a lot of headache for both the users and the developers.
Also, as a side note: it feels like not everyone who might be using this crate is capable of debugging this issue themselves, so fixing it might also resolve issues like #2750.
TLDR:
Replacing this:
To this:
This gives people who have both an integrated and a discrete GPU a 5000% increase in performance on average, while also making this code compatible with newer versions of the wgpu crate.
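Since the before/after snippets aren't reproduced above, this is a hedged sketch of what the TLDR change amounts to as I read it: keep honoring an explicit environment override, but make the fallback HighPerformance instead of deriving it from the app's anti-aliasing setting. As before, the enum is a mock standing in for `wgpu::PowerPreference`.

```rust
// Mock of wgpu::PowerPreference so this sketch compiles without the wgpu crate.
#[derive(Debug, PartialEq)]
enum PowerPreference {
    LowPower,
    HighPerformance,
}

// Proposed behavior: an explicit env override (e.g. WGPU_POWER_PREF) still
// wins, but the fallback is always HighPerformance, so machines with both an
// iGPU and a dGPU render on the discrete adapter by default.
fn power_preference(env_override: Option<PowerPreference>) -> PowerPreference {
    env_override.unwrap_or(PowerPreference::HighPerformance)
}

fn main() {
    println!("default preference: {:?}", power_preference(None));
}
```

The anti-aliasing parameter simply disappears from the decision, which is also what makes the code independent of the API that newer wgpu versions removed.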
I can push this change if I get the green light, but if there are any wishes to handle this better, I can look into implementing some other solution to this problem. Just note that I don't have a Mac on me right now for writing platform-specific code.
Sources:
https://github.com/iced-rs/iced/blob/master/wgpu/src/window/compositor.rs#L84-L89
https://github.com/iced-rs/iced/blob/master/src/daemon.rs#L74-L84
https://github.com/iced-rs/iced/blob/master/wgpu/src/settings.rs#L32-L42
Version
master
Operating System
Windows
Do you have any log output?