Realtime use case affordances #241
We do say:
So we could try to be more explicit, but it seems fairly clear that implementations can really do whatever they want (or have to, given the constraints) -- the only penalty is that they will not match other renderers very well if they deviate from the spec. Maybe that is fine for a particular use case; e.g. no one is going to expect a mobile game engine to match Arnold. Adding suggestions for how a game engine can approximate refraction or subsurface scattering seems a bit beyond the scope of the spec, though perhaps we could at least mention that this is a practical problem and suggest some references or resources. Also, there are example implementations available that do some such approximations for real-time (e.g. MaterialX, and Three.js is in progress), though I think we could provide clearer links to those. I also think we should provide suggested translations to some other popular shading models (e.g. Standard Surface, Disney Principled, USDPreviewSurface, glTF) in some easily portable form, which would be a practical solution for rasterizers that already support one of those models.
I'd argue that we need to provide strong guidance for implementors, as opposed to diverting them to other BSDFs. If someone asked me, "Hey Nick, get this running on a phone with 30 characters", the first thing I'd do is draw a graph of OpenPBR's components and how they compose, mark up that graph, and create a matrix of cost versus subjective impact, and I'd populate that matrix with those indications.
I'd also point to concrete reference translations from OpenPBR to PreviewSurface, to glTF, etc., as the MaterialX repo may already have to some degree, and say those are canonical. But those would be very disappointing to use for me personally; I'd rather have the graph and matrix, and then do my best to fulfill an intent, so that when I'm done with my cheap-enough shader, I could qualify any claims I make about its OpenPBR compatibility with some rigor and a rubric. In conclusion, I'd like to see a rubric for simplification and, indeed, some reference attempts to demonstrate "lowest fidelity still useful" or other hints to the developer on how to hit it.
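To make the graph-and-matrix idea above concrete, here is a minimal sketch of what such a cost-versus-subjective-impact rubric could look like. The component names loosely follow OpenPBR's lobes, but the cost and impact numbers are purely illustrative placeholders (not measurements or official guidance), and the whole structure is a hypothetical example, not part of the project:

```python
# Hypothetical sketch of a "cost vs. subjective impact" rubric for trimming
# OpenPBR components on constrained hardware. The ratings below are
# illustrative placeholders only, not benchmarks or official guidance.

from dataclasses import dataclass

@dataclass
class ComponentBudget:
    name: str           # OpenPBR component / lobe
    relative_cost: int  # 1 (cheap) .. 5 (expensive) on the target hardware
    visual_impact: int  # 1 (subtle) .. 5 (defining) for the target content

# Example population for a hypothetical mobile target; a team would fill
# this in from their own profiling and art direction.
budget = [
    ComponentBudget("base (diffuse + metal)", relative_cost=1, visual_impact=5),
    ComponentBudget("specular reflection",    relative_cost=2, visual_impact=5),
    ComponentBudget("coat",                   relative_cost=2, visual_impact=3),
    ComponentBudget("fuzz",                   relative_cost=3, visual_impact=2),
    ComponentBudget("transmission",           relative_cost=5, visual_impact=2),
    ComponentBudget("subsurface",             relative_cost=5, visual_impact=2),
    ComponentBudget("thin-film iridescence",  relative_cost=4, visual_impact=1),
]

# Keep the components that give the most appearance per unit of cost.
ranked = sorted(budget, key=lambda c: c.visual_impact / c.relative_cost, reverse=True)
for c in ranked:
    print(f"{c.name:28s} impact/cost = {c.visual_impact / c.relative_cost:.2f}")
```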
Thanks for the responses.
Is this explicitly spelled out somewhere? I think it's implied, but can I take your comment as an official project stance? I just don't want people to cuss me out if we're different for practical reasons. I mean, they will do that regardless, but I'd like to be backed up by "this is an official stance" rather than being seen as going off on our own. I agree there doesn't need to be explicit language about HOW to handle things in alternate ways; that was just a concern others had, so I figured I'd mention it. I think that is a hole that could be filled in after the fact. If we could do the following, it would go a really long way toward meeting our needs and those of a lot of realtime users:
I don't think I should state an official stance for the project here, as it's a group effort, but really everything should derive from the language of the spec without needing separate official statements. We can argue about the language of the spec, but it should really be clear from that alone; if not, the spec needs to be clarified.
Maybe it's better to put it this way: the spec defines a goal appearance that a practical implementation in an offline renderer can get close to by adhering reasonably closely to the text. It doesn't go further and say that all implementations must do that or something horrible will happen. Nothing will actually happen, except that the implementation won't look like a more full-featured one, in ways that may be subtle or obvious. Whether those differences in appearance amount to "breaking the universe" or merely "detectable (only) by experts" would be answered differently for a VFX renderer, a triple-A game engine, or a renderer for mobile devices. Obviously, it would be best if implementations get as close to the goal as possible, so that users can rely on the appearance being reasonably similar in different renderers.

I guess what you and @meshula are asking for is some more detailed guidance on what changes to the model (omissions, approximations, etc.) would be reasonable in certain contexts, or which changes should be forbidden to avoid horrible artifacts, etc. I think that would be valuable, though it would be an adjunct to the spec. It seems to me that it would be most useful to just have some concrete examples of implementing the model for different use cases, rather than trying to write down elaborate rules and recommendations (beyond what we already say in the spec).

Also, as noted, since shading models suitable for real-time already exist, another approach to the problem is to provide recommended translations to those, i.e. formulas which do the (presumably lossy) mapping from OpenPBR -> model X. A nice approach might be to pick a target model X, write down the translation that makes the most sense, then do a write-up of how that translation was derived, so it's both a readable piece of guidance and a usable solution in the form of a way to generate model X assets from OpenPBR. (I wouldn't want to put this inside the spec itself, though.)
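To illustrate the translation idea above, here is a minimal sketch of a lossy OpenPBR -> glTF metallic-roughness mapping covering only a handful of parameters. The OpenPBR and glTF names follow the respective specs as best I recall them, but the specific correspondences (and the choice of extensions) are assumptions for illustration, not a canonical or endorsed translation:

```python
# Minimal sketch of a lossy OpenPBR -> glTF metallic-roughness mapping.
# Parameter names approximate the two specs; the mapping itself is an
# illustrative assumption, not a canonical translation.

def openpbr_to_gltf(pbr: dict) -> dict:
    """Map a small subset of OpenPBR surface parameters to a glTF material dict."""
    base_color = pbr.get("base_color", [0.8, 0.8, 0.8])
    base_weight = pbr.get("base_weight", 1.0)

    material = {
        "pbrMetallicRoughness": {
            # OpenPBR modulates the base layer by base_weight; glTF has no
            # separate weight, so fold it into the base color factor.
            "baseColorFactor": [c * base_weight for c in base_color] + [1.0],
            "metallicFactor": pbr.get("base_metalness", 0.0),
            "roughnessFactor": pbr.get("specular_roughness", 0.3),
        },
        # Emission: glTF's emissiveFactor is limited to [0,1]; any luminance
        # scaling would need KHR_materials_emissive_strength.
        "emissiveFactor": pbr.get("emission_color", [0.0, 0.0, 0.0]),
        "extensions": {},
    }

    # Features with no core glTF equivalent map (lossily) onto extensions.
    if pbr.get("transmission_weight", 0.0) > 0.0:
        material["extensions"]["KHR_materials_transmission"] = {
            "transmissionFactor": pbr["transmission_weight"],
        }
    if pbr.get("coat_weight", 0.0) > 0.0:
        material["extensions"]["KHR_materials_clearcoat"] = {
            "clearcoatFactor": pbr["coat_weight"],
            "clearcoatRoughnessFactor": pbr.get("coat_roughness", 0.1),
        }
    return material
```

A write-up accompanying such a mapping could then document what is dropped (subsurface, fuzz, thin-film, etc.) and why, which is the "readable guidance plus usable solution" combination described above.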
Not sure about this. I think it probably deserves a separate ticket as a proposed change to the model.
To riff off what you're saying:
I think this is basically what I'd like the project to just explicitly state. Just a paragraph like this would solve it for me:
If there were just that one caveat somewhere in the opening README, it would unblock adoption in situations where I currently need to defend against the idea that it must support everything.
As requested, I added another issue for AO (#242) to track it separately.
Sounds good. I'll make a PR with your proposed change, and we can refine it further there.
Awesome, thanks |
We'd discussed this in a few other meetings, but I'd like to formally request some affordances in OpenPBR for realtime use cases, where realtime could include:
The primary things that come to mind:
It's difficult for many rasterizers to support some features, like transmission and subsurface. It would be great to allow for the following:
Realtime renderers often want ambient occlusion maps, which are not something OpenPBR supports. It would be great to have some kind of officially supported way to do occlusion for renderers that prefer that route.
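For context, a common realtime convention is to apply a baked AO map only to the indirect (ambient/image-based) lighting term and leave direct lighting untouched. The sketch below assumes that convention; the hypothetical `ambient_occlusion` input is an assumption for illustration and not something OpenPBR currently defines:

```python
# Minimal sketch of the common realtime AO convention: a baked occlusion map
# attenuates indirect (ambient/image-based) lighting only, leaving direct
# lighting untouched. The `ambient_occlusion` input is hypothetical; OpenPBR
# does not currently define it.

def shade(direct_lighting, indirect_lighting, ambient_occlusion=1.0):
    """Combine direct and indirect lighting, occluding only the indirect term.

    Both lighting arguments are per-pixel RGB values already evaluated by the
    renderer; ambient_occlusion is a scalar sampled from a baked AO texture.
    """
    return [d + ambient_occlusion * i
            for d, i in zip(direct_lighting, indirect_lighting)]

# Example: full direct light, dim ambient, half-occluded pixel.
color = shade(direct_lighting=[0.6, 0.5, 0.4],
              indirect_lighting=[0.2, 0.2, 0.25],
              ambient_occlusion=0.5)
```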