Proposal: More detailed description of PBR in glTF 2.0 #1717
Conversation
I really appreciate that. Getting started with (i.e. trying to implement) PBR from scratch can be hard, and a detailed description like that can certainly help here. But note that there might be some overlap with things like https://github.com/KhronosGroup/glTF-Sample-Viewer#physically-based-materials-in-gltf-20 . We should either make sure that this is really an overlap (meaning that they are describing the same thing, and do not refer to different papers with different formulas), or figure out "the one and only place" where this could be described in detail (and then just link to that).
Of course, changing this section must not make existing implementations like the sample viewer incompatible with the glTF 2.0 spec. If the proposal here is incompatible with what has been in this section before, then it is a mistake. I have to admit that I didn't check if Appendix B in the spec agrees with the glTF-Sample-Viewer documentation. To give a bit more context: We want to have a glTF PBR (Next) compatible material model in a GI renderer. Unfortunately, the equations given in the spec are not well-suited for global illumination, as they break some fundamental properties of physical light transport. At the moment, Appendix B is non-normative, so we could just ignore it. However, I thought this is a good opportunity to improve it. There are three key aspects of the proposal:
specification/2.0/README.md
**Fresnel Schlick**

The multi-scattering approximation uses the directional albedo `E_m` and average albedo `E_mavg` of the single-scattering model. These values are typically precomputed and fetched from a lookup table during rendering. The average Fresnel `F_mavg` for Schlick can be computed analytically.
This is a bit of a stumbling point for me. Should we link to Enterprise PBR Shading Model for background? Can we provide these precomputed lookup tables? I see that your github provides these here:
- GGX_E
- GGX_E_avg

It might also be worth mentioning the numerical fit described in Energy Preservation for those who are willing to accept some energy-conservation error in exchange for one fewer floating-point texture lookup.
Worth noting that I came across this when trying to see how a new gltf material would be written, using this appendix as reference. I really appreciate the well defined abstractions and diagrams that have been developed thus far; looking forward to seeing the final writeup!
Not sure if we should link to specific implementation in the spec, is it possible to do that? I was also thinking about describing how to build the tables (similar to this), but maybe that is too much detail and we should link to the slides from Imageworks or other implementation-independent sources. We could also put the tables into the glTF repo.
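Not to prescribe anything for the spec, but as a rough sketch of what building such a table could look like (assuming the single-scattering GGX lobe with separable Smith masking and Fresnel set to 1; all function names below are illustrative and not from this PR):

```python
import math
import random

# Rough sketch (illustrative names): estimate the directional albedo
# E_m(NdotV, alpha) of a single-scattering GGX microfacet lobe with separable
# Smith masking and Fresnel set to 1, by importance-sampling the GGX NDF.

def smith_g1(n_dot_x, alpha):
    # Separable Smith masking/shadowing term for GGX.
    a2 = alpha * alpha
    return 2.0 * n_dot_x / (n_dot_x + math.sqrt(a2 + (1.0 - a2) * n_dot_x * n_dot_x))

def directional_albedo(n_dot_v, alpha, samples=1024):
    # View vector in tangent space (normal = +z); azimuth is irrelevant for an isotropic NDF.
    v = (math.sqrt(max(0.0, 1.0 - n_dot_v * n_dot_v)), 0.0, n_dot_v)
    total = 0.0
    for _ in range(samples):
        u1, u2 = random.random(), random.random()
        # Sample a half vector h proportional to D(h) * (n.h).
        phi = 2.0 * math.pi * u1
        cos_th = math.sqrt((1.0 - u2) / (1.0 + (alpha * alpha - 1.0) * u2))
        sin_th = math.sqrt(max(0.0, 1.0 - cos_th * cos_th))
        h = (sin_th * math.cos(phi), sin_th * math.sin(phi), cos_th)
        v_dot_h = v[0] * h[0] + v[1] * h[1] + v[2] * h[2]
        # Reflect v about h to get the light direction l.
        l = tuple(2.0 * v_dot_h * h[i] - v[i] for i in range(3))
        n_dot_l, n_dot_h = l[2], h[2]
        if n_dot_l <= 0.0 or v_dot_h <= 0.0:
            continue
        g = smith_g1(n_dot_v, alpha) * smith_g1(n_dot_l, alpha)
        # (BRDF * NdotL / pdf) simplifies to G * (v.h) / ((n.v) * (n.h)) when F = 1.
        total += g * v_dot_h / (n_dot_v * n_dot_h)
    return total / samples

def average_albedo(alpha, steps=32):
    # E_mavg(alpha) = 2 * integral_0^1 E_m(mu, alpha) * mu dmu (midpoint rule).
    return sum(2.0 * directional_albedo((i + 0.5) / steps, alpha) * ((i + 0.5) / steps)
               for i in range(steps)) / steps
```

A renderer would then typically bake `directional_albedo` over a grid of (`NdotV`, `alpha`) into a 2D texture and `average_albedo` into a 1D texture over `alpha`, matching the "precomputed and fetched from a lookup table" wording above.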
specification/2.0/README.md
Simplified implementation of Fresnel from [An Inexpensive BRDF Model for Physically based Rendering](https://www.cs.virginia.edu/~jdl/bib/appearance/analytic%20models/schlick94b.pdf) by Christophe Schlick.

```
(1 - E_m(VdotN)) * (1 - E_m(LdotN))
```
Should `E_m(VdotN)` be `E_m(VdotN, alpha)` and `E_mavg` be `E_mavg(alpha)`?
Yes, `E_m` and `E_mavg` depend on `alpha`. I omitted it here to improve readability, but it is confusing. I will add it. Thanks!
```
diffuse_brdf() * diffuse_weight(VdotN, LdotN, f0) +
microfacet_brdf(alpha) * fresnel(HdotV, f0) +
multiscatter_microfacet_brdf(alpha) * multiscatter_fresnel(f0)
```
Is this equation correct? I don't see where baseColor factors into dielectric materials. Is this f0 (0.04) or the full reflectance color (normal incidence angle)?
In dielectric materials, baseColor defines the color of the diffuse component. It's missing in this equation, fixed it.
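Presumably the fixed combination scales the diffuse lobe by the base color, i.e. something along the lines of (illustrative only, not the exact wording of the fix; the terms correspond to `diffuse_brdf`, `diffuse_weight`, `microfacet_brdf`, `fresnel`, `multiscatter_microfacet_brdf` and `multiscatter_fresnel` in the snippet above):

$$
f_{\mathrm{dielectric}} = \mathit{baseColor} \cdot f_{\mathrm{diffuse}} \cdot w_{\mathrm{diffuse}} \;+\; f_{\mathrm{specular}} \cdot F \;+\; f_{\mathrm{ms}} \cdot F_{\mathrm{ms}}
$$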
specification/2.0/README.md
![](figures/lightingF.PNG)

```
F_mavg = 1/21 * baseColor + 20/21
```
Is this equation correct or should it be 1/21 + 20/21 * baseColor? Same below in the dielectric section.
Fixed, thanks!
@proog128 Please bump with a comment here when this is ready for the next review, thanks!
Some minor comments from a quick look.
specification/2.0/README.md
```
material = mix(dielectric_brdf, metal_brdf, metallic)
```
The metal BRDF is based on a microfacet model which describes the orientation of microfacets on the surface as a statistical distribution. The distribution is controlled by a parameter called roughness, varying between 0 (smooth surface) and 1 (rough surface). As most microfacet distributions the blending between the two extremes does not behave linear, the material's `roughness` parameter is squared before using it as the distribution's roughness.
Suggested change:

> The metal BRDF is based on a microfacet model which describes the orientation of microfacets on the surface as a statistical distribution. The distribution is controlled by a parameter called roughness, varying between 0 (smooth surface) and 1 (rough surface). As the blending between the two extremes does not behave linearly, like most microfacet distributions, the material's `roughness` parameter is squared before using it as the distribution's roughness.
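In other words, the remapping being described is simply

$$
\alpha = \mathit{roughness}^2
$$

with $\alpha$ the roughness parameter of the microfacet distribution.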
specification/2.0/README.md
![](figures/pbr.png)
The glTF spec is designed to allow applications to choose different lighting implementations based on their requirements. Some implementations may focus on an accurate simulation of light transport, others may choose to deliver real-time performance. Therefore, any implementation that adheres to the rules for mixing BRDFs is compliant to the glTF spec.
Suggested change:

> The glTF spec is designed to allow applications to choose different lighting implementations based on their requirements. Some implementations may focus on an accurate simulation of light transport while others may choose to deliver real-time performance. Therefore, any implementation that adheres to the rules for mixing BRDFs is conformant to the glTF spec.
I think it should be 'conformant'
Oops :) Updated my suggestion.
specification/2.0/README.md
In a physically-accurate light simulation, the BRDFs have to follow some basic principles: the BRDF has to be positive, reciprocal and energy conserving. This ensures that the visual output of the simulation is independent of the underlying rendering algorithm, as long as it is unbiased. The specification will provide a mathematical model that allows implementations to achieve an exact result in an unbiased renderer. Note that unbiased renderers may still decide to deviate from the specification to achieve better visual quality.
The unbiased light simulation with physically realistic BRDFs will be the ground-truth for approximations in real-time renderers that are often biased, but still give visually pleasing results. Usually, these renderers take short-cuts to solve the rendering equation, like the split-sum approximation for image based lighting, or simplify the math to save instructions and reduce register pressure. However, there are many ways to achieve good approximations, depending on the platform (mobile or web applications, desktop applications on low or high-end hardware, VR) different constraints have to be taken into account.
"short-cut" => "shortcut"
Suggested change:

> The unbiased light simulation with physically realistic BRDFs will be the ground-truth for approximations in real-time renderers that are often biased, but still give visually pleasing results. Usually, these renderers take shortcuts to solve the rendering equation, like the split-sum approximation for image based lighting, or simplify the math to save instructions and reduce register pressure. However, there are many ways to achieve good approximations, depending on the platform (mobile or web applications, desktop applications on low or high-end hardware, VR) different constraints have to be taken into account.
Thanks a lot for the review and all the suggestions!
Thanks for all the reviews! So far I used the formulas from the Enterprise PBR Material in the non-normative implementation section. However, I understand that it is probably too complicated for a simple example, especially when considering real-time WebGL renderers. Therefore, I will change it back to the implementation that was described before (and, as far as I know, is used in the glTF sample viewer).
@proog128 When you get a chance, please resolve the merge conflicts here. Thanks!
Something that has always bugged me in Appendix B is that we define Geometric Occlusion (G) in terms of Vis. While we are making changes, should we rename that section to "Microfacet Shadowing (Vis)"?
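For context, the relationship in question is commonly written as

$$
\mathrm{Vis}(\mathbf{l}, \mathbf{v}) = \frac{G(\mathbf{l}, \mathbf{v})}{4\,|N \cdot L|\,|N \cdot V|}
$$

i.e. the visibility term folds the geometric shadowing/masking term G together with the microfacet denominator.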
I feel like there is a disconnect between the brdf pseudo code and the implementation. I wonder if the implementation section would be more cohesive if we renamed fdiffuse to diffuse_brdf, fspecular to specular_brdf, defined fresnel_mix, and so on. We could then reference these building blocks from other extensions, as we are doing in the clearcoat extension.
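As a rough illustration of the kind of building block meant here (a hypothetical sketch, not a definition taken from this PR or the spec), a `fresnel_mix` node could blend a base lobe with a specular layer using a Schlick Fresnel term derived from the index of refraction:

```python
def fresnel_mix(ior, base, layer, v_dot_h):
    """Hypothetical sketch of a 'fresnel_mix' building block.

    Blends the evaluated 'base' lobe with the evaluated 'layer' lobe using the
    Schlick approximation of the Fresnel term, with f0 derived from the ior.
    """
    f0 = ((1.0 - ior) / (1.0 + ior)) ** 2             # reflectance at normal incidence
    fr = f0 + (1.0 - f0) * (1.0 - abs(v_dot_h)) ** 5  # Schlick Fresnel
    return base + (layer - base) * fr                 # mix(base, layer, fr)

# Example: ior = 1.5 gives f0 = 0.04, so at normal incidence (v_dot_h = 1)
# the result is 96% base (e.g. diffuse) and 4% layer (e.g. specular).
print(fresnel_mix(1.5, 1.0, 0.0, 1.0))  # -> 0.96
```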
@Crisspl maybe you'd like to get involved?
I would completely support something along the lines of Nvidia MDL (which would be banal to express in json), except that the BRDF blocks that can be composed in a DAG are already predefined, much like Mitsuba's. Such a solution is feasible: my team has a material compiler from a DAG Intermediate Representation into a GPU state machine which optimizes out states that can never be transferred into; right now its only frontend is Mitsuba XML and its only backend is GLSL. However, we aim to support Nvidia MDL as a frontend, and CUDA, CUDA+OptiX (leveraging SBT) and GLSL closestHit shaders as extra backends. Furthermore, it would unify #1717, #1718 and #1719 into a single solution instead of fragmented extensions.
@devshgraphicsprogramming We value outside perspectives greatly, but I'd like to be careful not to derail this particular PR with ideas that run in the opposite direction. This group did carefully consider MDL, including a full presentation on the language from some of the key members of NVIDIA's MDL team. Some future version of glTF should ideally support either MDL or some other configurable shader type of arrangement. That said, this particular PR is in support of shoring up the core glTF PBR definition, with a short-term goal of enabling the next wave of PBR parameters to arrive via extensions. We've chosen to keep the extensions separate during their development, but with a careful eye on the end goal of all of the new "PBR Next" material properties working together in harmony, for example as part of a client's core material definition when this is done. MDL, or something like it, will probably re-appear on the roadmap at some future point. But in the short term, "PBR Next" will appear more similar to systems such as Dassault Systèmes Enterprise PBR and Autodesk Standard Surface.
I pushed an update with the following major changes:
I hope that this separation of normative and non-normative parts now makes clear which parts can be changed by an implementor/are allowed to be approximated and which parts have to stay as in the spec. In comparison to the original document (master branch), it's more complicated, because I had to introduce another layer of abstraction. I think there is still potential to simplify it, but I had to keep some complexity to avoid having to rewrite all the other extensions (because they refer to some terms in Appendix B).
```
function diffuse_brdf(color) {
  return (1/pi) * color
}
```
This looks like only the D term; I believe you still need the G term multiplied here. Otherwise your diffuse is independent of light direction, and I don't think you're getting any diffuse reflection from light behind the tangent plane.
The diffuse BRDF is albedo/PI; the `NdotL` factor is to account for the projected solid angle and is not part of the BRDF itself.

If, however, you are referring with G to the microfacet masking function (i.e. Smith), then that would also be incorrect, because G is only defined for a specific microfacet with a specific normal, which is only the case for consideration in specular BRDFs. It would also be incorrect because the Lambertian BRDF assumes a smooth surface of 0 roughness, so the geometrical masking function would always be 1.0 for directions in the hemisphere.

The only thing that makes sense here is a Heaviside function of `NdotL`, but that's usually omitted because when you render in real time by evaluating the BRDF at specific points, you take care not to evaluate for points outside the hemisphere (behind the tangent plane), and when you render with Monte Carlo or sampling methods, your sample generator should never produce such samples anyway.
Hmm, not sure I agree. I think the microfacet and shadowing model applies just as much to the diffuse as the specular, since shadowing stops grazing light from entering the surface at the place you expect. If you use a Heaviside (step) function instead of a smooth G, all that means is that a point light will produce a hard rather than soft shadow line for the colored reflection, which I don't think is a more accurate render. In any case, you said yourself that at least a Heaviside function is required here (which I would call an approximation of G), so either way I think it should be mentioned.
On second thought, perhaps `NdotL` is enough to handle this properly; I think the real thing I'm reacting to is further down where diffuse_brdf is used; no `NdotL` term shows up there, which I think is what got me started down this road. Maybe that's all that's actually missing.
`NdotL` is not part of the BRDF. The BRDF is defined as reflected differential radiance over differential incident irradiance. `NdotL` is part of the integral over all incoming light directions (upper hemisphere around the surface normal) which a renderer needs to solve to compute the outgoing radiance. I don't think it should be described here.
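For reference, that integral is the usual reflection equation (standard notation, not quoted from the PR):

$$
L_o(\mathbf{v}) = \int_{\Omega} f(\mathbf{l}, \mathbf{v}) \, L_i(\mathbf{l}) \, (\mathbf{n} \cdot \mathbf{l}) \, \mathrm{d}\mathbf{l}
$$

where $f$ is the material BRDF (e.g. `diffuse_brdf` above), $L_i$ is the incoming radiance, and $(\mathbf{n} \cdot \mathbf{l})$ is the `NdotL` factor that lives in the integral rather than in the BRDF.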
The Lambertian BRDF used here is the most simple BRDF possible, it's just a constant. A more realistic diffuse-like BRDF that fits into the microfacet framework is the Oren-Nayar BRDF. In this model, the facets are assumed to be Lambertian (as opposed to perfectly specular in the "typical" glossy microfacet models).
Ah, okay, I see my mistake now. So `material =` means the material's BRDF. I think it was because I saw some `LdotH` terms in there that I assumed we were closer to the final light output. I think it might be nice to show the BRDF integral somewhere just to make it clear how to get all the way from input light to viewer pixel color. I looked up at the material section that references this appendix, and it doesn't define BRDF at all either.
> NdotL is not part of the BRDF. The BRDF is defined as reflected differential radiance over differential incident irradiance.

What I've said, but more formally.

> The Lambertian BRDF used here is the most simple BRDF possible, it's just a constant. A more realistic diffuse-like BRDF that fits into the microfacet framework is the Oren-Nayar BRDF. In this model, the facets are assumed to be Lambertian (as opposed to perfectly specular in the "typical" glossy microfacet models).

Also, Oren-Nayar assumes the V-cavities model, not GGX's NDF. AFAIK, Oren-Nayar is not a closed-form solution but rather a numerical fit of a simulation of such a microfacet model; correct me if I'm wrong.
> Hmm, not sure I agree. I think the microfacet and shadowing model applies just as much to the diffuse as the specular, since shadowing stops grazing light from entering the surface at the place you expect. If you use a Heaviside (step) function instead of a smooth G, all that means is that a point light will produce a hard rather than soft shadow line for the colored reflection, which I don't think is a more accurate render. In any case, you said yourself that at least a Heaviside function is required here (which I would call an approximation of G), so either way I think it should be mentioned.
Like I said, if you assume microfacets to be anything other than perfectly smooth, your average microfacet reflection is not 1/PI anymore...
A perfectly smooth surface has no geometric shadowing, except by the tangent plane.
We can mention a Heaviside function, but it's not required in the reference implementation because of practicality:
- No one will call the BRDF evaluation function when the observer is below the tangent plane (you'd be seeing the back face of a triangle, `VdotN` is negative); actually, for smooth-shaded triangles, an `if (NdotV > 0.0)` check would make sense.
- No one will call the BRDF evaluation function when the surface is backfacing towards the light; people usually use a `max(NdotL, 0.0)` outside the BRDF or an `if (NdotL > 0.0)` before evaluating lighting.
Changing the microfacet distribution to V-cavities necessitates that you use Oren-Nayar, which already has the geometric shadowing baked into the fit, AFAIK.

Using the same GGX distribution as the specular introduces its own problems, mainly that such a diffuse BRDF has no closed-form solution and needs to be stochastically simulated with infinite random walks as in Heitz's papers, and it seems there is no standardised numerical fit.
Also, you need to decide whether the specular and diffuse distributions are in a coating or a fused substrate, basically:
1. Is there a layer of coating (varnish) that has a different microsurface on top and on bottom (uncorrelated microfacets from two layers)?
2. Are the individual microfacets all the same, exhibiting both specular and diffuse behaviour (correlated microfacets from pretty much a single layer)?

For practicality, all the models I've seen, even in offline rendering, use option (1), where the two BRDFs are treated as separate physical layers.
specification/2.0/README.md
### Metal BRDF and Dielectric BRDF
Applying the functions we arrive at the metal BRDF and dielectric BRDF:
My understanding is that you are trying to build a bridge here between the old Appendix B content and this new formalization. I think we are missing some details in this translation of the new representation of the brdf from the old representation. For example, it's not clear here that 0.04 is the f0 when ior is 1.5. Do we need to elaborate a bit more on why this section is here, and expand a bit more on how we arrive at the metal_brdf and dielectric_brdf?
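(For reference, the relationship behind that 0.04: the Fresnel reflectance at normal incidence for a dielectric with relative index of refraction $ior$ is

$$
f_0 = \left(\frac{ior - 1}{ior + 1}\right)^2 = \left(\frac{1.5 - 1}{1.5 + 1}\right)^2 = 0.04
$$

so 0.04 is exactly the $f_0$ of an $ior$ of 1.5.)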
My idea was to first describe the material as a normative node graph (Section "Complete Model"), then give a non-normative (sample) implementation for the nodes (Sections "Specular BRDF", "Diffuse BRDF", "Fresnel"), and finally substitute all nodes in the graph for their implementations. This is how we arrive at the old (shader code) representation. I added some notes to the beginning of the "Metal BRDF and Dielectric BRDF" section to make the transition more smooth (hopefully). Does that help?
This has been a long time in the making, and it's now in a state where I think merging it is a big improvement over what's been here. I know some people want further tweaks or fine-tuning. Let's open new PRs or new issues for any such remaining concerns.
This is a draft that extends Appendix B (BRDF Implementation) of the glTF 2.0 specification. It describes the metallic-roughness material in an abstract way, derives a physically realistic BRDF (positive, reciprocal, energy conserving) for offline ray tracing, and gives some hints on how to approximate this in real-time renderers. The old equations are still valid after this change.

It's a work in progress; there are still a lot of TODOs. I am sharing this to get feedback on the approach. Maybe this could be a template for PBR Next (#1442) including extensions (#1677, #1688, #1698) that allows consistent implementations of the material in different rendering engines, from mobile, real-time VR or web to offline, unbiased GI.