
Transparent objects brighten when using GammaCorrectionShader #23019

Closed
gloriousjob opened this issue Dec 14, 2021 · 26 comments

Comments

@gloriousjob

gloriousjob commented Dec 14, 2021

Describe the bug

When using sRGB textures (such as in 3DS models), the GammaCorrectionShader seems to brighten the texture more than when the EffectComposer is not used. The GammaCorrectionShader is needed when using the EffectComposer to produce similar colors to when the EffectComposer is not used. I have not been able to upgrade past r111 because of this issue.

As a side note, a similar problem was seen with generic shapes with color, but it appears that setting the clearColor on the RenderPass can help with that issue but not with textured objects.

To Reproduce

  1. Create a shape with a texture and an opacity of 0.4, and render it with an EffectComposer that includes a GammaCorrectionShader pass.
  2. Create a shape with an opacity of 0.4 just using a normal render (no EffectComposer).
  3. Note that the colors are brighter on the first shape.
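The steps above can be condensed into a sketch along these lines (assuming an r134-era setup; `renderer`, `scene`, `camera`, and `texture` are presumed to exist elsewhere, with texture.encoding set to THREE.sRGBEncoding):

```javascript
import * as THREE from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';
import { GammaCorrectionShader } from 'three/examples/jsm/shaders/GammaCorrectionShader.js';

// A transparent, textured shape; opacity 0.4 makes the brightening obvious.
const material = new THREE.MeshBasicMaterial({
  map: texture,
  transparent: true,
  opacity: 0.4,
});
scene.add(new THREE.Mesh(new THREE.PlaneGeometry(1, 1), material));

// Shape 2: plain render. The sRGB encode happens inline in the material
// shader, so blending against the drawing buffer happens in sRGB space.
renderer.render(scene, camera);

// Shape 1: the composer path. Blending happens in linear space inside the
// render target and the encode is applied afterwards by the gamma pass,
// so transparent surfaces come out brighter than in the plain render.
const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));
composer.addPass(new ShaderPass(GammaCorrectionShader));
composer.render();
```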

Live example

The following fiddle demonstrates the issue (top is no postprocessing, bottom is using postprocessing):
https://jsfiddle.net/s5tpxq1g/1/

You can see how the bottom is brighter for the texture and a lower opacity makes the issue more apparent.

Expected behavior

Colors match when using GammaCorrectionShader in an EffectComposer compared to not using EffectComposer.

Screenshots
Snapshot from the fiddle (top half does not use EffectComposer, bottom half does). The left object shows that a plain colored object (with clearColor set correctly) is unaffected, but the right object, which is textured, differs in brightness:
[screenshot]

Platform:

  • Device: Desktop
  • OS: Windows 10, 21H2
  • Browser: Chrome/Firefox/Edge
  • Three.js version: r134 but was noted in r112, when gammaInput and gammaOutput properties were removed and GammaCorrectionShader was recommended for sRGB scenarios.
@WestLangley
Collaborator

WestLangley commented Dec 16, 2021

The proper workflow is to render the scene in linear-sRGB color space and convert to sRGB color space as a final post-processing step.

However, currently, three.js converts to sRGB color space in-line in the shader, prior to blending with the drawing buffer.

This is fine if the material is opaque. But if the material is transparent, it is not correct.

I would expect three.js to integrate a proper post-processing workflow into the engine at some point.

/ping @bhouston

@bhouston
Contributor

We can also add a post processing step to convert a linear HDR RGB buffer to a LDR gamma corrected buffer. You need at least an FP16 render target for this to work well. It shouldn't really be hard -- I think all the code is in Three.js. This is actually how Clara.io worked back in 2014:

https://clara.io/view/9e2c46ce-6ada-450c-8acd-c44b84de2a38

I guess when you say we need to integrate a proper post-processing workflow, @WestLangley, you are meaning out of the box. That makes sense now.

I actually switched away from the proper HDR FP16/FP32 render buffer approach in 2016 and contributed that change back to Three.js in #8250, because many mobile devices simply didn't have enough memory to handle it properly, especially Apple iPads of the era. The approach we switched to is what Three.js currently uses as the default workflow; it was actually awesome and fast. It does have a few limitations, though, as pointed out in this bug report.

Thus we could go back to this more traditional approach now that mobile hardware has caught up.
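The HDR-buffer approach described above can be tried by handing EffectComposer an explicit half-float render target (a sketch; the size and parameters are illustrative, and `renderer` is assumed to exist):

```javascript
import * as THREE from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';

// An FP16 target keeps linear-light values with enough precision for HDR
// without the memory cost of FP32 (the concern on older mobile GPUs).
const hdrTarget = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight, {
  type: THREE.HalfFloatType,
  format: THREE.RGBAFormat,
});

// Passes then blend in linear space; the final pass does the LDR encode.
const composer = new EffectComposer(renderer, hdrTarget);
```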

@bhouston
Contributor

bhouston commented Dec 16, 2021

And interestingly, @WestLangley and I had this exact same discussion back then: #8250 (comment)

@Mugen87
Collaborator

Mugen87 commented Dec 17, 2021

So to clarify, the upper part of the fiddle is actually too dark whereas the lower part (using post-processing) is the correct image.

@bhouston
Contributor

@Mugen87 did you delete the comment where you mentioned the framebuffer/texture format "SRGB8_ALPHA8_EXT"? Could we use this as the render target format? From a quick reading, assuming the browser does the blending correctly between two separate sRGB8_ALPHA8_EXT buffers, it would solve the issue no?

@Mugen87
Collaborator

Mugen87 commented Dec 17, 2021

Well, I was not sure about the details of my comment so I've deleted it^^.

I think the goal should be to avoid all decodes/encodes and tone mapping in the material shaders. Avoiding the decodes should be easier to implement via changes like #23039.

As mentioned in this thread, avoiding the output encoding and tone mapping in the materials means the renderer has to implement some sort of post-processing. I'm not yet sure what this would look like, though (implementation-wise).

@looeee
Collaborator

looeee commented Dec 19, 2021

Note this has been brought up a few times over on the forum as well, so the current behavior does confuse people.

The difference is also stronger the more transparent an object is - so there's not much difference with opacity = 0.9 but with opacity = 0.1 it's quite obvious.

@gloriousjob
Author

Not sure if I missed something, but I tried updating the fiddle to use UnsignedByte, which should have enabled SRGB8_ALPHA8, but it seems to make no difference:
https://jsfiddle.net/2qjth7un/

So to clarify, the upper part of the fiddle is actually too dark whereas the lower part (using post-processing) is the correct image.

When I've used things like a transparent vase, the post-processing makes it look like there's a light brightening the vase and makes seeing through it difficult, so I think I'd actually prefer the upper picture, although I can see how the example I gave might look better.

@Mugen87
Collaborator

Mugen87 commented Jan 12, 2022

From a quick reading, assuming the browser does the blending correctly between two separate sRGB8_ALPHA8_EXT buffers, it would solve the issue no?

Yes. And according to KhronosGroup/WebGL#2474 (comment), blending will indeed work correctly when using sRGB8_ALPHA8.

However, I want to make clear that WebGL does not allow defining the color space of the default framebuffer. That means we have to perform the final linear-to-sRGB encode ourselves in a custom pass.

I've hacked around and verified that the components of the framebuffer's color attachment are actually gl.SRGB when a render target has the RGBA8 format + sRGBEncoding. You can see this by checking the console output and the respective code in WebGLTextures in this demo:

https://raw.githack.com/Mugen87/three.js/dev19/examples/webgl_geometry_cube.html

As you can see, the scene is too dark since I have removed the encodings_fragment chunk from MeshBasicMaterial. Meaning no sRGB encode happens in the material shader anymore.

@Mugen87
Collaborator

Mugen87 commented Jan 12, 2022

@Kangz Is it possible with WebGPU to configure the canvas context with a sRGB format? We are currently using bgra8unorm but when trying bgra8unorm-srgb nothing is rendered and the following warning appears in the browser console:

unsupported swap chain format
context configuration is invalid.

Tested with Chrome Canary 99.0.4819.0 on macOS.

According to the spec bgra8unorm-srgb should be supported. Does it mean this part is not yet implemented?

The idea is to write linear color values into the buffer but the format automatically performs a linear to sRGB encoding. Like mentioned in #23019 (comment), it's not possible to configure the drawing buffer's color space in WebGL.

@Kangz

Kangz commented Jan 12, 2022

CCing @kainino0x. The group wants to have a way to configure the webgpu canvas with sRGB, but it's not clear whether it will be by having bgra8unorm in the list of formats guaranteed to be supported (as is in the spec right now) or by allowing the canvas to be configured with "format reinterpretation" so you can render to a bgra8unorm-srgb view of the bgra8unorm texture. See gpuweb/gpuweb#1231

@kainino0x

I am pretty sure that we will require applications to configure the canvas with, for example, bgra8unorm and then reinterpret to bgra8unorm-srgb:

Don't allow -srgb formats as canvas texture formats. Instead, the texture view needs to reinterpret the format as the -srgb equivalent. gpuweb/gpuweb#2336
-- gpuweb/gpuweb#1231 (comment)

Unfortunately 2336 is still blocked in standardization but I'm certain we'll have a way to do this reinterpretation, so canvas can rely on it. (I don't want to add bgra8unorm-srgb as a valid canvas format because (1) it's redundant and (2) it is somewhat muddy whether the browser will sample it as bgra8unorm or bgra8unorm-srgb for compositing.)

In the meantime, you should be able to create an intermediate bgra8unorm-srgb texture and copyTextureToTexture from it to the canvas (spec: gpuweb/gpuweb#2329, Chromium: https://dawn-review.googlesource.com/c/dawn/+/74301).
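For reference, the reinterpretation described above ended up looking roughly like this once it was standardized (a sketch against the later WebGPU API, not what Chrome Canary 99 supported; `canvas` and `device` are assumed to exist):

```javascript
const context = canvas.getContext('webgpu');
const format = navigator.gpu.getPreferredCanvasFormat(); // e.g. 'bgra8unorm'

// Configure the canvas with the base (non-sRGB) format, but allow views
// that reinterpret it as the -srgb equivalent.
context.configure({
  device,
  format,
  viewFormats: [format + '-srgb'],
});

// Render into an sRGB view: shaders write linear values, the format does
// the linear-to-sRGB encode on store, and blending happens in linear space.
const view = context.getCurrentTexture().createView({ format: format + '-srgb' });
```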

@Mugen87
Collaborator

Mugen87 commented Jan 23, 2022

I've updated WebGLRenderer to perform tone mapping and output encoding as a separate post processing pass (Mugen87@88549cf).

Unfortunately, a separate render pass noticeably degrades performance. Examples that used to run at 60 FPS on my iMac now run at approx. 40 FPS. Here are live links for comparison:

https://raw.githack.com/Mugen87/three.js/dev64/examples/index.html
https://threejs.org/examples/

If there is no way to improve the performance, it seems we have to keep using the inline approach for tone mapping and sRGB encoding for the drawing framebuffer, at least in the context of WebGL.

@bhouston
Contributor

The slowdown is to be expected. I believe you now have an FP16 or FP32 linear render target, right? FP16 may lead to some speed improvement over FP32; you could try that. Just using that much memory could slow things down a lot, even without considering the separate pass.

And then a separate pass that reads this and writes to a sRGB output is another slow down.

I would suggest testing what the frame rate would be if you wrote to the main linear render target as RGBA8 and kept the separate pass exactly as it is. I'm trying to separate the cost of FP16/FP32 on render time from the cost of the secondary-pass architecture itself.

@bhouston
Contributor

I think it is reasonable to offer this separate pass mode as an option that people can enable. Most people do not need it, so most people can live with the high-speed all-in-one mode, and only those who need it can enable this higher-quality mode after the fact.

@Mugen87
Collaborator

Mugen87 commented Jan 24, 2022

I believe you now have an FP16 or FP32 linear render target, right?

Yes, it's FP16. Testing with webgl_animation_skinning_blending and different types results in:

  • RGBA8 -> 47 FPS
  • FP16 -> 45 FPS
  • FP32 -> 39 FPS

Tested on an iMac with Chrome.

@bhouston
Contributor

So adding in the extra pass reduces it from 60 FPS to 47 FPS. That is pretty significant, more than I thought it would be.

I assume this isn't a new iMac, but rather an older one right? How old? What specific GPU/CPU?

@Mugen87
Collaborator

Mugen87 commented Jan 25, 2022

I assume this isn't a new iMac, but rather an older one right? How old? What specific GPU/CPU?

Yes, it is almost eight years old. Details:

  • iMac (Retina 5K, 27'', late 2014)
  • 4 GHz Quad-Core Intel Core i7
  • AMD Radeon R9 M295X 4 GB

@iuriiiurevich
Contributor

iuriiiurevich commented Apr 13, 2022

Here are some thoughts and a possible solution for this issue.

UPDATE: This is not a solution, mainly because it is based on the idea of blending sRGB components, while the proper 3D workflow is to blend linear-sRGB components.
However, it can be useful if someone wants to blend sRGB for some reason.
The good solution (the proper workflow at the cost of performance) is described here.

So the problem is that when rendering to the WebGLRenderTarget (in this case, using the EffectComposer), transparent objects mutate (opacity decreases, color becomes brighter).

As far as I understand, the problem is related to blending and is described as follows:
When the renderer renders the image directly onto the screen (in sRGB), everything is fine, because the mechanism is (roughly speaking):

  1. Take all linear colors (decode those that are not linear)
  2. Run these colors through shaders
  3. Convert resulting colors to sRGB encoding
  4. Blend them (considering the opacity)

Well done!

But when the renderer renders the image onto the WebGLRenderTarget, the mechanism is slightly different: step number 3 is missing, so that everything is blended in linear encoding and converted to sRGB later.

This workflow gives the correct result if all materials are opaque (no blending occurs between objects), but if there are some transparent objects, then obviously this will not give the same (correct) result.
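The difference between the two mechanisms above can be reproduced numerically without three.js at all (a sketch; the 0.5 grey, black background, and 0.4 opacity are illustrative values):

```javascript
// sRGB transfer functions, the same math the GammaCorrectionShader applies.
function sRGBToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}
function linearToSRGB(c) {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}

const src = 0.5;   // fragment colour, sRGB-encoded
const dst = 0.0;   // black background
const alpha = 0.4; // material opacity

// Direct render: encode to sRGB first, then blend the encoded values.
const blendedInSRGB = alpha * src + (1 - alpha) * dst; // 0.2

// Render-target path: blend linear values, encode at the very end.
const blendedInLinear = linearToSRGB(
  alpha * sRGBToLinear(src) + (1 - alpha) * sRGBToLinear(dst)
); // ~0.324, visibly brighter

console.log(blendedInSRGB, blendedInLinear);
```

The gap between the two results grows as opacity shrinks, which matches the observation in this thread that the issue is most obvious at low opacity.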
So the problem is not actually related to the GammaCorrectionShader, because it works at the output of the EffectComposer, and the data is already incorrect at the input, immediately after the RenderPass.

The SRGB8_ALPHA8 render target doesn't solve this issue because, as far as I understand, SRGB8_ALPHA8 is a storage format, so the blending of material shader results still occurs in linear color space before the data gets into this render target.

Is it right?

Anyway, there is one way to solve this problem: https://codesandbox.io/s/issue-23019-0wjd8n?file=/src/index.js
(had to slightly complicate the original fiddle in order to demonstrate more complex cases)

Use import { init } from "./wrong" to see the issue, and import { init } from "./right" to see the solution.

So the current solution is to blend objects in sRGB color space and write the result to the EffectComposer's input buffer, and then, in its very first pass, convert the data to linear color space to work with it.
In order to blend objects in sRGB color space while writing to the render target, I had to hack into the Three code and allow it like so: #23870 (changed two lines of code).

So it takes an extra pass for the decoding, but this is hardly critical when using an EffectComposer with many passes.

@Mugen87 how bad is this solution?)
How could this problem be better solved?

Thanks!

@donmccurdy
Collaborator

@iuriiiurevich I believe you've understood why EffectComposer works differently, but may have missed @WestLangley's comment above – the behavior with EffectComposer is correct, it's the default behavior in three.js that is (for most purposes) wrong.

I'm not sure there's anything we can change here... the 'inline' approach is dramatically cheaper, and it would be hard to justify removing it entirely. EffectComposer behaves correctly, and is available when the cost is acceptable. I would also point out that pmndrs/postprocessing has done a good thing in combining postprocessing passes: color space transforms, tone mapping, and other effects can be done in a single additional pass rather than two, which would be cheaper than implementing it with three.js' own EffectComposer today. I see @Mugen87 already combined two passes in his demo above.

tl;dr I would vote to close this. Per #23614 (comment) there might be value in an 'unlit' color management workflow that doesn't go through Linear-sRGB, which would mean alpha blending happens in sRGB as a side effect. Beyond that (and I'm not even sure about that) there doesn't seem to be anything we can change.

@iuriiiurevich
Contributor

@donmccurdy
Thanks for the answer!
Summing up, what solutions can we offer to people who will face this issue?

As far as I understand, when a user is faced with the fact that the colors in Three.js (when using a Render Target) and, for example, in Photoshop are blended differently, there are two main options:

  1. Leave everything as it is, accept the difference in favor of performance.
  2. Hack into Three.js' code using either @Mugen87's approach or this approach in favor of correct blending, but at the cost of performance.

Is it correct?
If so, shouldn't there be an easier way to achieve the correct color blending?

@donmccurdy
Collaborator

donmccurdy commented Jul 5, 2022

To clarify one more thing, what I'm referring to as "correct" here is the behavior of blending Linear-sRGB components, instead of blending sRGB components. Photoshop has an option to enable/disable that, "Blend RGB Colors Using Gamma" — the preferred behavior may be different when working with 2D graphics, photography, 3D rendering, etc.

In a WebGL 3D renderer our goal is to blend colors in Linear-sRGB, and not sRGB [1]. Doing that requires post-processing, which can be expensive. Two cases:

  1. With post-processing (using EffectComposer and GammaCorrectionShader), color is blended in Linear-sRGB, as it should be for a lit 3D rendering workflow.
  2. Without post-processing, we simply can't do that, and color is blended in sRGB instead.

We cannot change (2), and we can't force all users to choose post-processing, but either way no hacks are required. Users can choose EffectComposer+GammaCorrectionShader if the performance cost is acceptable. Some applications will need post-processing for other reasons.


[1] To be fair, some developers do use three.js for working with 2D graphics and unlit workflows, and they might prefer blending in sRGB. That's fine, and I wish we could offer a simple toggle for this, but technically our hands are tied.

@iuriiiurevich
Contributor

may have missed @WestLangley's comment above

Yes, looks like I really missed it and got a bit confused)
So, the correct 3D workflow is to blend linear-sRGB.
I've added an update to my kind of solution so it doesn't confuse anyone.
@donmccurdy, thanks again for the clarification.

So, if the current behaviour does confuse people, maybe this should be explained somehow in the Color management documentation?

@donmccurdy
Collaborator

donmccurdy commented Jul 6, 2022

Yes, it's unfortunate that the Color Management documentation does not discuss post-processing at all currently. Personally I'm using https://github.com/pmndrs/postprocessing; it handles color management better, IMHO (and handles blending the same way). So I've been delaying documenting the built-in post-processing color management.

@Mugen87
Collaborator

Mugen87 commented May 30, 2023

Minor update: This is how the OP's fiddle should look with the upcoming r153: https://jsfiddle.net/pbofL1u9/

sRGB output and color management have been enabled by default since r152. The new OutputPass, which combines tone mapping and color space conversion, was added in r153. Also since r153, EffectComposer uses half-float render targets by default.

Next step: #26129
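With r153, the GammaCorrectionShader pass from the original fiddle is replaced by the combined OutputPass (a sketch; the import paths assume the three/addons mapping, and renderer, scene, and camera are presumed to exist):

```javascript
import { EffectComposer } from 'three/addons/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/addons/postprocessing/RenderPass.js';
import { OutputPass } from 'three/addons/postprocessing/OutputPass.js';

// EffectComposer allocates a half-float render target by default since r153,
// so passes blend in linear space with enough precision.
const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));

// OutputPass applies tone mapping and the linear-to-sRGB conversion in a
// single final pass, matching the renderer's direct output.
composer.addPass(new OutputPass());
```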

@Mugen87
Collaborator

Mugen87 commented Jun 28, 2023

As pointed out in #23019 (comment) and #23019 (comment), this issue can be closed.

We might want to note the usage of the new OutputPass in the post processing or color management guide though so it's more visible for the user.
