Remove sRGB inline decoding. #23109
Maybe @richgel999 and/or @zeux have recommendations for what to do here?
Not sure I fully follow you (as I'm not super familiar with three.js internals). It sounds like you're currently sampling sRGB textures using linear filtering (i.e. not telling the GPU's texture filtering unit that the texture is actually sRGB), then converting from sRGB->linear in the shader. (Right?) It should be simple to just change the texture formats you're feeding WebGL to use the sRGB variants, and let the GPU do the filtering itself, with no inline sRGB decode. It shouldn't be necessary to decode the .basis/.ktx2 texture data on the CPU - the texture data should remain unchanged.
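To illustrate why the decode order matters: with the inline approach the GPU filters raw sRGB values and the shader decodes the already-filtered result, whereas a hardware sRGB format decodes each texel before filtering. A minimal sketch of the transfer function (IEC 61966-2-1) and the two orderings, as plain JavaScript rather than actual three.js shader code:

```javascript
// Reference sRGB -> linear transfer function, i.e. what an inline shader
// decode computes per sampled value.
function sRGBToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Inline decode: the hardware first averages raw sRGB texels (here a black
// and a white texel), then the shader decodes the filtered result.
const decodeAfterFilter = sRGBToLinear((0.0 + 1.0) / 2); // ≈ 0.214

// Hardware sRGB format (e.g. SRGB8_ALPHA8): each texel is decoded to linear
// before filtering, which is the correct order.
const filterAfterDecode = (sRGBToLinear(0.0) + sRGBToLinear(1.0)) / 2; // 0.5
```

The two results differ noticeably, which is exactly the filtering error the inline decode introduces.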
Yes, that is right.
That is what I had in mind, too. However, this means code like in the above example needs to be changed. And looking at the current constants, that would be all
For Basis and KTX2 to work properly, we need the ability to transcode to the formats listed here. The rough prioritization of those formats is given in a series of charts from the KTX2 guidelines. Can WebGL not do sRGB conversion for compressed formats other than ASTC? 😕
Yes, it can. However,
At least internally within the renderer I think we should use those, yes – thanks! In terms of the user-facing API, I do have a concern. See #23116 (comment).
Hmm...
Are video textures supposed to be in sRGB? Maybe we can just state to always use
Well, if someone is doing a video player (think YouTube in VR), it would be a hard task to recompress their whole library... 🤔
It seems when changing the format of a video texture to However, since we are not using
Live example: https://jsfiddle.net/uzgo9fh4/3/ |
Could we add inline sRGB decoding only for
I've added a solution here: 57ac477. Should be easy to revert when the Chromium bug is solved. Just for the record:
BTW: Is #21874 (comment) still true? In what way does Chrome lack optimized support for video textures?
Usage is basically possible if we know the dimensions of the texture in advance. Right now, when a video texture is marked for an update in
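The pattern described above (dimensions known in advance, immutable storage allocated once, then per-frame updates) could be sketched as follows. The stub `gl` object and the two helper function names are illustrative, not three.js internals; the stub only records calls so the sketch runs outside a browser, where `gl` would be a real `WebGL2RenderingContext` and `source` an `HTMLVideoElement`:

```javascript
// Minimal call-recording stub standing in for a WebGL2RenderingContext.
const calls = [];
const gl = {
  TEXTURE_2D: 0x0de1, SRGB8_ALPHA8: 0x8c43, RGBA: 0x1908, UNSIGNED_BYTE: 0x1401,
  texStorage2D: (...args) => calls.push(['texStorage2D', ...args]),
  texSubImage2D: (...args) => calls.push(['texSubImage2D', ...args]),
};

// One-time immutable allocation; width/height must be known up front.
function allocateVideoTexture(gl, width, height) {
  gl.texStorage2D(gl.TEXTURE_2D, 1, gl.SRGB8_ALPHA8, width, height);
}

// Per-frame update; with immutable storage, re-specifying the texture with
// texImage2D is neither allowed nor needed.
function updateVideoTexture(gl, source) {
  gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0, gl.RGBA, gl.UNSIGNED_BYTE, source);
}

allocateVideoTexture(gl, 1920, 1080);
updateVideoTexture(gl, 'frame-0');
updateVideoTexture(gl, 'frame-1');
```

The design point is that `texStorage2D` fixes the size and internal format once, so every later frame is a cheap `texSubImage2D` upload.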
Excellent!
I'll check.
I'm currently working on a PR to remove the inline sRGB decode from the shader. We want to do this to avoid incorrect texture filtering, which was discussed in earlier issues. The removal also simplifies the code base in various places (especially the WebGL 2 code paths).
The idea is to use `SRGB8_ALPHA8` with WebGL 2 and a fallback for WebGL 1. In order to use `SRGB8_ALPHA8`, textures have to be in RGBA format and must use unsigned byte, which is true for most color textures. However, there are a couple of issues with that approach and I want to discuss them step by step. The first problem is processing compressed textures holding sRGB encoded values like in:

three.js/examples/webgl_loader_texture_basis.html
Lines 53 to 58 in d10d1be
How should we handle this use case? Decoding compressed texel values on the CPU is by definition not an option. If we do a separate render pass and decode the texture into a render target, we would lose the compression (unless we compress it again, which is not a good idea either).
As far as I understand, we can only support compressed textures with sRGB values if formats like `SRGB8_ALPHA8_ASTC_4x4_Format` are used. Or we have to keep the inline decode. Are there any other options?
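One way to frame the "sRGB variant or inline decode" choice is a lookup from each linear compressed format to its sRGB counterpart, as a transcoder target table might use. The GL enum values below come from `EXT_texture_compression_s3tc_srgb`, core WebGL 2 (ETC2/EAC), and `WEBGL_compressed_texture_astc`; the table shape and the function name are assumptions for illustration, not three.js API:

```javascript
// Linear compressed formats and their sRGB counterparts (GL enum values).
const COMPRESSED_RGBA_S3TC_DXT5_EXT = 0x83f3;
const COMPRESSED_SRGB_ALPHA_S3TC_DXT5_EXT = 0x8c4f;
const COMPRESSED_RGBA8_ETC2_EAC = 0x9278;
const COMPRESSED_SRGB8_ALPHA8_ETC2_EAC = 0x9279;
const COMPRESSED_RGBA_ASTC_4x4_KHR = 0x93b0;
const COMPRESSED_SRGB8_ALPHA8_ASTC_4x4_KHR = 0x93d0;

const linearToSRGBFormat = new Map([
  [COMPRESSED_RGBA_S3TC_DXT5_EXT, COMPRESSED_SRGB_ALPHA_S3TC_DXT5_EXT],
  [COMPRESSED_RGBA8_ETC2_EAC, COMPRESSED_SRGB8_ALPHA8_ETC2_EAC],
  [COMPRESSED_RGBA_ASTC_4x4_KHR, COMPRESSED_SRGB8_ALPHA8_ASTC_4x4_KHR],
]);

// When a compressed texture holds sRGB data, pick the sRGB internal format
// so the GPU decodes before filtering; otherwise keep the linear format.
// If no sRGB variant exists, the only alternative is the inline decode.
function internalFormatFor(format, isSRGB) {
  if (!isSRGB) return format;
  const srgb = linearToSRGBFormat.get(format);
  if (srgb === undefined) {
    throw new Error('No sRGB variant available; inline decode would be required');
  }
  return srgb;
}
```

This makes the limitation concrete: the approach only works where the platform actually exposes the sRGB variant of the transcode target.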