
Overlaying scenes with postprocessing in r167 is broken #29087

Closed
Spiri0 opened this issue Aug 8, 2024 · 14 comments

Spiri0 (Contributor) commented Aug 8, 2024

Description

I create two scenes and two cameras: a perspective camera for the first scene and an orthographic camera for the second. The second scene is rendered with the orthographic camera on top of the first scene. This way, UI elements can easily be used in an app. Up to r166 this worked as shown in the codePen example.

footnote:
I try to create issues here as rarely as possible because I have an idea of how much work all these issues mean. I often spend days trying to figure out what could be wrong when something doesn't work.
I'm excited about the three.js project. I have completely switched to WebGPU and only work with the WebGPU side of three.js. The elegance with which it all works shows me how much energy and passion all the developers put into it.

Reproduction steps

I think the code examples, Mugen87's on jsfiddle and mine on codePen, are small enough, so I just refer to the links below.

Code

Try to use postProcessing.render(); instead of renderer.render(scene, camera); in the codePen example.
It shouldn't make any difference on the screen, but then you only see the uiscene and no longer both scenes.

render function from the codePen example:

function render() {
  requestAnimationFrame( render );

  renderer.autoClearColor = true;

  renderer.render( scene, camera );
  //postProcessing.render(); // using this call instead only shows the uiscene in r167

  renderer.autoClearColor = false; // keep the first scene when rendering the UI on top

  renderer.render( uiscene, uicamera );
}
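For context, this is roughly the setup the render loop assumes (a sketch only; names as in the codePen, camera parameters are placeholders):

const renderer = new THREE.WebGPURenderer( { antialias: true } );

const scene = new THREE.Scene(); // main scene
const camera = new THREE.PerspectiveCamera( 50, window.innerWidth / window.innerHeight, 0.1, 100 );

const uiscene = new THREE.Scene(); // overlay scene with the UI elements
const uicamera = new THREE.OrthographicCamera( - 1, 1, 1, - 1, 0, 1 ); // orthographic camera for the UI

const postProcessing = new THREE.PostProcessing( renderer );
postProcessing.outputNode = pass( scene, camera ); // used when postProcessing.render() replaces renderer.render()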

Live example

From @Mugen87
https://jsfiddle.net/21gfet07/1/

From me
https://codepen.io/Spiri0/pen/bGPrdzE

Version

167

Mugen87 (Collaborator) commented Aug 8, 2024

tl;dr: It seems clear operations after postProcessing.render(); do not work correctly.

In the fiddle, I just call clearDepth() which results in a black screen. Removing the command makes the scene render as expected.
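The problematic sequence is roughly (a sketch, not the literal fiddle code):

postProcessing.render();
renderer.clearDepth(); // in r167 this results in a black screen; removing this call makes the scene render as expected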

Mugen87 (Collaborator) commented Aug 8, 2024

postProcessing.render(); renders directly to the screen and not into the intermediate frameBufferTarget which is maintained in Renderer for tone mapping and color space conversion. This was changed in r167 via #28781.

So if you call clearDepth() after postProcessing.render(); and the output color space is SRGBColorSpace, it clears the depth of an empty frameBufferTarget and copies the empty framebuffer to the screen.

For the same reason, subsequent renderer.render() calls overwrite whatever post processing has produced. The bit that controls the clear is shown below:

if ( renderTarget !== null && this._renderTarget === null ) {

	// If a color space transform or tone mapping is required,
	// the clear operation clears the intermediate renderTarget texture, but does not update the screen canvas.

	const quad = this._quad;

	if ( this._nodes.hasOutputChange( renderTarget.texture ) ) {

		quad.material.fragmentNode = this._nodes.getOutputNode( renderTarget.texture );
		quad.material.needsUpdate = true;

	}

	this._renderScene( quad, quad.camera, false );

}

You can verify the root cause by forcing LinearSRGBColorSpace as the output color space and handling tone mapping in the scene: https://jsfiddle.net/86zs1p42/
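A minimal sketch of that configuration (not the exact fiddle code): with no output transform left for the renderer to apply, it skips the intermediate frameBufferTarget and writes directly to the canvas.

renderer.outputColorSpace = THREE.LinearSRGBColorSpace; // no sRGB conversion at the end of the pipeline
renderer.toneMapping = THREE.NoToneMapping; // tone mapping is handled in the scene instead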

Mugen87 (Collaborator) commented Aug 8, 2024

It will require some refactoring to make this setup work, so can you use the following approach in the meantime?

https://jsfiddle.net/raojn721/

The idea is to render your UI as a second scene pass and then mix both passes based on the alpha value.

postProcessing = new THREE.PostProcessing(renderer);
const scenePass = pass(scene, camera);
const scenePassUI = pass(uiscene, uicamera);

postProcessing.outputNode = mix( scenePass, scenePassUI, scenePassUI.a );
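With this setup the render loop only needs a single call and no manual autoClear handling anymore (a sketch):

function render() {

	requestAnimationFrame( render );

	// both scene passes and the alpha-based mix are resolved inside the post-processing graph
	postProcessing.render();

}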

Mugen87 (Collaborator) commented Aug 8, 2024

This issue is indeed tricky.

Basically, the renderer assumes frameBufferTarget always holds the complete data of the scene and that tone mapping and color space conversion happen at the very end of the rendering pipeline. As discussed before, this premise is incorrect since, depending on how you render the scene, you want to tone map and convert color spaces at a different point in time. That's why we introduced #28781.

So the current combination in the OP's code is invalid. When PostProcessing is used (or any custom workflow that writes data directly to the default framebuffer), subsequent calls to renderer.render() do not work correctly because the internal frameBufferTarget holds empty data: PostProcessing has already rendered to the screen and output the colors in the output color space.
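In code, the unsupported combination from this issue looks roughly like this:

// not supported in r167: postProcessing.render() writes its result (already tone mapped and
// converted to the output color space) directly to the canvas, while the subsequent
// renderer.render() goes through the now empty frameBufferTarget and overwrites the canvas
postProcessing.render();
renderer.render( uiscene, uicamera );

// the supported route is to express the overlay inside the post-processing graph,
// as in the mix() setup from the previous comment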

Spiri0 (Contributor, Author) commented Aug 8, 2024

I'm not under any pressure, and I don't want to put pressure on anyone else either. The issue seems to be well understood and will therefore be solved over time. Nevertheless, thank you for your effort with the alternative approach, @Mugen87.

Mugen87 removed the Regression label Aug 8, 2024
Mugen87 (Collaborator) commented Aug 8, 2024

I'm not marking this as a regression since the state in r166 was incorrect in the first place. The workflows are more correct in r167; however, certain rendering combinations are not supported yet.

Spiri0 (Contributor, Author) commented Aug 8, 2024

Then the fact that it worked until r166 was a happy coincidence?

Mugen87 (Collaborator) commented Aug 8, 2024

It was intended to support this use case, but the assumptions that were made were too simplified. Besides, we don't have a single example that uses the code setup from this issue. So I would say we had not thought about the use case where somebody wants to call renderer.render() after PostProcessing.render().

Mugen87 (Collaborator) commented Aug 8, 2024

Also notice that even your original codePen works if you change the output color space from SRGBColorSpace to LinearSRGBColorSpace. Of course the colors are not right, but it demonstrates that the tone mapping and output color space step is the problematic bit.

Spiri0 (Contributor, Author) commented Aug 8, 2024

It feels like I'm still pretty much alone in the community with my heavy use of WebGPU and the node system. Thinking through all the options in advance is not possible, or would take an unnecessary amount of time and energy. It works very well the way it is handled 👍

Spiri0 (Contributor, Author) commented Aug 9, 2024

@Mugen87 I'll test it with mix this evening as you recommend, because I'm now curious about this alternative. If you and @sunag consider this to be the more efficient way, then it's the new standard for me.

sunag (Collaborator) commented Aug 9, 2024

#29087 (comment) This approach is more powerful: you can mix your GUI with different blending modes and create different effects.
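For example, instead of the plain alpha mix, the UI pass could be composited with basic node math (a sketch, not taken from an existing example):

// alpha blend, as suggested above
postProcessing.outputNode = mix( scenePass, scenePassUI, scenePassUI.a );

// additive-style blend: add the UI on top, weighted by its alpha
postProcessing.outputNode = scenePass.add( scenePassUI.mul( scenePassUI.a ) );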

Spiri0 (Contributor, Author) commented Aug 10, 2024

I was curious, because instead of the scenePass I use a custom shader:

// illustration
const shaderParams = {
  diffuse: myScenePassTextureNode,
  depth: myScenePassDepthTextureNode
};

mix( myShader( shaderParams ), scenePassUI, scenePassUI.a );

With mix it looks right. For overlay I still have to look into the node system to see how to use it, because with overlay I get blended scenes instead of an overlay like with mix.
The bottom line is that it works well with a custom shader that can read multiple pass textures instead of a single scenePass.
@Mugen87 what should happen with the issue? The new way does not require manual render clears, and I get along very well with that.

Mugen87 (Collaborator) commented Aug 10, 2024

I would suggest we close the issue for now and see how other developers approach the API. We can reopen the issue if the solution outlined in #29087 (comment) is not acceptable for certain use cases.

Mugen87 closed this as not planned Aug 10, 2024
Mugen87 added this to the r168 milestone Aug 11, 2024