
Add OIDN Denoiser. #663

Open · wants to merge 44 commits into base: main
Conversation

@DennisSmolek (Author) commented Jul 24, 2024

This PR adds the OIDN Denoiser (the Denoiser class) into the core three-gpu-pathtracer.

It works and looks decent, but there are clear issues to resolve.

API Added

  • maxSamples The pathtracer now stops when it reaches this number. It is also the sample count at which the system denoises.
  • enableDenoiser Whether the denoiser is used (NOTE: the denoiser actually initializes regardless of this, for WebGL compat).
  • weightsUrl Lets you bypass the jsDelivr URL with your own location for the TZA weight files. Also how you could pass in your own weights in general.
  • denoiserQuality Fast by default; balanced is better. "high" is really heavy and not supported in the Denoiser yet.
  • useAux Whether to denoise just the pathtraced input (fast, but bad) or to include the aux textures.
  • renderAux To view the aux inputs, set this to albedo or normal; null gets you the denoised output.
  • cleanAux Only set this if you are 100% sure the aux inputs are clean; ANY noise in them will be projected into the output (NOTE: an upstream fix may be needed).
  • externalAux Whether the user is supplying their own aux textures. Disables automatic generation.
  • denoiserDebugging Sets the denoiser to log more things, including an output report.
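A minimal usage sketch of these options (assuming the props are exposed on the pathtracer instance as listed above and that a WebGLRenderer named renderer already exists; some of these later move onto the denoiser object directly):

import { WebGLPathTracer } from 'three-gpu-pathtracer';

const pathTracer = new WebGLPathTracer( renderer );

// stop accumulating (and denoise) once this sample count is reached
pathTracer.maxSamples = 80;
pathTracer.enableDenoiser = true;

// self-host the .tza weight files instead of using jsDelivr
pathTracer.weightsUrl = '/assets/tzas';

// 'fast' by default; 'balanced' is better, 'high' is unsupported for now
pathTracer.denoiserQuality = 'balanced';

// denoise with the albedo/normal aux textures for better quality
pathTracer.useAux = true;

// set to 'albedo' or 'normal' to inspect the aux inputs; null shows the denoised output
pathTracer.renderAux = null;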

State API Properties (read-only)

  • isDenoising Whether the denoiser is currently running
  • isDenoised Whether the current output is the denoised output

Methods

  • generateAux() Runs the albedo/normals pass on the current scene. Returns { albedo: THREE.Texture, normal: THREE.Texture }
  • setAuxTextures(albedoTexture, normalTexture) For when the user wants to set the aux textures themselves, e.g. in a pipeline or a deferred setup. Deactivates internal automatic creation (this.externalAux = true)
  • denoiseSample(inputTexture, albedoTexture, normalTexture) Loads automatically, or can be overridden to let you send textures directly to the DenoiserManager. Not sure about exposing this; it's called internally by renderSample
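For example, a deferred pipeline could wire in its own aux textures like this (a sketch; myAlbedoTexture and myNormalTexture are placeholders for the user's own render target textures):

// let the internal pass generate the aux textures on demand
const { albedo, normal } = pathTracer.generateAux();

// or supply your own, e.g. from an existing G-buffer; this sets
// externalAux = true and disables the automatic generation
pathTracer.setAuxTextures( myAlbedoTexture, myNormalTexture );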

Changes to Core

  • Not much was done to the core itself. The main change is the addition of another fullScreenQuad pass that blends the pathtracer output with the denoised (or aux, if you're debugging) output.
  • The second significant change is the addition of maxSamples, which stops the pathtracer when it is reached.
  • There were other changes to better support the denoiser, but nothing breaking.

One change was added to the BlendingMaterial to allow a conversion to happen to texture2. It may be useful while we resolve the colorspace issues.

Additions

  • AlbedoNormalPass: Generates Albedo & Normal textures based on the current scene
  • DenoiserManager: Holds all things related to the denoiser and interfaces with the Denoiser class directly

Flow With Denoising

Assuming the denoiser is enabled and all options are default/set:

  1. Pathtracer runs until maxSamples is hit
  2. Denoiser process starts; future denoiseSample calls are blocked until it finishes
  3. Albedo and normal textures are generated (only when needed by the denoiser, not every frame)
  4. RawPathtracerTexture (pathtracer.target.texture), AlbedoTexture, and NormalTexture are sent to the DenoiserManager
  5. If the renderer size has changed, regenerate renderTargets and set Denoiser sizes
  6. Render the RawPathtracer texture to a quad with the same tonemapping as the default PT, so the resulting texture matches and is within [0, 1]
  7. Extract raw WebGLTextures from the internals of the input textures (THREE.Textures)
  8. Set those inputs on the denoiser
  9. Execute the denoiser, which returns denoisedWebGLTexture
  10. If not already created, create and load a three.js outputTexture that is properly initialized and set up within three.js internals (one of the many steps to get WebGL/three.js to play nice)
  11. Merge the denoisedWebGLTexture with the outputTexture
  12. denoisedTexture is held in the class, ready to be rendered
  13. Denoiser is marked finished; the results report that it is now denoised
  14. On the next renderSample call, now that isDenoised is true, render using DenoiserManager.renderOutput()
  15. The quad material is now set to BlendingMaterial
  16. Blend between the previous PT output (result of step 6) and the denoisedTexture. (Note: if renderAux is set, the provided aux texture is rendered instead so you can inspect either the albedo or normal textures visually. Not sure the normals render how you'd expect, as they should be output in [-1, 1])

If reset is called, it all drops back to normal: the outputs are hidden, etc., and the process starts over.
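Condensed into pseudocode, the per-sample logic looks roughly like this (a sketch of the flow above, not the literal implementation):

function renderSample() {

	// steps 14-16: once denoised, keep presenting the blended result
	if ( this.isDenoised ) {

		this.denoiserManager.renderOutput();
		return;

	}

	this.pathTracer.renderSample();

	// steps 1-2: stop at maxSamples and kick off the denoise once
	if ( this.samples >= this.maxSamples && ! this.isDenoising ) {

		const { albedo, normal } = this.generateAux(); // step 3
		this.denoiseSample( this.pathTracer.target.texture, albedo, normal ); // steps 4-13

	}

}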

Known Issues

1. ColorSpaces/tonemapping/conversion: Something is clearly not set up right through the pipeline regarding colors. The denoiser DOES NOT CARE what colorspace you input; whatever you put in, it puts out. The color and albedo inputs should be in the same colorspace, and normals are expected to be linear. The output looks correct but dark; converting makes it look flat.
This will take tweaking, and someone smarter than me with regard to color, to follow the renderTargets (all set up with THREE.SRGBColorSpace) and decide what we should convert/adjust.

2. Normal Generation: I have a script set up to use worldspace or viewspace normals and output to [-1, 1]. I don't know if colorSpace/RTs might be affecting this (I don't think it matters). Also, I have added code to accept normalMaps on meshes so they can be used instead of the raw surface. I tried converting the normal maps to worldspace with tragic results. For the moment, if an object has a map, I use it.

Something very important to point out about normals and the denoiser: the denoiser does not actually use them for normal values or any kind of light calculations whatsoever. They can be in world or local space. The normal maps just help define edges and breaks in materials.
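For reference, a minimal world-space normal override material could look like this (a hypothetical sketch, not the pass in this PR; note mat3( modelMatrix ) ignores non-uniform scale):

import * as THREE from 'three';

const worldNormalMaterial = new THREE.ShaderMaterial( {

	vertexShader: /* glsl */`
		varying vec3 vWorldNormal;
		void main() {
			// rotate the surface normal into world space
			vWorldNormal = normalize( mat3( modelMatrix ) * normal );
			gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
		}
	`,

	fragmentShader: /* glsl */`
		varying vec3 vWorldNormal;
		void main() {
			// raw [-1, 1] output; needs a float render target to avoid clamping
			gl_FragColor = vec4( normalize( vWorldNormal ), 1.0 );
		}
	`

} );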

3. Albedo Generation: The OIDN docs go into detail about albedos. In general, most matte surfaces can use just the basic color output or textures. This isn't true for highly reflective surfaces (like the floor). Here the albedo wants something different; the docs say something to the effect that the reflective surface should have the albedo of whatever it is reflecting, or a full 1 value. It gets even weirder for reflections.

4. Edges with the floor and background: Looking at the original OIDN examples, they do not expect black backgrounds, or hard edges where soft edges blend into the original background. The normals/albedo passes read those as hard, crisp edges or flat planes. So the gradient background gets flattened into a weird flat gradient, and the floor has sharp edges. With three.js backgrounds or envMaps we should generate something for both of these passes and include any floor/horizons.

The only one of these issues I see as a blocker is the colorspaces. Generating albedos and normals is something we can keep working on, and it would be fine to release improvements as updates. But we've got to get closer on the color outputs, IMO.

Other Changes

I updated the plain example (index) to include denoiser props and made some other slight adjustments. I added a rawOutput canvas to the HTML, which is very easy to set up when debugging to see exactly what the denoiser is outputting and to confirm your inputs/outputs.

Notes

I tried commenting a lot to explain what is happening, and I realize my commits are terribly labeled. I was focused on getting this added as the main thing rather than making a lot of gradual changes.

@gkjohnson (Owner) left a comment

I left a smattering of comments for some of the things that stuck out to me. Once some of the architecture is nailed down and it's more clear to me exactly what's happening I can help address some of the color space and tone mapping issues.

Comment on lines 90 to 91
// Run the denoiser
const denoisedWebGLTexture = await this.denoiser.execute();
@gkjohnson (Owner):

I see some logs about tiling happening - are the tiles being rendered over multiple frames? It would be nice if the page didn't lock up quite so much when denoising is happening. The tiles could also be used to display a completion percentage.

@DennisSmolek (Author):

The tile generation we could probably wire up to wait for a frame before going to the next tile (allowing less lockup), but I wonder if this would slow things down on higher-speed machines. TensorFlow has a method for this.

Tile reconstruction (which is the second half) should happen as fast as possible; I'm not sure we want to divide that up.

I'll log down both strategies to try.
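The first strategy could be as simple as this (a generic sketch; runTileInference is a hypothetical per-tile step, and tf.nextFrame() in TensorFlow.js is the equivalent of the rAF wait):

async function executeTiles( tiles ) {

	for ( const tile of tiles ) {

		runTileInference( tile );

		// yield a frame so the page can repaint between tiles
		await new Promise( resolve => requestAnimationFrame( resolve ) );

	}

}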

One thing to remember: with the WebGL backend the first execution is SIGNIFICANTLY longer than the others. TensorFlow waits to compile shaders until you actually execute. It's suggested we run a few images through to "prime" the model, but this would cause it to be slow at load time, which I was avoiding.

I could see a "ready()" callback that fires after a warm up image is passed.

@gkjohnson (Owner):

With the WebGL backend the first execution is SIGNIFICANTLY longer than the others. Tensorflow waits to compile shaders until you actually execute

Is there any way to use the KHR_parallel_shader_compile extension so the shaders will compile asynchronously? Three.js implements this with WebGLRenderer.compileAsync, which the PathTracer uses here so the page does not block while the path tracing shader compiles.
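For reference, the three.js side of this looks like the following (inside an async setup function); whether the TF.js shaders can be compiled the same way is the open question here:

// compile the scene's materials without blocking the main thread
// (uses KHR_parallel_shader_compile when the extension is available)
await renderer.compileAsync( scene, camera );

// by the time this resolves, rendering won't trigger a synchronous compile
renderer.render( scene, camera );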

- Moved the denoiser class external
- Renamed class OIDNDenoiser
- Moved many props to the denoiser directly
- Updated example with the moved location
- Removed weird imports
- Removed yarn lockfile
- Minor cleanups

Simplified example and removed blocks for renderScale.
Now works with different render sizes.
Oddly, normals/albedos also work.
Albedos still seem off. Can't for the life of me get the background to go into sRGB.
@DennisSmolek (Author):

[Screenshot 2024-07-25 022324]
I've done most of the things on your list.
Most importantly, I made the denoiser external (OIDNDenoiser), and I seem to have resolved the coloring issue, for the denoised output at least.

It turns out we were passing linear to the denoiser. Even though the render target and texture were set to sRGB, it still output linear. Adjusting the output manually in the shader corrected this.

Albedo & normals were rewritten; they still need some work, and the background always renders linear (weird).

I modified the BlendMaterial again as it's the best place to debug outputs/colors. I also used a more correct LinearToSRGB function.

Something helpful for checking colors, and interesting to compare, is the "Split" property in the denoiser panel. This will divide the page between the pathtracer output and either the denoised result or the AuxTexture for comparison purposes.

Here you can see the sky is still wrong
[Screenshot 2024-07-25 022450]

A little messy still. I need a cleanup pass tomorrow.

@gkjohnson (Owner) commented Jul 25, 2024

Here you can see the sky is still wrong

This looks like a tone mapping issue - as in tone mapping is not applied to the final denoised image. If you disable "tone mapping" in the example UI the colors look the same:

[image]

It looks like you're using "BlendMaterial", which doesn't support tone mapping since it's just intended to be an internal utility shader for the PathTracer. Using something like ClampedInterpolationMaterial should implicitly perform correct color conversion and tone mapping based on the renderer settings (assuming the source texture has the color space set correctly). ClampedInterpolationMaterial was originally designed for this final write to the canvas.

@DennisSmolek (Author):

This looks like a tone mapping issue - as in tone mapping is not applied to the final denoised image. If you disable "tone mapping" in the example UI the colors look the same:

I actually already tonemap the texture prior to it going to the denoiser. The reason I use the BlendMaterial (besides it being so useful) is that it doesn't apply anything else. At the point of the blend it should have the original, perfect output (I need to add the ability to remove tonemapping from mine) and the correct denoiser output.

At the moment I'm super happy with the color result. The pathtraced and denoised look great.

The albedo I don't know about; I'll try to find time to mess with it.

@gkjohnson (Owner):

I actually already tonemap the texture prior to it going to the denoiser.
...
At the moment I'm super happy with the color result. The pathtraced and denoised look great.

The quality does look really nice - but there is some color shift once it goes through the denoiser. It's just not clear to me why, yet. I'll have to take a look this weekend.

Removed complex albedo gen and added material pool.
Added PP step to albedo gen so colors are correct.
@DennisSmolek (Author):

Material pool system added.

We now create materials as needed and set their values based on the object's original material.
When the process is done we remove all references to the objects and their materials, meaning they can be freed. We hold the materials in a pool for their next use, and I created a destroy method for both the MaterialPool and the albedo/normals pass.

Removed the complex albedo generator in favor of just a MeshBasicMaterial. It works for now but is likely going to have transparency and reflectance issues.

Added a PP pass that correctly tonemaps and converts the albedo, so it now matches the original pathtracer beauty output when we send it to the denoiser. I think this will fix any oddities you see in the colors.
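An illustrative shape for such a pool (a hypothetical condensed sketch based on the description above, not the PR's exact implementation):

import * as THREE from 'three';

class MaterialPool {

	constructor() {

		this.free = [];
		this.inUse = [];

	}

	getMaterial( sourceMaterial ) {

		// reuse a pooled material or create a fresh MeshBasicMaterial
		const material = this.free.pop() || new THREE.MeshBasicMaterial();
		material.color.copy( sourceMaterial.color );
		material.map = sourceMaterial.map || null;
		this.inUse.push( material );
		return material;

	}

	release() {

		// drop texture references so scene resources can be freed
		this.inUse.forEach( m => { m.map = null; } );
		this.free.push( ...this.inUse );
		this.inUse.length = 0;

	}

	destroy() {

		this.free.forEach( m => m.dispose() );
		this.free.length = 0;

	}

}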

@DennisSmolek (Author):

After battling the render flow and making a bunch of charts, I think I have things in a stable setup. There are still a few problems to resolve.

Notable changes:

  1. Simplified the index example; added a Denoiser example with simpler general controls but more denoiser controls
  2. Got the color pipeline to match.

What the problem was: three.js does not apply toneMapping to renderTargets. So even though I was copying the exact same textures/materials, I was getting different outputs, with not only different tonemapping but different colorSpaces.
I have added (and removed) a few key conversions, so we are now sending an sRGB non-tonemapped pathtraced image, an sRGB-converted albedo, and a standard normal to the denoiser, then doing the tonemapping in the final step.

If I recall correctly, though, this is the wrong order, and the tonemapping should happen BEFORE we send to the denoiser for the best outcome (standard for every pipeline with ANY denoiser, not just OIDN), but I could be wrong.
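The root cause in code form (a sketch; target, fsQuadScene, and fsQuadCamera are placeholder names):

// three.js applies renderer.toneMapping and the output color space
// conversion only when writing to the canvas; rendering into a render
// target stores raw linear values instead
renderer.setRenderTarget( target );
renderer.render( fsQuadScene, fsQuadCamera ); // no tone mapping applied here
renderer.setRenderTarget( null );

// so any LinearToSRGB/tonemapping step for the denoiser inputs has to be
// written explicitly into the fullscreen quad's fragment shader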

Problems:

  1. Albedos just use MeshBasicMaterial. I think this is the cause of some issues, most notably edges around the floor. If you set the floor to 0 opacity and it has to render the background, there is very little noise around the edges. But with the reflective floor (and a black albedo) it has issues. Transparency we'll have to test too. (This is a longer-term issue to resolve.)
  2. Tiling seams. All the color conversions/pipelines/flow before & after denoising, and not being tonemapped prior to the denoiser running, are causing seams in the tiling to become slightly visible. I don't know if this is the tiling from the pathtracer, but I'm treating it as an issue with the denoiser.
    I am working on better blending upstream in the main denoiser class, but it's a little difficult to test. I'll set up something with this PR and a special version of the denoiser with the new tiling to test both systems out.
  3. Speed. In general the speed is not a problem (especially on fast machines), but it can always be better. Initializing and tiling both lock up the thread. Upstream I am working on this so the denoiser will report the progress of its batches. This is a non-blocking problem and one I'll continue to address.

Please take a look and let me know what you think.

[PathtracerDenoiserRenderFlow diagram]

@gkjohnson (Owner):

Got the color pipeline to match.

I can take a look at the colors.

If I recall correctly though this is the wrong order and the tonemapping should happen BEFORE we send to the denoiser for the best outcome

Do you have a reference for this? I would think ideally tone mapping is always applied before colors have been quantized into 8 bits to avoid banding, but it shouldn't be a huge deal. We can take a look at this later, I think.

  1. ... Transparency we'll have to test too.

We should be able to just set the mesh opacity and alpha maps to help with this. Transmission is more complicated because we can't correctly model the refraction that happens in that case.

  2. Tiling seams. ...

Interesting - I haven't seen any tiling seams before, so my assumption is that this is related to the denoiser. I'll try to keep an eye on this while looking at the color processing.

  3. Speed. ... Initializing and tiling both lock up the thread

Avoiding the lock up I think would be great. The two things that come to mind are performing tiles over multiple frames and performing async material compile so other tasks can run while compiling is happening.

An event or flag that specifies how complete the denoising process is (just 0 to 1) would let us display it. And likewise a way to "cancel" the denoiser for cases where the camera moves, scene changes, etc would be nice.

@gkjohnson (Owner):

I think ideally this example would just include a single model to keep it focused and simple. An ideal model would include something with opacity and transmission so we can evaluate these kinds of problem cases. See the HDR example or DoF example. Though this is something that can be adjusted later if needed.

@DennisSmolek (Author):

I get that, and we can certainly limit the models. But I think it's easier to switch to models that highlight the edge cases specifically vs trying to find a single model that fits every one.
LMK what you think and I'll adjust it.

@gkjohnson (Owner):

I think just one model is good for now - for debugging we can swap between models to check issues. I think it's best to have a simpler example for users to reference.

Comment on lines +63 to +67

get hdr() {

	return this.denoiser.hdr;

}
@gkjohnson (Owner):

Is hdr functional right now? It looks like we always pass sRGB into the denoiser.

@DennisSmolek (Author):

It was.
Technically the denoiser can already support it.
While working out the color issues I matched the original system as much as possible; this included passing the HDR texture.

When passing HDR and using "fast" quality I got HORRIBLE results, but balanced was excellent. That's why I added the auto change to those props.

When trying to address the color issues in LDR, I resolved it and found the system to be more stable with fast/balanced quality while using LDR.

If you don't ever plan to pass HDR to the denoiser, or don't like leaving future-handling code in, I can remove it.

@gkjohnson (Owner):

Gotcha - is it the case that we just have to pass the original LinearSRGB Path Traced texture when hdr is true and we'll get a LinearSRGB texture out? I can try it a bit to see how it looks.

@DennisSmolek (Author):

Yes, if we pass that texture as the input, the denoiser will return it in the same range (or at least it should; I've only tested a few times).
OIDN says that in HDR a value of 1 should correspond to a luminance level of 100 cd/m^2, so for best results the input should be similar.

If you do that, you may want to disable the SRGBToLinear conversion that takes place on the denoiser result. I do this so we can correctly apply the ToneMapper.

One thing to consider would be dropping the SRGB > denoiser requirement. This would mean textures don't need to be converted when returned and can be tonemapped directly.
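Put together, the HDR path would look roughly like this (a sketch based on this thread, inside an async function; the input texture wiring is elided since it's handled by the DenoiserManager):

// HDR mode: feed the LinearSRGB float path-traced texture in unchanged
denoiser.hdr = true;
const denoisedWebGLTexture = await denoiser.execute();

// the result stays in the same linear range, so skip the SRGBToLinear
// conversion on the output and apply the ToneMapper to it directly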

@DennisSmolek (Author):

I suppose we can leave it for now but all of these models can be checked in the main example with denoising and urls copied into the denoiser example with more options for dev. These example files have a significant maintenance burden whenever APIs change or need to be updated so keeping them simple is a priority for me.

For sure not trying to add burden; I'm just copying the examples that are already in use in the PBR example, so if they change or get removed it's just as easy to remove them here. Yes, you can run the denoiser in the PBR example, but you can't run the split comparison or look at how the transparent, reflective, metallic, etc. objects generate albedos/normals. Instead of finding a super-model that tests everything in one go, it might be easier to use the same models we already load and see how they compare to each other.

Whatever you want to do, though, is fine with me. I can always make my own example on the main denoiser site with more models.

  • Normals are currently stored as sRGB which is atypical. Are we sure the normal image is supposed to be sRGB? The data stored in a normals texture is not considered color data and is usually loaded and read without color space transformations, even when reading from image files.

This is a mistake. It comes from battling color conversions. Regardless of the colorspace of the inputs, normals should be linear. I will adjust this. The denoiser doesn't actually use normals in the common sense, but this will better match the training data.

It's actually implemented from what I can tell, but it's buried in GitHub discussions and not documented. I will experiment with this upstream.

  • I've noticed the denoiser can stall for quite a while at 99%, but it's not clear to me from profiling what's causing this. Are you seeing it as well?

I am going to work on a flowchart of the denoiser so it's easier to explain, but right now the progress only measures the inference/tiling stage, which executes with frame blocking to allow the callbacks. The next steps execute in a single pass, and there's no way to report progress until they're totally done. The biggest delay is probably syncing from GPU to CPU.
Over time the upstream denoiser will get better at "progress", but for now I block at 99% waiting for the renderable WebGLTexture to return before finalizing with 100%. 100% is also a "done" marker at this stage.
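For display purposes the hook is straightforward (a sketch using the onProgress callback shown later in this thread; progressBar is a placeholder element):

denoiser.onProgress( progress => {

	// progress only tracks the inference/tiling stage; it holds at 0.99
	// until the renderable WebGLTexture is back, then 1 doubles as "done"
	progressBar.style.width = `${ Math.round( progress * 100 ) }%`;

} );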

  • It looks like instantiating the Denoiser has a side effect of enabling the scissor test for some reason:
const gl = renderer.getContext();
console.log( gl.getParameter( gl.SCISSOR_TEST ) ); // false
this.denoiser = new Denoiser( 'webgl', renderer.domElement );
console.log( gl.getParameter( gl.SCISSOR_TEST ) ); // true

Weird, I've added it to my list to put in the WebGLStateManager class (although I thought it was there)

  • I'm not sure if you saw this issue here - but it looks like once the denoiser has started you can't perform any other draw calls without corrupting the denoised image.

Calling a draw call means three.js takes over the state; I'm sure this messes things up. With the new abort flag, the denoiser shouldn't output anything after abort is called, so if you need to draw, abort the denoiser and it should stop anything from returning. It's a new feature, so if it doesn't work right, LMK and I'll try to test for it.

  • White colors are still getting significantly blown out by the denoise process and it doesn't seem right? We wind up losing a lot of detail in brighter parts of the image for some reason. I'm not sure how difficult it is to test just passing the canvas directly into the denoiser again to see if it's causing the same problem? It looks like a color conversion may be happening somewhere causing us to clamp or lose precision or something.

Weird, I thought this got resolved with the correct tonemapping etc.

We can actually test pretty easily using usePassThrough. Honestly, I think this is probably an issue with all the color conversions and working in sRGB, not linear.
I've read a lot of people saying to send the input in as sRGB, but I think that's not actually the right way. OIDN itself defaults to linear input, and I've actually dug in and found that when you use sRGB it converts the inputs to linear before processing and then reapplies sRGB to the output.
Along with the normals, I'll do some tests with a linear version, and I'll add a trigger for passthrough in the denoiser example (we can remove it later).

Hopefully it's just a colorspace thing. The next thing I'd suspect would be an albedo/normals issue? One thing I did also try is using more samples. More samples seemed to stop the denoiser guessing so much.

@gkjohnson (Owner):

Yes, you can run the denoiser in the PBR example, but you can't run the split comparison or look at how the transparent, reflective, metallic, etc. objects generate albedos/normals.

For users trying to understand the available denoiser settings, I don't think it's necessary to demonstrate with multiple models. And for dev we can change the models for testing. I'll update this after the PR is merged, though.

The denoiser doesn't actually use normals in the common sense, but this will better match the training data.

The relative scale of the values being correct should be an improvement

Calling a draw call means three.js takes over the state; I'm sure this messes things up. With the new abort flag, the denoiser shouldn't output anything after abort is called, so if you need to draw, abort the denoiser and it should stop anything from returning. It's a new feature, so if it doesn't work right, LMK and I'll try to test for it.

I figured the WebGL state for TF was being saved after the TF function exited and would be restored to what TF needs when restarting the TF work. Or are these "starts" and "ends" something not in our control?

Hopefully it's just a colorspace thing. The next thing I'd suspect would be an albedo/normals issue? One thing I did also try is using more samples. More samples seemed to stop the denoiser guessing so much.

Using linear data would simplify things. I'd be surprised if the reason is just the number of samples, though. The tops of those lego studs are almost all gray converted to white, and some of the studs in the bottom right are erased completely. It seems to primarily be a problem with white?

@DennisSmolek (Author):

Something in the recent commits has broken things...

When you rotate after denoising, the denoised image turns into a blank canvas.

Also, the denoiser is being marked dirty on every run, so some property is being set over and over.

I'll try to go through the commits to find it.

@DennisSmolek (Author):

Ok, the marking dirty is from removing the check for whether the size has changed. I will work on this upstream to protect against it.

However, the blend not blending is still a problem. Both issues were introduced in "some cleanup".

@gkjohnson (Owner) commented Jul 31, 2024

Something in the recent commits has broken things...

When you rotate after denoising, the denoised image turns into a blank canvas.

Just pushed a line I forgot to commit - sorry about that. You're talking specifically about the denoiser.html example, right? I'm not seeing issues on the index.html page.

However the blend not blending is still a problem.

What does "blend not blending" mean? I see, now. I'm less worried about this since it's caused by bypassing the WebGLPathTracer render in favor of a custom overlay for "split", but I may be able to address it easily.

@DennisSmolek (Author):

Something in the recent commits has broken things...
When you rotate after denoising, the denoised image turns into a blank canvas.

Just pushed a line I forgot to commit - sorry about that. You're talking specifically about the denoiser.html example, right? I'm not seeing issues on the index.html page.

However the blend not blending is still a problem.

What does "blend not blending" mean? I see, now. I'm less worried about this since it's caused by bypassing the WebGLPathTracer render in favor of a custom overlay for "split", but I may be able to address it easily.

Interesting, I fixed it a different way. You change the scissorTest value from false to true but never reset it; if you set it back to false after the render, that also fixes it.

I thought it was an issue with the blending of the denoised over the noisy image, not the scissor test. Once I realized it wasn't happening on the index test, I found the scissor issue.
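The fix in code, using the standard three.js scissor API (splitX and height are placeholder values for the split overlay):

// the split overlay enables the scissor test, so it must also restore it,
// otherwise every later full-frame render stays clipped
renderer.setScissorTest( true );
renderer.setScissor( 0, 0, splitX, height );
renderer.render( scene, camera );
renderer.setScissorTest( false ); // reset so the next render isn't clipped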

@DennisSmolek (Author):

Latest Version changes:

  • Upstream: Protect against same values marking dirty (stops the denoiser rebuilding every run, making it SIGNIFICANTLY faster)

  • Upstream: Warmstart using parallel shader compile

  • Upstream: WebGLStateManager improvements

  • Upstream: Cleanup

  • Upstream: Slight model change looking for speed improvement

  • Linear colorspace: All render targets (pathtracer, albedo, normals) now run in linear colorspace; no conversion to sRGB is used.

  • Names changed to reflect operation not colorspace

  • Minor adjustments and fixes

  • Debug controls added for now (probably removed later)

I spent time with the models to check color/brightness. In general, with the linear flow I like the output better, but I need to check the lego model and see if it's doing the same thing.

My main check was colors, the floor, the sky
[Screenshot 2024-07-31 231020]

Color areas usually checked out.
I did notice that if I zoomed out, the NASA label got slightly bright, and when I checked the albedos at this level, it matched the higher brightness. So this is still one thing to keep messing with.

Regarding the state: I added restoring external state to the state manager, so it not only restores its own state but the running state. This works fine with my original pathtracer setup, but in this version it has conflicts every other execution. Luckily I added an "ignoreRestore" flag to the state manager, and this seems to correct the every-other-run WebGL error (instanced draw without buffer). My guess is something in three.js changes, and when the state restore is run it's out of date. Mixing WebGL state is super fragile, so for now I think this is the best we can get.

Shader compiling/warm starting was added. There is a minor speed improvement, but for the most part it seems it still waits to send data to the GPU until execution. There might be some exploration here, but I don't suspect much gain.

I still get 20-30 second load times to compile the GPU Pathtracer shaders when first starting btw.

Minor model changes: my Mac version now executes in 500-1000ms, and on my PC I've dipped below 1000ms, but that was probably a smaller size. Most of the time my PC (3060) gets ~3500ms.

I'm curious to see how WebGPU compares but I'll save that for another day.

@DennisSmolek (Author) commented Jul 31, 2024

Ok, spent some time on the X-Wing...
This has a split slightly left of center. You can see the colors themselves are all matching correctly, so I doubt it's a color space issue.
[Frame 2]

Tested the normals and albedo and got this result:
[compared]

My theory: the 100% white albedo is being treated as a transparent surface. Combined with the fairly uniform normals, it thinks it's a super reflective/transparent surface and is taking the albedo map as the color source.

I'm not sure what else it could be, and it's something we can mess with. The colors in the scene all look correct, including the backgrounds.

One thing to note: in my other renderer, the normals are significantly different than the ones in the new pass.
[Normals]

A key thing to remember for the denoiser: it's not so much that the normals are 100% light-correct, but that I think they act as a segmentation mask, helping find the edges of shapes.

This is all just a theory, but I tested the regular X-Wing, which uses grey not white, and it renders fine.

this.denoiser.weightsUrl = 'https://cdn.jsdelivr.net/npm/denoiser/tzas';
this.denoiser.onProgress( progress => this.handleProgress( progress ) );
//this.denoiser.hdr = true;
this.denoiser.webglStateManager.ingoreRestore = true;
@gkjohnson (Owner):

Spelling issue - ingoreRestore -> ignoreRestore. Can we add an inline comment describing why we need this?

@gkjohnson (Owner) commented Aug 3, 2024

Upstream: Warmstart using parallel shader compile
...
Shader compiling/warm starting was added. There is a minor speed improvement, but for the most part it seems it still waits to send data to the GPU until execution.

Nice! Can you point to where this parallel compile is being done? If you attempt to use the shader after starting the parallel compile before it's done then the driver will still stall the page. The goal here is less of a speed improvement and more preventing the page from stalling.

I still get 20-30 second load times to compile the GPU Pathtracer shaders when first starting btw.

This is an issue with DirectX that's unavoidable. For some reason it takes forever to compile shaders. The Chrome / ANGLE team have said it's a known issue. The use of async compile should mean the page isn't completely locked while the shader compiles, though. This isCompiling flag is used in renderSample to prevent the path tracing material from being used until it's ready.

Ok, spent some time on the X-Wing...

I took a look today, and it seems to be due to clamping colors when converting from the float texture to the Uint8 target. This clamps and loses data above 1.0 that is used in the final tone mapping step, which is why we're losing some of the bright highlights in the final image.

  1. The solution is either to denoise the final tone mapped image or to denoise the hdr path traced image and get an hdr result. Denoising the hdr image would be the ideal solution since it can be nice to adjust tone mapping, exposure, and post effects like bloom (which requires hdr anyway) without another denoising step. But I'm not sure how much work getting hdr working is.

One thing to note: in my other renderer, the normals are significantly different than the ones in the new pass.

The normals on the left are packed into the range [ 0, 1 ] - looking at the OIDN docs again it looks like they're supposed to be in the range [ -1, 1 ], which I must have misread previously:

Just like any other input image, the normal image should be anti-aliased (i.e. by accumulating the normalized normals per pixel). The final accumulated normals do not have to be normalized but must be in the [-1, 1] range (i.e. normals mapped to [0, 1] are not acceptable and must be remapped to [−1, 1]).

  2. What format are we supposed to feed into the denoiser, then? The current format used for the normals texture will clamp the values of the output normals, and negative values will just be black.

This is looking good, though. Looks like just a few final things.

@DennisSmolek (Author):

Finally have some time to circle back on this and wanted to see what remaining things are needed.

Regarding HDR: I've had hit-or-miss results with it and found the standard def had better results. We can start working in HDR, but wouldn't that make everything HDR, or would we clamp it down at the end?

Regarding normals: I'll have to double-check, but I'm pretty sure I'm already remapping to [-1, 1] during the texture-setting phase (I assume most users and shaders will use [0, 1]).
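The remap itself is trivial (a sketch of the texture-setting step; data is a placeholder array of [0, 1]-packed normal components):

// remap [0, 1]-packed normals into the [-1, 1] range OIDN expects
const remapped = new Float32Array( data.length );
for ( let i = 0; i < data.length; i ++ ) {

	remapped[ i ] = data[ i ] * 2 - 1;

}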

@DennisSmolek (Author):

  1. The solution is either to denoise the final tone mapped image or to denoise the hdr path traced image and get an hdr result. Denoising the hdr image would be the ideal solution since it can be nice to adjust tone mapping, exposure, and post effects like bloom (which requires hdr anyway) without another denoising step. But I'm not sure how much work getting hdr working is.

So, we can try working in HDR, but is that the output you want from the pathtracer? The denoiser will output HDR if we want it to, but I didn't know if that's what the pathtracer should send to the texture. We'll need to test a bit, as I recall it had some issues. Otherwise, I wonder if we can move the clamping to after the tonemap step.

  2. What format are we supposed to feed into the denoiser, then? The current format used for the normals texture will clamp the values of the output normals, and negative values will just be black.

I'm not sure what you mean exactly. Right now I take a standard normal texture as input and re-map it. In my base renderer this seems to work OK.

@gkjohnson (Owner):

Regarding HDR: I've had hit-or-miss results with it and found the standard def had better results. We can start working in HDR, but wouldn't that make everything HDR, or would we clamp it down at the end?

The goal would be to denoise in HDR, then tone map the image to SDR. We could tone map before denoising to fix the color clamping, but this way is more flexible: it lets the user change the tone mapping without re-denoising, and also lets you save a denoised HDR image. Right now the final result from the path tracer in a rendertarget is an HDR float buffer that is tone mapped when written to the canvas.

Regarding normals: I'll have to double-check, but I'm pretty sure I'm already remapping to [-1, 1] during the texture-setting phase (I assume most users and shaders will use [0, 1]).

What I'm wondering is what happens when a component is negative. If using an rgb8 texture (as the normal textures are here), any negative components will be clamped to 0 and there will be no distinction in the normals. What's the right thing to do here? Should we be using float textures? Does OIDN support that?
