Let's consider the customRenderTree example provided on the wiki.
I wanted to modify it to allow rendering the depth buffer into a texture as well.
When I set depthAsRenderbuffer to false, backBuffer.depthTarget.isTexture is false, and XML3D throws an error when trying to use this buffer as a texture. (I've checked in the XML3D code that with this option set to false such a texture is supposed to be created.)
The code is as follows:
var backBuffer = this.renderInterface.createRenderTarget({
    width: context.canvasTarget.width,
    height: context.canvasTarget.height,
    colorFormat: context.gl.RGBA,
    depthFormat: context.gl.DEPTH_COMPONENT16, // note: no underscore before "16" in the WebGL constant
    depthAsRenderbuffer: false,
    stencilFormat: null
});
console.log(backBuffer.depthTarget.isTexture); // false, but should be true according to the XML3D code
Part of the problem is that WebGL doesn't accept depth textures unless the WEBGL_depth_texture extension is activated. You need to request this extension during the initialization of your RenderTree.
You should also check that depthTextureExt is not null; if it is, the device you're on doesn't support this extension.
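A minimal sketch of that initialization step (the helper name `enableDepthTextures` is mine; `getExtension` on the WebGL context is standard API, and `context.gl` is assumed to be the same context object used in the createRenderTarget call above):

```javascript
// Request the depth texture extension; without it, WebGL rejects
// textures created with a DEPTH_COMPONENT internal format.
// The official extension name is "WEBGL_depth_texture".
function enableDepthTextures(gl) {
    var depthTextureExt = gl.getExtension("WEBGL_depth_texture");
    if (!depthTextureExt) {
        // null means the device/driver doesn't support depth textures at all
        throw new Error("WEBGL_depth_texture is not supported on this device");
    }
    return depthTextureExt;
}
```

Call it once when your RenderTree is initialized, e.g. `enableDepthTextures(context.gl);`, before creating any render targets with `depthAsRenderbuffer: false`.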
The other problem is that we internally use gl.FLOAT as the type when creating depth textures. This is left over from the days when the floating point texture extensions in WebGL worked fairly well, which is no longer the case. Right now this isn't configurable from outside, so you'll have to change the affected lines in src/renderer/webgl/base/rendertarget.js: search for gl.FLOAT and replace it with gl.UNSIGNED_SHORT, then create your depth texture with depthAsRenderbuffer set to false, as in the createRenderTarget call above.
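For completeness, here is a sketch of the creation call, mirroring the issue's own example (wrapped in a hypothetical helper for clarity; `renderInterface` and `context` are assumed to be the objects from the customRenderTree example, and DEPTH_COMPONENT16 is the standard WebGL enum, spelled without an underscore before the 16):

```javascript
// Sketch: create a render target whose depth attachment is a texture
// rather than a renderbuffer, so it can later be sampled in a shader.
function createDepthTextureTarget(renderInterface, context) {
    return renderInterface.createRenderTarget({
        width: context.canvasTarget.width,
        height: context.canvasTarget.height,
        colorFormat: context.gl.RGBA,
        depthFormat: context.gl.DEPTH_COMPONENT16, // standard WebGL constant, no underscore
        depthAsRenderbuffer: false,                // request a depth texture, not a renderbuffer
        stencilFormat: null
    });
}
```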
backBuffer.depthTarget.isTexture will still show false until the target is actually created during the next frame render; it should show true after that.
Note that with a 16-bit unsigned depth texture you'll be limited to 65,536 discrete depth values (since there are no floating point textures), so accuracy could be a problem. You could try to get floating point textures working again, but that depends more on the current status of the relevant WebGL extensions than on XML3D; I haven't looked into it in a while.