More flexible renderloop/rendering
Our current renderloop essentially boils down to
```julia
# some initialization code
# rendering code for SSAO and no transparency
# rendering code for no SSAO, no transparency
# rendering code for transparency
# FXAA
```
which hard-codes how things are rendered and in which order rendering takes place. While it is possible to skip a step, this is still pretty inflexible. I think one of the main goals for GLMakie is to improve this: a more flexible renderloop would allow us to add more rendering options without having to decide for one thing over another, or worry about high-fidelity post-processors dragging down baseline performance.
Rendering order
The current implementation essentially takes the order
```julia
for step in render_steps
    for scene in scenes
        # do some rendering
    end
end
```
which is reasonable when every scene uses the same rendering steps. With a more flexible renderloop this doesn't make much sense, though. In that case it would be better if each Scene could define its own rendering steps, and the order would change to
```julia
for scene in scenes
    for step in scene.render_steps
        # do some rendering
    end
end
```
As an example we can look at a figure with a 3D plot, a 2D plot and some UI elements. For the 3D plot we likely want some 3D-specific rendering options like lighting, SSAO, shadows, order independent transparency, etc. For the 2D plot we probably just want transparency based on draw order (which could follow plot z order) and FXAA. For UI elements we're often fine without FXAA, since lines, linesegments, scatter and text are anti-aliased even without it, and pixel-aligned rectangles don't need AA either. So we would have 3+ different setups here to optimally draw the scenes.
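To make this concrete, here is a minimal sketch of how per-scene pipelines could be declared. The `render_steps` field and the step names are hypothetical; nothing like this exists in GLMakie today.
```julia
# Hypothetical sketch: each scene carries its own ordered list of render steps
# (the `render_steps` field does not exist today; step names are made up).
struct SceneSketch
    render_steps::Vector{Symbol}
end

scene3d  = SceneSketch([:shadows, :ssao, :oit, :fxaa])     # full 3D feature set
scene2d  = SceneSketch([:zsorted_transparency, :fxaa])     # 2D plot area
scene_ui = SceneSketch([:plain])                           # UI layer, no post-processing
```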
Scene order
As far as I can tell Makie doesn't have a well-defined rendering order for scenes. GLMakie currently follows insertion order, where a scene is inserted when the first plot is added to it. For example, a setup along the lines of the sketch below
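(a minimal, hypothetical illustration; the overlapping axes and the plotted data are arbitrary)
```julia
# Hypothetical illustration: two overlapping axes in the same figure.
using GLMakie

fig = Figure()
ax1 = Axis(fig[1, 1])
ax2 = Axis(fig[1, 1])  # deliberately placed on top of ax1 so the draw order is visible

scatter!(ax2, 1:10)    # ax2's scene receives its first plot here ...
scatter!(ax1, 1:10)    # ... so swapping these two calls swaps the scene order
```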
will change the order of the displayed scenes depending on which scatter! runs first. This may also differ between CairoMakie and WGLMakie. (See also #2650)
Before refactoring the rendering loop we should figure out how we want scenes to be ordered so we can stay consistent across backends. I think the right order here is probably depth-first, perhaps with the added restriction that a child scene does not draw outside its parent. (This would fit an interpretation of a Scene being like a window inside a parent window.)
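As a sketch, depth-first ordering would amount to a traversal like the following, assuming `scene.children` as Makie Scenes already have; the render call itself is a placeholder.
```julia
# Depth-first traversal sketch: render a scene, then recurse into its children.
function render_depth_first(render!, scene)
    render!(scene)                     # placeholder for the actual per-scene rendering
    for child in scene.children
        render_depth_first(render!, child)
    end
end
```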
Rendering steps
I think a good representation for render steps would probably be something similar to (material) nodes in Blender. It's not really clear to me how this would look in an implementation, but we can collect some requirements/ideas (a rough sketch follows the list):
- Different rendering steps should be able to use the same buffers.
- Rendering steps should be able to declare a lifetime for data in a buffer:
  - Temporary: FXAA uses a temporary colorbuffer which is freely usable at any point.
  - Conditionally reusable: SSAO includes geometry buffers, which accumulate data from multiple draw calls, but can be reused after SSAO runs.
  - Never reusable: Ray tracing needs an accumulation color buffer which it continues to draw to over multiple renderloop iterations.
- Buffer specifications should be somewhat loose (e.g. allow a step that asks for an 8 bit color buffer to use a 16 bit buffer if one is already available).
- Rendering steps need to affect draw calls globally (e.g. everything rendering with SSAO needs to write to geometry buffers).
- Plots/render objects need a more general way to specify which render step they belong to (i.e. generalize `ssao/transparency/fxaa = true/false` to something like `stage = :ssao`).
- A render step should be able to repeat draw calls (e.g. for depth peeling).
- A render step should be able to overwrite uniforms (e.g. camera matrices for shadows, 3D anaglyph).
- It should be possible for a render step to not include plot draw calls (e.g. a blur step).
- The connectivity between inputs and outputs of different render steps should be adjustable. (E.g. a blur step should be able to attach to pretty much any other step, regardless of whether it acts on a depth buffer, a normalized 8 bit color buffer or a 32 bit float buffer.)
- It should be possible for a render step to affect stencil buffers, i.e. perform clipping operations.
- The rendering order of plots/renderobjects should be adjustable through a rendering step (e.g. for z-sorting to deal with transparency in 2D).
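As a very rough sketch of how these requirements could map onto a node-like step description, something along these lines might work. All type names, fields and step names here are hypothetical; none of this exists in GLMakie.
```julia
# Hypothetical node-style description of a render step (nothing here exists in GLMakie).
struct BufferSpec
    format::Symbol     # requested format, matched loosely (e.g. :rgba8 may get an :rgba16f buffer)
    lifetime::Symbol   # :temporary, :reusable_after_step or :persistent
end

struct RenderStep
    name::Symbol                         # e.g. :ssao, :transparency, :fxaa, :blur
    inputs::Vector{Symbol}               # buffers consumed from earlier steps (adjustable connectivity)
    outputs::Dict{Symbol, BufferSpec}    # buffers this step writes to (can be shared between steps)
    selects::Function                    # picks plots for this step, e.g. plot -> plot.stage[] === :ssao
    uniform_overrides::Dict{Symbol, Any} # e.g. replacement camera matrices for a shadow pass
    repeats::Int                         # how often the draw calls run, e.g. for depth peeling
end

# A step without plot draw calls (like a blur) would simply select no plots:
blur = RenderStep(:blur, [:color], Dict(:color => BufferSpec(:rgba8, :temporary)),
                  plot -> false, Dict{Symbol, Any}(), 1)
```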
Other Notes
Currently there is one depth buffer which applies to every scene. I don't think this is all that useful, so I'm leaning towards removing it or making it not the default.
It might be good for performance to merge scenes with the same rendering steps.
Potential follow-up improvements/additions
I think reworking the rendering pipeline is necessary before we can add more rendering options. Some of the things we could look into after this are:
- transparent screenshots and subscene screenshots (e.g. for inserting a 3D GLMakie render into CairoMakie)
- other postprocessors: motion blur, depth of field, bloom, ...
Other Goals
Clipping
With clipping one can discard fragments drawn outside a certain region. This would be helpful for Axis3, for example, to discard anything outside its bounding box. With that we could add zooming to it. Another use case could be cutting through a plot to view, for example, the inside of a mesh or volume plot (e.g. for medical imaging). (See #2783)
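A hypothetical user-facing sketch of what this could look like; the `clip_region` attribute does not exist, and the actual discarding would happen in the fragment shader.
```julia
# Hypothetical API sketch: restrict a volume plot to a box so its inside becomes visible.
using GLMakie

fig = Figure()
ax = Axis3(fig[1, 1])
vol = volume!(ax, rand(Float32, 32, 32, 32))
# `clip_region` is made up; the idea is that fragments outside this box get discarded:
# vol.clip_region[] = Rect3f(Point3f(0), Vec3f(16, 32, 32))
```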
Sprite sheets
A sprite sheet is essentially a collection of images which can be passed to the GPU without the need for individual textures. We use one for text and scatter rendering, where the "images" are representations of different letters and/or markers. Some other uses include 2D animation (character animation, maybe gif and video playback) and drawing tiled maps (mostly 2D but also 3D like Minecraft). While this might be a bit more of a game engine feature, I think it would be nice to have.
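A minimal sketch of the idea, assuming a simple atlas structure; this is illustrative and not Makie's existing texture atlas API.
```julia
# Illustrative sprite sheet: one packed image plus per-sprite uv rectangles.
struct SpriteSheet
    atlas::Matrix{Float32}                      # the packed image, uploaded once as a texture
    uv_rects::Dict{Symbol, NTuple{4, Float32}}  # sprite id -> (u0, v0, u1, v1) within the atlas
end

# A draw call then only needs the atlas texture plus the uv rectangle per sprite:
uv_rect(sheet::SpriteSheet, id::Symbol) = sheet.uv_rects[id]
```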
Upload/Update on Demand
With the "push" structure of observables, any change to a plot argument or attribute trickles down to GLMakie, potentially triggering calculations necessary for rendering and eventually causing data to be uploaded to the GPU. This may happen more often than a frame is actually rendered (either due to multiple triggers of the same observable or repeated function calls caused by multiple observables), which is ultimately a waste of effort. To avoid this we could try to switch to a "pull" structure, where a render object pulls updates right before rendering a frame rather than reacting to every change immediately.
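A minimal sketch of such a pull scheme, assuming a per-renderobject update queue; all names here are hypothetical.
```julia
# Hypothetical pull-based update scheme: observables only record work, the renderloop runs it.
mutable struct PendingUpdates
    uploads::Dict{Symbol, Function}   # latest pending upload per buffer, deduplicated by key
end
PendingUpdates() = PendingUpdates(Dict{Symbol, Function}())

# Called from observable listeners: overwrite any older pending upload for the same buffer.
mark_dirty!(p::PendingUpdates, key::Symbol, upload::Function) = (p.uploads[key] = upload)

# Called once per rendered frame: perform each pending upload exactly once.
function pull_updates!(p::PendingUpdates)
    for upload in values(p.uploads)
        upload()
    end
    empty!(p.uploads)
end
```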
Mesh improvements
Materials
Currently Makie doesn't have a material system. You can set a color or texture and maybe do a bit more by messing with lighting. It would be nice to expand on this with a material system and add things like a roughness modifier, a metallic modifier and whatever else is out there.
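As a sketch of what such attributes could look like on the user side; neither `roughness` nor `metallic` exist as Makie attributes today.
```julia
# Hypothetical material attributes on a mesh plot; the commented keywords are made up.
using GLMakie

fig = Figure()
ax = LScene(fig[1, 1])
mesh!(ax, Sphere(Point3f(0), 1f0);
    color = :lightgray,
    # roughness = 0.4,   # hypothetical: how diffuse the specular response is
    # metallic = 0.1,    # hypothetical: blend towards a metal-like reflection model
)
```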
Skinning/Skeletons/Animation
I think this is more of a far-future thing, but it would be cool if Makie could handle meshes with movable joints and animations. The quick guide for glTF might be helpful for this: https://github.com/KhronosGroup/glTF#overview