Fix point cloud performance issue when points are filtered via "show" #12317
Discard hidden points in the fragment shader, rather than moving them to 0 in the vertex shader. Fixes #11270.
Description
The problem
I was running into an issue when points from a 3D Tiles point cloud dataset were being filtered out via a `show` condition, for example showing only Ground points or only Noise points. The more points that were filtered away, the worse the performance got. Some investigation showed that this was only an issue on Apple silicon, and it is most noticeable in Firefox. On my M2 MacBook with Firefox, the second link in #12140 slows my entire computer to a crawl until the tab is closed. I submitted that issue, which was found to be a duplicate of #11270; that issue has some investigation and suggestions, but unfortunately the one-line fix described here had no effect for me.

Investigation details
The offending line is here, which multiplies the point position by `show` in order to zero out the position when the point should be hidden due to `show == 0`. It seems that when the point position is set to `0`, something in the rendering pipeline degenerates significantly on Apple silicon.
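For reference, here is a minimal sketch of the pattern in question. This is not the actual CesiumJS shader source; only `show` and `positionClip` are taken from the line above, and everything else (the attribute name, the `getShowFromStyle()` helper, the use of `czm_modelViewProjection`) is placeholder scaffolding to make the sketch self-contained.

```glsl
// Vertex shader (sketch of the current behavior, not the real source).
in vec3 a_position;   // placeholder per-point position attribute

void main()
{
    // `show` comes from the style's show expression:
    // 1.0 for visible points, 0.0 for hidden points.
    float show = getShowFromStyle();   // placeholder for however this is computed

    vec4 positionClip = czm_modelViewProjection * vec4(a_position, 1.0);

    // Hidden points are collapsed to the origin by zeroing the whole
    // clip-space position; this is the multiplication that appears to
    // degrade performance on Apple silicon.
    gl_Position = show * positionClip;
}
```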
Changes

- Add a new varying, `float v_pointCloudShow`, populated in the vertex shader
- When `v_pointCloudShow` is 0, `discard` the point in the fragment shader

It's not quite clear to me why the line `gl_Position = show * positionClip` still causes the issue even with the other changes applied. I'd have thought that since the point will be discarded immediately in the fragment shader, this line could be left as-is. But as mentioned in the code comment there, this position adjustment must be removed to fix the performance issue.

That said, I just started learning about shaders a few hours ago to look into this issue, so I don't really know what any of this means. Suggestions are very welcome from anyone who knows what they're doing.
For example, is this the best place to perform the discard? Maybe somewhere around here would be more appropriate? There may easily be tens of thousands of points hidden in common scenarios, so I tried to discard them as early as possible.
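To make the change concrete, here is a minimal sketch of the approach, assuming a vertex/fragment shader pair. `v_pointCloudShow`, `show`, and `positionClip` are the names used in this PR; the rest is placeholder scaffolding, as in the sketch above, not the actual CesiumJS shaders.

```glsl
// --- Vertex shader (sketch) ---
in vec3 a_position;            // placeholder per-point position attribute
out float v_pointCloudShow;    // new varying carrying the show flag

void main()
{
    float show = getShowFromStyle();   // placeholder, as above
    v_pointCloudShow = show;

    vec4 positionClip = czm_modelViewProjection * vec4(a_position, 1.0);

    // The position is intentionally no longer multiplied by `show`;
    // leaving that multiplication in place was found to keep the
    // slowdown even with the fragment-shader discard below.
    gl_Position = positionClip;
}

// --- Fragment shader (sketch) ---
in float v_pointCloudShow;

void main()
{
    // Discard hidden points as early as possible so no further
    // per-fragment work is done for them.
    if (v_pointCloudShow == 0.0)
    {
        discard;
    }

    // ... normal point shading continues here ...
}
```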
Issue number and link

Fixes #11270
Testing plan
The fix should be tested both on hardware that is affected by the original issue and on hardware that is not, to make sure it doesn't have any side effects on unaffected hardware.
I've built a bundle containing these changes so people can compare performance, making sure that the "fix" both fixes the issue and does not cause new ones. Open the links below, with only one open at a time, and try to drag the point cloud around in little circles to gauge the performance.
Here is an example of what it looks like if you are experiencing the issue (yes, it's a recording of a screen, but I can't capture it via screen recording since the computer itself is lagging):
cesium-lag-480.mov
Author checklist
- I have added my name to CONTRIBUTORS.md
- I have updated CHANGES.md with a short summary of my change