Suggestion: Compatibility with various render pipelines and use cases #14

weiping-playnext opened this issue Mar 26, 2024 · 15 comments
@weiping-playnext

This is just a suggestion.

The following line uses Graphics to draw.

Graphics.DrawMeshInstancedProcedural(_mesh, 0, Material, _bounds, _entityCount, _materialPropertyBlock);

This can result in an unexpected drawing order in more sophisticated setups, such as scenes with multiple cameras.
Would it be better to perform the draw through a command buffer and add it to the relevant pipeline?

One could still execute the buffer through Graphics when desired
https://docs.unity3d.com/ScriptReference/Graphics.ExecuteCommandBuffer.html
or add it to a specific camera
https://docs.unity3d.com/ScriptReference/Camera.AddCommandBuffer.html

This would also provide support for hybrid rendering.
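
For illustration, here is a minimal sketch of that idea (the field names, camera event and draw parameters are placeholders, not NSprites code): the same instanced draw is recorded into a CommandBuffer once, and the buffer is either attached to a specific camera or executed through Graphics.

using UnityEngine;
using UnityEngine.Rendering;

// Sketch: record the instanced draw into a CommandBuffer instead of calling
// Graphics.DrawMeshInstancedProcedural directly, then let a chosen camera execute it.
public class CommandBufferDrawExample : MonoBehaviour
{
    [SerializeField] Camera _targetCamera;
    [SerializeField] Mesh _mesh;
    [SerializeField] Material _material;

    CommandBuffer _cmd;
    MaterialPropertyBlock _props;
    const int InstanceCount = 128; // placeholder instance count

    void OnEnable()
    {
        _props = new MaterialPropertyBlock();
        _cmd = new CommandBuffer { name = "Sprite draw (sketch)" };
        // note: the CommandBuffer version takes no bounds parameter
        _cmd.DrawMeshInstancedProcedural(_mesh, 0, _material, 0, InstanceCount, _props);

        // Option A: the camera executes the buffer at the chosen event.
        _targetCamera.AddCommandBuffer(CameraEvent.BeforeForwardAlpha, _cmd);

        // Option B: execute it manually when desired.
        // Graphics.ExecuteCommandBuffer(_cmd);
    }

    void OnDisable()
    {
        _targetCamera.RemoveCommandBuffer(CameraEvent.BeforeForwardAlpha, _cmd);
        _cmd.Release();
    }
}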

@Antoshidza
Owner

That is a great suggestion if such an improvement can tie NSprites more closely to Unity's standard rendering pipelines. But I'm completely lost in all this rendering spaghetti in Unity. Can you enlighten me? For example, where NSprites.SpriteRenderingSystem currently calls Graphics.DrawMeshInstancedProcedural(_mesh, 0, Material, _bounds, _entityCount, _materialPropertyBlock);, we would instead record the same command into a command buffer at the same place in code (like it works with EntityCommandBuffer), but then where should that buffer be executed? Or does that happen automatically in Unity SRP?

@weiping-playnext
Author

weiping-playnext commented Mar 27, 2024

In Unity SRP, command buffers are hooked onto their respective target Camera.
The draw occurs only when Camera.Render is called. <- This is automatic.
So we would need an actual MonoBehaviour (e.g. RenderingHook : MonoBehaviour) to
query the current active/target camera and bind the sprite rendering to it.

using UnityEngine;
using UnityEngine.Rendering;

public class RenderingHook : MonoBehaviour
{
    [SerializeField]
    Camera _targetCam;

    RenderArchetype[] _renderArchetypes; // <- somehow we need to register the archetypes that the target camera will render

    void OnEnable()
    {
        foreach (var archetype in _renderArchetypes)
            _targetCam.AddCommandBuffer(CameraEvent.BeforeForwardAlpha, archetype.AcquireCommandBuffer()); // <- acquiring of the CommandBuffer
    }
}

The above code is a simple sample of how it might work, but there is still work to do around cleaning up the rendering CommandBuffer and around hybrid rendering with URP.

With URP (which I am more familiar with), one could use a RenderFeature to add an NSpriteRenderFeature/NSpriteRenderPass that controls when to draw.

@Antoshidza
Owner

So, if I understand you correctly, with any SRP (URP in your example) we can just use a command buffer and it will automatically render the sprites, plus we can edit the URP asset settings to tell the system when we want our sprites rendered. But for the legacy renderer we would need to adapt the command buffer usage with a custom render hook and also recycle the command buffer manually.

If I've described things right, can you please write a simple example where a command buffer is used instead of direct access to Graphics?

@weiping-playnext
Author

weiping-playnext commented Mar 28, 2024

Yes.
The following is a simple example of how it might go for SRP.

using UnityEngine;
using UnityEngine.Rendering;

class ArcheTypeBridge
{
    public void ProcessDraw(CommandBuffer cmd)
    {
        // cmd.DrawMeshInstancedProcedural(...); <- basically where the RenderArchetype draw
        // recording goes; recording into the buffer has no problem being done from a job
    }
}

class A : MonoBehaviour
{
    [SerializeField] Camera _targetCamera;
    CommandBuffer _cmdBuffer = new CommandBuffer();
    ArcheTypeBridge _bridge; // <- I have yet to figure out a way to bridge the RenderArchetype

    void OnEnable()
    {
        _targetCamera.AddCommandBuffer(CameraEvent.AfterForwardOpaque, _cmdBuffer);
    }

    void Update()
    {
        _cmdBuffer.Clear(); // <- this step is rather important as we reuse the same buffer
        _bridge.ProcessDraw(_cmdBuffer);
    }
}

The above example is just a simple illustration of how it might go, but the dependencies are still a mess; I have yet to figure out a proper dependency design for the system.

  • The owner of the CommandBuffer could instead be the ArcheTypeBridge, and the hooking class would just connect the CommandBuffer to the Camera (see the sketch below).
    -- In retrospect, this might be better, as it allows the archetype to clear the buffer for each job execute.
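
A rough sketch of that ownership inversion (all type and member names here are hypothetical, not actual NSprites API): the bridge owns and clears its own buffer, and the hook only attaches it to a camera.

using UnityEngine;
using UnityEngine.Rendering;

// Sketch: the bridge owns its CommandBuffer; the MonoBehaviour hook only
// attaches/detaches it and never touches its contents.
public class ArcheTypeBridge
{
    public CommandBuffer Buffer { get; } = new CommandBuffer { name = "NSprites archetype (sketch)" };

    public void ProcessDraw()
    {
        Buffer.Clear(); // the owner clears the buffer before re-recording each frame
        // record draw commands here, e.g. Buffer.DrawMeshInstancedProcedural(...)
    }
}

public class RenderingHook : MonoBehaviour
{
    [SerializeField] Camera _targetCamera;
    readonly ArcheTypeBridge _bridge = new ArcheTypeBridge();

    void OnEnable() => _targetCamera.AddCommandBuffer(CameraEvent.BeforeForwardAlpha, _bridge.Buffer);
    void OnDisable() => _targetCamera.RemoveCommandBuffer(CameraEvent.BeforeForwardAlpha, _bridge.Buffer);
    void Update() => _bridge.ProcessDraw();
}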

@Antoshidza
Owner

  • Do we need to connect the CommandBuffer to a Camera in every rendering pipeline (legacy and SRPs)?

The idea of needing a MonoBehaviour to bridge things looks inconvenient to me, because the package lives inside Unity ECS and I think it is better for devs to think in Entities terms like baking, entities, and components, and to avoid using MonoBehaviours.
Can we design such a solution with minimal changes to how NSprites works now? The only thing client code must do outside the package today is register RenderArchetypes, and then the system just works.

@weiping-playnext
Author

weiping-playnext commented Mar 29, 2024

Do we need to connect the CommandBuffer to a Camera in every rendering pipeline (legacy and SRPs)?

For SRPs, we do not need a MonoBehaviour/Camera to execute the CommandBuffer. SRP render execution provides a ScriptableRenderContext, which we use to execute the CommandBuffer: https://docs.unity3d.com/ScriptReference/Rendering.ScriptableRenderContext.ExecuteCommandBuffer.html

SRP render hierarchy (a minimal sketch follows the list):

  • Camera.Render
    -- RenderPipeline.Render(ScriptableRenderContext, cameras)
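
To make that hierarchy concrete, here is a minimal sketch of a bare custom pipeline (not URP/HDRP, with the RenderPipelineAsset boilerplate omitted) that executes a command buffer through the ScriptableRenderContext it receives:

using UnityEngine;
using UnityEngine.Rendering;

// Minimal sketch of where ScriptableRenderContext.ExecuteCommandBuffer sits in a
// bare custom SRP. Camera.Render ends up invoking this Render method.
public class MinimalSpritePipeline : RenderPipeline
{
    protected override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        foreach (var camera in cameras)
        {
            context.SetupCameraProperties(camera);

            var cmd = new CommandBuffer { name = "Minimal pipeline (sketch)" };
            cmd.ClearRenderTarget(true, true, Color.clear);
            // NSprites draw commands would be recorded here, or an
            // NSprites-provided CommandBuffer executed instead.
            context.ExecuteCommandBuffer(cmd);
            cmd.Release();

            context.Submit();
        }
    }
}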

For most SRPs, we could simply provide a CommandBuffer getter for whatever way the relevant render pipeline chooses to integrate NSprites.
The most minimal approach is, instead of drawing directly, to prepare a CommandBuffer in a singleton/static place for SRPs to retrieve manually; I could help provide a URP/HDRP extension for NSprites as a separate package.
I intend to prepare an NSpriteRenderFeature/NSpriteRenderPass for URP/HDRP that would act as a bridge/integration for the mainstream SRP use cases.
https://docs.unity3d.com/Packages/[email protected]/api/UnityEngine.Rendering.Universal.ScriptableRendererFeature.html

Another alternative is to use the following hook on RenderPipelineManager:
https://docs.unity3d.com/ja/2023.2/ScriptReference/Rendering.RenderPipelineManager-beginCameraRendering.html
But this hook makes it hard to control when the draw occurs; or rather, the draw occurs before any of the actual rendering.
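
A minimal sketch of that hook (what the CommandBuffer contains is left open):

using UnityEngine;
using UnityEngine.Rendering;

// Sketch: execute a prepared CommandBuffer at the start of every camera's SRP render.
// As noted above, this runs before the camera's actual rendering, so controlling
// exactly where the sprites land in the frame is hard with this approach.
public static class BeginCameraRenderingHook
{
    static CommandBuffer _cmd; // assumed to be prepared elsewhere

    public static void Enable(CommandBuffer cmd)
    {
        _cmd = cmd;
        RenderPipelineManager.beginCameraRendering += OnBeginCameraRendering;
    }

    public static void Disable() => RenderPipelineManager.beginCameraRendering -= OnBeginCameraRendering;

    static void OnBeginCameraRendering(ScriptableRenderContext context, Camera camera)
    {
        if (_cmd != null)
            context.ExecuteCommandBuffer(_cmd);
    }
}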

As for the legacy pipeline, things are a bit trickier. Because we have no way of interacting with the current rendering context other than through a Camera, it is rather restrictive when operating in a multi-camera environment (e.g. a HUD over gameplay). If we restrict it to a single camera, we could always use Camera.main or Camera.current.

Note + apologies: in my previous comment I mistook SRP for legacy rendering, which led to the use of a MonoBehaviour in the example. Sorry for the confusion.

@Antoshidza
Owner

We can make SRPs the default way of working with NSprites, especially as Unity makes them more and more the default. That way we can avoid the mono-bridge. For the legacy renderer we can have an #if statement in the code and ask NSprites users to add a directive to Player Settings if they want to use NSprites with legacy rendering; for that rare legacy case we can keep all the dirty mono helper classes.
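
As a sketch, the split could look something like this (NSPRITES_LEGACY_RENDERING is a made-up define that users of the built-in renderer would add under Player Settings > Scripting Define Symbols):

// Sketch of the proposed directive-based split; the define name is hypothetical.
public static class NSpritesRenderingBackend
{
    public static bool UseLegacyBridge
    {
        get
        {
#if NSPRITES_LEGACY_RENDERING
            return true;  // legacy path: MonoBehaviour bridge + Camera.AddCommandBuffer
#else
            return false; // default path: an SRP pass executes the buffer via ScriptableRenderContext
#endif
        }
    }
}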

@weiping-playnext
Author

weiping-playnext commented Mar 29, 2024

Yes.
So, if there is some method for external packages to obtain a reference to the RenderArchetype, or just to the CommandBuffer generated within the RenderArchetype, it would make creating SRP extensions easier.

Package dependencies:

MyExtension
⎣Core RP
⎣NSprites

As a simple check for whether a project is using an SRP (from projects that I have worked on), we could simply do:
if (QualitySettings.renderPipeline == null) // <- implies using legacy
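
As a sketch, the check could look like this (GraphicsSettings.currentRenderPipeline also accounts for a project-wide pipeline asset, not only the per-quality-level override):

using UnityEngine.Rendering;

// Sketch: detect whether an SRP is active. QualitySettings.renderPipeline only
// reflects a quality-level override, so GraphicsSettings.currentRenderPipeline
// is the safer single check.
public static class RenderPipelineDetection
{
    public static bool IsUsingSRP => GraphicsSettings.currentRenderPipeline != null;
    public static bool IsUsingLegacy => !IsUsingSRP;
}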

@Antoshidza
Owner

There is a managed IComponentData RenderArchetypeStorage which can be extended to provide public access to the CommandBuffer which would be generated inside each RenderArchetype.

graph TD;
    SRP_Extension-->RenderArchetypeStorage;
    Legacy-Mono-Bridge-->RenderArchetypeStorage;

Can you please extend this diagram? Because I'm completely lost in terms of what is an SRP feature and what is an SRP extension and how things should be provided.

@weiping-playnext
Author

weiping-playnext commented Mar 30, 2024

So the SRP extensions would extend the Core RP modules so that SRPs can implement NSprites rendering.
Most custom (non-URP/HDRP) render pipelines would just be interested in having an NSpritesRenderPass, which can simply be invoked from the pipeline code.

graph TD;
    NSpritesRenderFeature-->NSpritesRenderPass;
    NSpritesRenderPass-->RenderArchetypeStorage;
    Legacy-Mono-Bridge-->RenderArchetypeStorage;

For most SRPs, a simple NSpritesRenderPass would suffice.

using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
// plus the NSprites namespace for RenderArchetypeStorage

public class NSpritesRenderPass : ScriptableRenderPass
{
    RenderArchetypeStorage _storage;

    public NSpritesRenderPass(RenderArchetypeStorage storage)
    {
        _storage = storage;
    }

    public void SetRenderArchetypeStorage(RenderArchetypeStorage storage)
    {
        _storage = storage;
    }

    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        foreach (var cmdBuffer in _storage.GetCommandBuffers())
            context.ExecuteCommandBuffer(cmdBuffer);
    }
}

For standard URP/HDRP, we can go on to prepare an NSpritesRenderFeature that wraps the NSpritesRenderPass for the renderers.

using System;
using UnityEngine;
using UnityEngine.Rendering.Universal;

[Serializable]
public class NSpritesRenderFeature : ScriptableRendererFeature
{
    [SerializeField] RenderPassEvent _renderPassEvent = RenderPassEvent.AfterRenderingOpaques;
    protected NSpritesRenderPass _renderPass;

    public override void Create()
    {
        // This is completely independent of any MonoBehaviours; I would need a way of obtaining the RenderArchetypeStorage
        _renderPass = new NSpritesRenderPass(null);
    }

    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        _renderPass.renderPassEvent = _renderPassEvent;
        renderer.EnqueuePass(_renderPass);
    }

    public override void SetupRenderPasses(ScriptableRenderer renderer, in RenderingData renderingData)
    {
        _renderPass.SetRenderArchetypeStorage(null); // for updating scene archetypeStorage changes
    }
}

@weiping-playnext
Author

I have just got an idea for the Legacy-Bridge.
We could use the old ImageEffects script design for the Legacy-Bridge, since it attaches to a Camera.

using UnityEngine;
using UnityEngine.Rendering;

[RequireComponent(typeof(Camera))]
public class NSpritesCamera : MonoBehaviour
{
    void OnEnable()
    {
        // Add the CommandBuffer to the Camera, e.g. GetComponent<Camera>().AddCommandBuffer(...)
    }

    void OnDisable()
    {
        // Remove the CommandBuffer from the Camera, e.g. GetComponent<Camera>().RemoveCommandBuffer(...)
    }
}

@Antoshidza
Owner

  • Can you advise some reading for a better understanding of what a feature is and what a render pass is? Unity's docs talk about them like they are common knowledge (they no doubt are, but these terms are barely familiar to me). I understand only a few things:

    • CommandBuffer is a common command buffer
    • ScriptableRenderPass is the thing where we render something and, in particular, execute a command buffer (or buffers?)
    • ScriptableRendererFeature is...? Some kind of class to script already-written SRPs like URP/HDRP. As I understand it, URP/HDRP don't let us use passes directly; instead we write a feature where we can wrap a ScriptableRenderPass to process the rendering we want. That is why we want a ScriptableRenderPass for custom SRPs and a ScriptableRendererFeature for URP/HDRP (which just uses the former inside). Now that I've written it out, I realize I've made the same diagram as you did.
  • Would it be better to have a CommandBuffer per RenderArchetype, because then you can process effects or other stuff for a particular archetype, or are these things unrelated?

  • Would it be acceptable to you to have a branch in this repo to implement the pass/feature/legacy-bridge in cooperation with me, to keep all related things in the main repo? That would be handy for users.

I'm sorry for being this slow to understand things. It's because I need to combine my English interpretation with the SRP stuff that's new to me, plus how the architecture should change to keep the package handy for users.

@weiping-playnext
Author

weiping-playnext commented Apr 1, 2024

Can you advise some reading for a better understanding of what a feature is and what a render pass is? Unity's docs talk about them like they are common knowledge (they no doubt are, but these terms are barely familiar to me). I understand only a few things

Unfortunately, it is difficult to find articles or documentation about SRP. The people working with it (me included) have gotten used to it through practice and digging into the source code.
However, I can point you to examples of custom render pipelines that I usually refer to when making my own (for company projects).

  • Keijiro's Retro3DPipeline (https://github.com/keijiro/Retro3DPipeline)
    -- A minimalistic render pipeline which illustrates the workings of SRP in a bare state, with no ScriptableRenderPass and all the draw code in the RenderPipeline code itself.
  • Keijiro's BibcamUrp (https://github.com/keijiro/BibcamUrp)
    -- A simple showcase of how ScriptableRenderPass and ScriptableRendererFeature work
  • Unity's own URP code
    -- The whole convoluted render pipeline, bloated but full of examples.

Would it be better to have a CommandBuffer per RenderArchetype, because then you can process effects or other stuff for a particular archetype, or are these things unrelated?

I was thinking about that too. We would probably need some form of management for the command buffers.

On a side note, I have gotten permission from my CTO to work part-time on this, as I really do not want to reinvent the wheel and build my own data structures for sprite control.

@Antoshidza
Owner

Discussion here

@liyou54

liyou54 commented Jul 29, 2024

I believe it can be divided into two steps: NSprites is responsible for data generation using the job system, and then the data generated by NSprites is injected into the SRP, which drives the rendering.

using System.Collections.Generic;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
// plus the NSprites namespace for RenderArchetype

namespace NSprites_main.Rendering.NSpriteRenderPass
{
    internal class NSpriteRendererPass : ScriptableRenderPass
    {
        internal List<RenderArchetype> RenderArchetypes;

        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            // you can do something here, like setting up render targets
            var cmd = CommandBufferPool.Get("NSpriteRendererPass");
            foreach (var renderArchetype in RenderArchetypes)
            {
                // you can do something here
                renderArchetype.Draw(cmd);
            }

            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }
    }
}
