
Testing the FPS in code #8417

Closed

arothenberg opened this issue Feb 22, 2021 · 9 comments

arothenberg commented Feb 22, 2021


Required Info
Camera Model D435
Firmware Version 05.12.02.100
Operating System & Version Win (10)
Platform PC
SDK Version 2.32
Language C++
Segment desktop app

Issue Description

I'm trying to measure how many frames per second I am able to process from the camera.
I used rs-align-advanced.cpp as the template for my own project, but I don't use the color stream and I do other things...
Anyway, I tested the speed of rs-align-advanced.cpp by adding the following code to the main procedure:

    // Count how many aligned frames get processed in roughly 10 seconds,
    // then print the count and exit. (Needs <chrono> and <iostream>.)
    int counter = 0;
    auto start_time = std::chrono::steady_clock::now();
    auto current_time = std::chrono::steady_clock::now();
    while (app) // Application still alive?
    {
        current_time = std::chrono::steady_clock::now();
        int tot_time = std::chrono::duration_cast<std::chrono::seconds>(current_time - start_time).count();
        if (tot_time > 10)
        {
            std::cout << counter;   // print the frame count and stop
            return 1;
        }
        ...

        if (!aligned_depth_frame || !other_frame)
        {
            continue;               // skip iterations that didn't yield both frames
        }
        counter += 1;               // one more frame fully processed
        remove_background(other_frame, aligned_depth_frame, depth_scale, depth_clipping_distance);

I got around 80 frames in 10 seconds. That was with the remove_background function making a pass over the depth/video tables.

  1. Am I going about this correctly? If not, disregard 2 and 3 and the rest of the post, and point me to the right way if you could.
  2. Is 80 frames in 10 seconds the number of frames that I should expect?
  3. I still haven't checked the content of the frames. Can I assume they are all good?

This is the last phase of my project before I use real data (which costs), and I want to make sure I get every depth frame possible
for analysis. The motions captured can be quick; right now, on some gestures, I get 2 frames to work with. I assume that's something I'm doing and not the camera. I worked with the Kinect, using C#, writing my own bitmap readers, and I think I got better FPS, and that was at 30 FPS. I'm using the default settings in rs-align-advanced, which may be 30 FPS.
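
For reference, a minimal sketch of explicitly requesting a depth profile instead of relying on the pipeline defaults; the 848x480 @ 90 FPS combination is an assumption and should be checked against the profiles the D435 actually exposes:

    #include <librealsense2/rs.hpp>

    // Explicitly request a depth profile rather than using pipe.start() defaults.
    // 848x480 @ 90 FPS is assumed here; the Viewer (or a profile query) shows
    // which combinations the camera actually supports.
    rs2::pipeline start_depth_pipeline()
    {
        rs2::config cfg;
        cfg.enable_stream(RS2_STREAM_DEPTH, 848, 480, RS2_FORMAT_Z16, 90);

        rs2::pipeline pipe;
        pipe.start(cfg);
        return pipe;
    }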

Thanks in advance.


@arothenberg (Author)

I want to add that I've been very happy with the camera so far.

@arothenberg (Author)

Do you have a Discord?


MartyG-RealSense commented Feb 23, 2021

Hi @arothenberg. There is not a RealSense Discord support account. Public support for RealSense is provided by these GitHub forums and by Intel Support's Intel RealSense Help Center forum.

https://support.intelrealsense.com/hc/en-us/community/topics

Advice about FPS calculation and frame drops to compare to your own method is provided in the link below.

#7488 (comment)

If a "hiccup" occurs in the depth stream then the SDK will try to go back to the last good frame and continue onward from there, which is why you may see the same timestamp repeated twice in a row sometimes. So whilst you can assume that the SDK has done its best to collect "good" frames, examining the list of timestamps may provide indications of whether there are problems that might be occurring during streaming.

You can expect a greater burden on processing during align operations. This is why it can be useful to offload processing from the CPU onto the GPU to accelerate it. With computing devices that have an Nvidia GPU, this can be done with CUDA support. For non-Nvidia GPUs, GLSL processing blocks provide offloading and are 'vendor neutral', meaning that they should work with any GPU brand (though the improvement may not be noticeable on low-end computing devices).
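
A very rough sketch of routing align through the GLSL processing blocks is below; it assumes the SDK was built with the GL extension (librealsense2-gl) and that an OpenGL/GLFW context already exists, and the exact initialization call may differ between SDK versions:

    #include <librealsense2/rs.hpp>
    #include <librealsense2-gl/rs_processing_gl.hpp>     // GL extension headers
    #include <GLFW/glfw3.h>

    // Sketch only: the GLSL blocks mirror the CPU ones (rs2::gl::align,
    // rs2::gl::colorizer, ...) and run on the GPU once GL processing is enabled.
    void run_glsl_align(GLFWwindow* win)
    {
        rs2::gl::init_processing(win, true);             // enable GLSL processing
        rs2::gl::align align_to_color(RS2_STREAM_COLOR); // GPU-side align block

        rs2::pipeline pipe;
        pipe.start();
        while (true)
        {
            rs2::frameset frames = pipe.wait_for_frames();
            rs2::frameset aligned = align_to_color.process(frames);
            rs2::depth_frame depth = aligned.get_depth_frame();
            // ... process depth here ...
        }
    }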

The link below provides a good discussion of the pros and cons of using GLSL.

#3654


arothenberg commented Feb 23, 2021

Thanks Marty. I know the align operation takes time; that's why I used it. It is probably comparable, time-wise, to the operation that I do.
So the other question was whether 8 FPS on rs-align-advanced is a reasonable number. It's certainly less than the 90 FPS that is, maybe, possible. I'll try GLSL processing blocks, but if it's only a nominal improvement over 8 FPS then it won't work for me.

If I just take the depth frame at 90 FPS (on the camera), don't align or remove any background, and render it using the SDK's C++ method, I get 45 FPS.

I'll try the GLSL blocks and see if that helps.

@MartyG-RealSense (Collaborator)

The discussion in the link below about aligning in a project with low-power computers may provide helpful insights.

#5296

@arothenberg (Author)

Y'know Marty, I tested my code (not rs-align-advanced, which I've been using as a benchmark) and I might be getting 65 FPS, which is much more than enough for me. I suppose the aligned tables take a lot of time.

I'll close this.

@MartyG-RealSense (Collaborator)

Thanks very much for the update @arothenberg - I'm pleased that your code produces results that are satisfactory for your needs.

@arothenberg (Author)

And thanks for that link to the GLSL blocks. That might be important. Thanks very much.

@arothenberg (Author)

Just in case someone browses this: I was not getting 65 FPS. I was getting something comparable to the numbers above, around 10 FPS,
and that was from making a pass over the depth table (similar to remove_background above). I have yet to try GLSL blocks,
but the only solution I found was using half or a third of the image (every other pixel, or every third, ...). For some reason, lowering the resolution didn't change the speed. I have to play around with this more in order to figure it out.
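
As an untried alternative to hand-skipping pixels, a minimal sketch of the SDK's decimation filter, which downsamples the depth frame on the SDK side before any per-pixel pass:

    #include <librealsense2/rs.hpp>

    // Downsample depth with the SDK's decimation filter before iterating over it.
    // A filter magnitude of 2 roughly quarters the number of depth pixels to visit.
    rs2::frame decimate_depth(const rs2::frame& depth)
    {
        static rs2::decimation_filter dec;
        dec.set_option(RS2_OPTION_FILTER_MAGNITUDE, 2);
        return dec.process(depth);                       // smaller depth frame
    }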
