The captured video did not reach the specified frame rate #9659

Closed
nevynwong opened this issue Aug 20, 2021 · 14 comments

@nevynwong

The video I captured was set to 60 FPS, but the actual rate kept fluctuating; most of the time it was only 20 or 30 frames per second, so the captured video did not reach the specified frame rate. My videos are long, usually 30 minutes.
[screenshot attached]
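A hedged sketch of how the delivered rate can be measured from application code, assuming a started `Pipeline` named `pipe` from the Intel.RealSense C# wrapper (illustrative only, not part of the recording code):

```csharp
using System;
using Intel.RealSense;

// Count frames over a fixed wall-clock window;
// delivered FPS = frames received / elapsed seconds.
var sw = System.Diagnostics.Stopwatch.StartNew();
int frameCount = 0;
while (sw.Elapsed.TotalSeconds < 10.0)
{
    using (var frames = pipe.WaitForFrames())
        frameCount++;
}
Console.WriteLine($"Delivered FPS: {frameCount / sw.Elapsed.TotalSeconds:F1}");
```

If this number already falls short of the configured 60 FPS, the drop happens at capture time rather than during playback.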

@MartyG-RealSense
Collaborator

Hi @nevynwong Are you using depth and RGB, or depth only, please?

@nevynwong
Author

I use both of them. Could this be because of the long recording duration?

@MartyG-RealSense
Collaborator

If both RGB and depth streams are used, there is a chance that the FPS may lag. If you have auto-exposure enabled, disabling the RGB option Auto-Exposure Priority may allow the FPS to remain consistently at the rate you have set. This can be tested with the RealSense Viewer's RGB controls.

[image attached]
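The same option can also be set programmatically. A hedged sketch using the C# wrapper (finding the RGB sensor by name is an assumption here; check the sensor names your device actually reports):

```csharp
using System.Linq;
using Intel.RealSense;

using (var pipe = new Pipeline())
using (var profile = pipe.Start())
{
    // Locate the RGB sensor and turn off Auto-Exposure Priority (0 = off),
    // so the colour stream holds the configured FPS instead of lowering it
    // to gather more light.
    var color = profile.Device.Sensors
        .First(s => s.Info[CameraInfo.Name].Contains("RGB"));
    color.Options[Option.AutoExposurePriority].Value = 0f;
}
```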

@nevynwong
Author

I have set that too. Could it also be related to the computer's configuration?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Aug 20, 2021

If you are using Windows, you can bring up the Windows Task Manager (Ctrl-Alt-Delete) and left-click its 'Performance' option to view CPU and memory usage in real time while your program is running. This will show whether CPU usage is very high, or whether the amount of available memory is shrinking over time (known as a 'memory leak'). High CPU usage or a memory leak (or both) could negatively impact the program's performance.

[image attached]

If you are playing back a bag file with a script then you could also drag-and-drop the bag file into the center panel of the RealSense Viewer to play it back and compare its FPS performance to the method that you have been using to read the bag.

@nevynwong
Author

I need to record the data for processing, and it must be 60 fps. It cannot be played back in the Viewer; it produces flicker.

@nevynwong
Author

If only the depth camera was used to collect the data, would this happen?

@MartyG-RealSense
Collaborator

It may be less likely to, but depth-only streams sometimes experience this kind of FPS lag too.

What method are you using to record the data, please (e.g. a C# script)?

@nevynwong
Author

nevynwong commented Aug 20, 2021

Here is my C# script.
using Intel.RealSense;
using National_Tester.Common;
using System;
using System.Collections.Generic;
using System.Configuration;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace National_Tester
{
public class IntelRealSenseCamera
{
    private string fileName { get; set; }

    public IntelRealSenseCamera(string _fileName)
    {
        this.fileName = _fileName;
    }
    public bool doRecording = true;
    public void StartIntelRealSenseCameraRecord()
    {
        Task task = new Task(() =>
        {
            string path = System.Configuration.ConfigurationManager.AppSettings["IntelRealSenseSavePath"].ToString();
            try
            {
                if (!Directory.Exists(path))
                {
                    Directory.CreateDirectory(path);
                }
            }
            catch
            {
                //Ignored: fall back to the bare file name below if the directory cannot be created.
            }
            if (Directory.Exists(path))
            {
                CameraRecord(path + "\\" + fileName);
            }
            else
            {
                CameraRecord(fileName);
            }

        });
        task.Start();
    }
    void CameraRecord(string fileName)
    {
        var cfg = new Config();
        int DepthWidth = Convert.ToInt32(ConfigurationManager.AppSettings["DepthWidth"].ToString());
        int DepthHeight = Convert.ToInt32(ConfigurationManager.AppSettings["DepthHeight"].ToString());
        int DepthRate = Convert.ToInt32(ConfigurationManager.AppSettings["DepthRate"].ToString());

        float DepthAdcancesDepthUnits = float.Parse(ConfigurationManager.AppSettings["DepthAdcancesDepthUnits"].ToString());
        float DepthAdcancesDepthDisparityShift = float.Parse(ConfigurationManager.AppSettings["DepthAdcancesDepthDisparityShift"].ToString());
        float DepthAdcancesDisparityModulationAFactor = float.Parse(ConfigurationManager.AppSettings["DepthAdcancesDisparityModulationAFactor"].ToString());
        float DepthControlsDepthUnits = float.Parse(ConfigurationManager.AppSettings["DepthControlsDepthUnits"].ToString());

        //Enabled by default
        float DepthControlsPostProcessing = float.Parse(ConfigurationManager.AppSettings["DepthControlsPostProcessing"].ToString());

        string[] DepthControlsSpatialFilter = ConfigurationManager.AppSettings["DepthControlsSpatialFilter"].ToString().Split(',');
        float DepthControlsSpatialFilterFilterMagnitude = float.Parse(DepthControlsSpatialFilter[0]);
        float DepthControlsSpatialFilterFilterSmoothAlpha = float.Parse(DepthControlsSpatialFilter[1]);
        float DepthControlsSpatialFilterFilterSmoothDelta = float.Parse(DepthControlsSpatialFilter[2]);

        string[] DepthControlsTemporalFilter = ConfigurationManager.AppSettings["DepthControlsTemporalFilter"].ToString().Split(',');
        float DepthControlsTemporalFilterFilterSmoothAlpha = float.Parse(DepthControlsTemporalFilter[0]);
        float DepthControlsTemporalFilterFilterSmoothDelta = float.Parse(DepthControlsTemporalFilter[1]);


        int ColorWidth = Convert.ToInt32(ConfigurationManager.AppSettings["ColorWidth"].ToString());
        int ColorHeight = Convert.ToInt32(ConfigurationManager.AppSettings["ColorHeight"].ToString());
        int ColorRate = Convert.ToInt32(ConfigurationManager.AppSettings["ColorRate"].ToString());
        float ExposurePriority = float.Parse(ConfigurationManager.AppSettings["ExposurePriority"].ToString());

        cfg.EnableStream(Intel.RealSense.Stream.Depth, DepthWidth, DepthHeight, Format.Any, DepthRate);
        cfg.EnableStream(Intel.RealSense.Stream.Color, ColorWidth, ColorHeight, Format.Any, ColorRate);

        try
        {

            using (var pipe = new Pipeline())
            using (var pp = pipe.Start(cfg))
            //using (var pp = pipe.Start())
            {

                using (var frames = pipe.WaitForFrames())
                {
                    var depthFrame = frames.DepthFrame.DisposeWith(frames);
                    //Depth units: 100 //28 not supported
                    //UnitsTransform utf = new UnitsTransform();
                    //utf.Options.OptionValueDescription(Option.DepthUnits, 100);
                    //depthFrame.ApplyFilter(utf);
                    //In the Depth table:
                    //disparity shift: 70
                    //In Disparity modulation: A Factor: 0.08

                    //Depth units: 0.0001
                    depthFrame.Sensor.Options[Option.DepthUnits].Value = DepthControlsDepthUnits;
                    //Post-processing enabled //65 not supported //enabled by default

                    //Temporal filter set to: 0.4, 20
                    TemporalFilter tf = new TemporalFilter();
                    tf.Options[Option.FilterSmoothAlpha].Value = DepthControlsTemporalFilterFilterSmoothAlpha;
                    tf.Options[Option.FilterSmoothDelta].Value = DepthControlsTemporalFilterFilterSmoothDelta;
                    depthFrame.ApplyFilter(tf);

                    //Spatial filter set to: 5, 0.8, 50
                    SpatialFilter sf = new SpatialFilter();
                    sf.Options[Option.FilterMagnitude].Value = DepthControlsSpatialFilterFilterMagnitude;
                    sf.Options[Option.FilterSmoothAlpha].Value = DepthControlsSpatialFilterFilterSmoothAlpha;
                    sf.Options[Option.FilterSmoothDelta].Value = DepthControlsSpatialFilterFilterSmoothDelta;
                    depthFrame.ApplyFilter(sf);


                    var colorFrame = frames.ColorFrame.DisposeWith(frames);
                    //Controls: disable auto exposure priority (0 = off)
                    colorFrame.Sensor.Options[Option.AutoExposurePriority].Value = ExposurePriority;

                    using (var dev = pp.Device)
                    {
                        var adv = AdvancedDevice.FromDevice(dev);
                        adv.JsonConfiguration = System.IO.File.ReadAllText(AppUtils.basedir + @"IntelRealsenseJson\realsenseConfig.json");
                        using (var recorder = new RecordDevice(dev, fileName + ".bag"))
                        {
                            while (doRecording)
                            {
                                //The loop keeps the recording running; when doRecording is
                                //cleared the loop exits and recording stops.
                                System.Threading.Thread.Sleep(100); //avoid busy-waiting on a full CPU core
                            }

                        }
                    }
                }
            }
        }
        catch (Exception ex)
        {
            //Swallowing exceptions here hides pipeline failures; at minimum, log them.
            Console.WriteLine(ex);
        }

    }
}

}

@MartyG-RealSense
Collaborator

I must emphasize that my knowledge of C# coding in the SDK is limited, so thanks in advance for your patience.

As a starting point, you could try reversing the order of the Spatial and Temporal filters, as Intel's post-processing documentation recommends applying Spatial and then Temporal (whilst your script applies Temporal and then Spatial).

https://dev.intelrealsense.com/docs/post-processing-filters#section-using-filters-in-application-code
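A hedged sketch of that recommended order with the C# wrapper, Spatial first and then Temporal; note that the filters would also need to be applied to every frame inside the capture loop, not just once before recording starts:

```csharp
using Intel.RealSense;

// Create the filters once, outside the loop, so the Temporal filter
// can accumulate history across frames.
var sf = new SpatialFilter();
var tf = new TemporalFilter();

while (doRecording)
{
    using (var frames = pipe.WaitForFrames())
    {
        var depth = frames.DepthFrame.DisposeWith(frames);
        //Spatial first, then Temporal, per the post-processing docs.
        var filtered = depth.ApplyFilter(sf);
        filtered = filtered.ApplyFilter(tf);
        //... consume `filtered` ...
    }
}
```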

@nevynwong
Author

[video attachment: 2021-08-08.11.24.35.mp4]

I used the Viewer to play back the bag, and it showed two frame rates. In fact, the frame rate we read using Matlab or Python was about the same as the Viewer's frame rate.
[screenshot attached]

@MartyG-RealSense
Collaborator

Thanks very much for the image. The difference between these two FPS types is explained in the link below.

#7749 (comment)

@MartyG-RealSense
Collaborator

Hi @nevynwong Do you require further assistance with this case, please? Thanks!

@MartyG-RealSense
Collaborator

Case closed due to no further comments received.
