
How can I use multiple pipelines with one RealSense device on the Rock Pi RK3399? #10748

Closed
itgo067 opened this issue Aug 4, 2022 · 7 comments

Comments

@itgo067

itgo067 commented Aug 4, 2022

Required Info
Camera Model D435i
Firmware Version 05.12.13.50
Operating System & Version Ubuntu 20.04
Kernel Version (Linux Only) Linux 5.10.103+ aarch64
Platform Rock Pi rk3399
SDK Version 2.50
Language cpp
Segment Robot

Issue Description

I am using a RealSense D435i camera under Ubuntu on a Rock Pi 4B Plus, and I want to read the infrared data stream and the color data stream through two separate pipelines.

On the Rock Pi I get an error: only one pipeline starts and works correctly. On my Windows 10 PC, the same code works perfectly.

The code is as follows.

https://gist.github.com/itgo067/11be993ff30a98fe1f9fc9efe81486a5

When I call the run_stereo and run_color functions from separate async threads, I get the following errors:

color 2 error calling rs2_pipeline_start_with_config(pipe:0xaaaad11ed8e0, config:0xaaaad11ed8a0):
    No device connected

Additionally, when I run the run_color_stereo function, which uses a single pipeline, every stream works, but the frame rate drops. I set the infrared stream to 90 FPS and the color stream to 30 FPS, but I only get 60 FPS.

In my project I need the infrared stream at 90 FPS and the color stream at 30 FPS.

I can't find any information on this problem.

How can I configure the SDK to run two pipelines with one RealSense device on the Rock Pi platform? Or, how can I configure a single pipeline so that the frame rates stay at the values I set?

@MartyG-RealSense
Collaborator

Hi @itgo067 Let's start with maintaining the same FPS rate on one pipeline. If auto-exposure is enabled and a separate RGB option called auto-exposure priority is disabled then the SDK should try to force FPS to be maintained at a constant rate instead of being allowed to vary. An example of C++ code for disabling auto-exposure priority is at #7285 (comment)

@itgo067
Author

itgo067 commented Aug 4, 2022

@MartyG-RealSense Hello, I tested disabling the auto-exposure priority option in my code, but I still get 60 FPS. When I disable the color stream, the IR stream reaches 90 FPS.

#include <chrono>
#include <iostream>
#include <thread>
#include <librealsense2/rs.hpp>
#include <opencv2/opencv.hpp>


cv::Size color_size(1280, 720);
int color_frameRate = 30;

cv::Size stereo_size(640, 480);
int stereo_frameRate = 90;

rs2::context ctx;
std::string sn_realsense;  // camera serial number (assigned elsewhere in the full program)
std::shared_ptr<rs2::config> cfg_color_stereo;

void run_color_stereo()
{
  rs2::colorizer color_map;
  rs2::pipeline pipe_color_stereo(ctx);
  auto last_fps_time = std::chrono::system_clock::now();
  auto last_fps_time_color = last_fps_time;
  auto last_fps_time_stereo = last_fps_time;

  int fps_num_color = 0;
  int fps_num_stereo = 0;

  int show_fps_color = 0;
  int show_fps_stereo = 0;

  std::cout << "cfg  color stereo" << std::endl;

  try
  {
    cfg_color_stereo = std::make_shared<rs2::config>();
    // cfg_color_stereo->enable_stream(RS2_STREAM_COLOR, color_size.width,
    //                                 color_size.height, RS2_FORMAT_BGR8,
    //                                 color_frameRate);
    cfg_color_stereo->enable_stream(RS2_STREAM_INFRARED, 1, stereo_size.width,
                                    stereo_size.height, RS2_FORMAT_Y8,
                                    stereo_frameRate);
    cfg_color_stereo->enable_stream(RS2_STREAM_INFRARED, 2, stereo_size.width,
                                    stereo_size.height, RS2_FORMAT_Y8,
                                    stereo_frameRate);
    cfg_color_stereo->enable_device(sn_realsense);
    
    auto profile_color_stereo = pipe_color_stereo.start(*cfg_color_stereo);

    auto sensor1 = profile_color_stereo.get_device().first<rs2::depth_sensor>();
    sensor1.set_option(RS2_OPTION_ENABLE_AUTO_EXPOSURE, 1);

    rs2::color_sensor color_sensor = profile_color_stereo.get_device().query_sensors()[1];
    color_sensor.set_option(RS2_OPTION_ENABLE_AUTO_EXPOSURE, 1);
    color_sensor.set_option(RS2_OPTION_AUTO_EXPOSURE_PRIORITY, 0);

    bool stop_color = false;
    while (1)
    {
      rs2::frameset frames = pipe_color_stereo.wait_for_frames();
      // if (!stop_color) {
      //   rs2::frame color = frames.get_color_frame().apply_filter(color_map);
      //   const int w2 = color.as<rs2::video_frame>().get_width();
      //   const int h2 = color.as<rs2::video_frame>().get_height();
      //   cv::Mat color_image(cv::Size(w2, h2), CV_8UC3,
      //                       (void *)color.get_data(), cv::Mat::AUTO_STEP);
      //   if (!color_image.empty())
      //   {
      //     fps_num_color++;
      //     cv::putText(color_image, "FPS: " + std::to_string(show_fps_color),
      //                 cv::Point(20, 50), cv::FONT_HERSHEY_COMPLEX, 2.,
      //                 cv::Scalar(0, 255, 255));
      //     // cv::imshow("color_image", color_image);
      //     // cv::waitKey(1);
      //     auto current_fps_time = std::chrono::system_clock::now();
      //     auto duration =
      //         std::chrono::duration_cast<std::chrono::milliseconds>(
      //             current_fps_time - last_fps_time_color);
      //     double diff_time = double(duration.count()) *
      //                       std::chrono::milliseconds::period::num /
      //                       std::chrono::milliseconds::period::den;
      //     if (diff_time >= 1.0)
      //     {
      //       std::cout << "color_spend time = " << diff_time
      //                 << " s , fps = " << fps_num_color << std::endl;
      //       show_fps_color = fps_num_color;
      //       last_fps_time_color = current_fps_time;
      //       fps_num_color = 0;
      //     }

      //     color_image.release();
      //   }
      //   //stop_color = true;
      // }
     
      rs2::frame stereo_left = frames.get_infrared_frame(1);
      rs2::frame stereo_right = frames.get_infrared_frame(2);
      const int w0 = stereo_left.as<rs2::video_frame>().get_width();
      const int h0 = stereo_left.as<rs2::video_frame>().get_height();
      const int w1 = stereo_right.as<rs2::video_frame>().get_width();
      const int h1 = stereo_right.as<rs2::video_frame>().get_height();
      cv::Mat left_image(cv::Size(w0, h0), CV_8UC1,
                         (void *)stereo_left.get_data(), cv::Mat::AUTO_STEP);
      cv::Mat right_image(cv::Size(w1, h1), CV_8UC1,
                          (void *)stereo_right.get_data(), cv::Mat::AUTO_STEP);
      if (!left_image.empty() && !right_image.empty())
      {
        fps_num_stereo++;
        cv::Mat stereo_image;
        cv::hconcat(left_image, right_image, stereo_image);
        cv::cvtColor(stereo_image, stereo_image, cv::COLOR_GRAY2BGR);
        cv::putText(stereo_image, "FPS: " + std::to_string(show_fps_stereo),
                    cv::Point(20, 50), cv::FONT_HERSHEY_COMPLEX, 2.,
                    cv::Scalar(0, 255, 255));
        // cv::imshow("stereo_image", stereo_image);
        // cv::waitKey(1);
        auto current_fps_time = std::chrono::system_clock::now();
        auto duration = std::chrono::duration_cast<std::chrono::milliseconds>(
            current_fps_time - last_fps_time_stereo);
        double diff_time = double(duration.count()) *
                           std::chrono::milliseconds::period::num /
                           std::chrono::milliseconds::period::den;
        if (diff_time >= 1.0)
        {
          cv::imwrite("stereo.jpg", stereo_image);
          cv::waitKey(1);
          // std::cout << "spend time = " << diff_time << " s , fps = " <<
          // (int)(1. / diff_time) << std::endl;
          std::cout << "stereo spend time = " << diff_time
                    << " s , fps = " << fps_num_stereo << std::endl;
          show_fps_stereo = fps_num_stereo;
          last_fps_time_stereo = current_fps_time;
          fps_num_stereo = 0;
        }

        left_image.release();
        right_image.release();
      }
    }
  }
  catch (const rs2::error &e)
  {
    std::cerr << "color stereo error calling " << e.get_failed_function() << "("
              << e.get_failed_args() << "):\n    " << e.what() << std::endl;
    // return EXIT_FAILURE;
  }
  catch (const std::exception &e)
  {
    std::cerr << "color stereo " << e.what() << std::endl;
    // return EXIT_FAILURE;
  }

  //pipe_color_stereo.stop();
}

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Aug 4, 2022

Increasing FPS increases the volume of new frames that are arriving from the camera per second, whilst increasing resolution also puts a demand on the resources of the computer / computing device. The Windows PC may have a hardware specification that can handle that processing workload (90 FPS on infrared and 1280x720 resolution on color) whilst the Rock Pi may be having more difficulty with keeping up with the same processing load.

You could test whether this is the case by reducing the infrared FPS to 30 and seeing whether infrared still drops below the defined speed when color is enabled.

You could also experiment with enabling support for GLSL graphics acceleration in your C++ script to offload processing work from the CPU onto the GPU.

GLSL can be enabled for rs2:: processing blocks in a script by inserting gl:: into the namespace. For example, rs2::pointcloud becomes rs2::gl::pointcloud, rs2::align becomes rs2::gl::align, etc.

The RealSense SDK has a C++ example program for GLSL at the link below.

https://github.com/IntelRealSense/librealsense/tree/master/examples/gl

#3654 also has a very good pros and cons analysis of GLSL processing blocks and when it can be used.
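As a sketch of that substitution (assuming librealsense was built with the GLSL extensions, i.e. the librealsense2-gl headers are available; the exact init call may differ between SDK versions, so check the rs-gl example for your release):

```cpp
#include <librealsense2/rs.hpp>
#include <librealsense2-gl/rs_processing_gl.hpp> // GL processing blocks (requires BUILD_GLSL_EXTENSIONS=ON)

int main()
{
    // Move processing work from the CPU onto the GPU.
    rs2::gl::init_processing(true);

    // Drop-in GPU replacements for the CPU processing blocks:
    rs2::gl::colorizer color_map; // instead of rs2::colorizer
    rs2::gl::pointcloud pc;       // instead of rs2::pointcloud

    rs2::pipeline pipe;
    pipe.start();
    while (true)
    {
        rs2::frameset frames = pipe.wait_for_frames();
        // Colorization now runs as a GLSL shader rather than on the CPU.
        rs2::frame depth = color_map.process(frames.get_depth_frame());
        // ... render / consume the frame ...
    }
}
```

This cannot be compiled or run without a connected camera and the GL build of the SDK, so treat it as a structural sketch rather than a verified program.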

@itgo067
Author

itgo067 commented Aug 5, 2022

@MartyG-RealSense Thanks. But in my project I can't use the GPU or any other accelerator.

Now I want to use two pipelines to process the IR stream and the color stream. Can you give me some advice?

The two-pipeline approach works well on my PC, but on the Rock Pi only one pipeline will run.

@MartyG-RealSense
Collaborator

If the script works fine on PC but not on the Rock Pi then that suggests that the code is correct and the problem may be with the 5.10.103+ aarch64 kernel that is used (one that is not officially supported by the librealsense SDK).

If librealsense is built from source code with the RSUSB backend installation method then the SDK bypasses the kernel, so that it is not dependent on Linux versions or kernel versions and does not require kernel patching.
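For reference, a typical RSUSB source build looks like the following (the FORCE_RSUSB_BACKEND flag is from the SDK's Linux installation guide; the clone path and job count are illustrative):

```shell
# Build librealsense with the RSUSB (libusb) backend, which bypasses
# the kernel modules and so needs no kernel patching.
git clone https://github.com/IntelRealSense/librealsense.git
cd librealsense
mkdir build && cd build
cmake .. -DFORCE_RSUSB_BACKEND=ON -DCMAKE_BUILD_TYPE=Release
make -j4
sudo make install
```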


Before creating an RSUSB build though, you could first try adding a hardware_reset() instruction before the pipe start to reset the camera when the program runs, as described by the C++ script at #2161 (comment)
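The reset suggestion can be sketched as follows (the three-second wait is an assumption; the camera needs some time to re-enumerate on USB after a reset):

```cpp
#include <chrono>
#include <thread>
#include <librealsense2/rs.hpp>

int main()
{
    rs2::context ctx;

    // Reset every connected RealSense device before starting any pipeline.
    for (auto&& dev : ctx.query_devices())
        dev.hardware_reset();

    // Give the camera time to re-enumerate on USB before pipe.start().
    std::this_thread::sleep_for(std::chrono::seconds(3));

    rs2::pipeline pipe(ctx);
    pipe.start();
    // ...
}
```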

@MartyG-RealSense
Collaborator

Hi @itgo067 Do you require further assistance with this case, please? Thanks!

@MartyG-RealSense
Collaborator

Case closed due to no further comments received.
