
Behaviour of output trigger #9810

Closed
xc-racer99 opened this issue Sep 30, 2021 · 13 comments
@xc-racer99

Required Info
Camera Model D435
Firmware Version 5.12.14.50
Operating System & Version Win10
Platform PC
SDK Version 2.47.0
Language C++
Segment Robot

Issue Description

We are trying to sync the RealSense with an external encoder and obtain depth and colour frames at known locations. We previously tried genlock (see #9528) but have now switched to having the RealSense output a pulse on its sync pin (i.e. pin 5) with RS2_OPTION_INTER_CAM_SYNC_MODE set to 1 (i.e. master). We are using the pipeline API with colour, depth, and infrared 1 streams enabled, all at 30 fps. The colour sensor has auto-exposure priority disabled and both sensors have a fixed exposure - this is an attempt to keep the depth and colour streams synchronized, based on the comments in #8726 (comment).
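For reference, our setup looks roughly like the sketch below. The resolutions and the exposure value are placeholders rather than our exact settings, and the `supports()` guards are there because each option lives on a different sensor:

```cpp
#include <librealsense2/rs.hpp>

int main()
{
    // Request colour, depth, and infrared 1, all at 30 fps
    // (resolutions here are placeholders, not our exact settings).
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_DEPTH, 848, 480, RS2_FORMAT_Z16, 30);
    cfg.enable_stream(RS2_STREAM_INFRARED, 1, 848, 480, RS2_FORMAT_Y8, 30);
    cfg.enable_stream(RS2_STREAM_COLOR, 848, 480, RS2_FORMAT_BGR8, 30);

    rs2::pipeline pipe;
    rs2::pipeline_profile profile = pipe.start(cfg);

    for (rs2::sensor sensor : profile.get_device().query_sensors())
    {
        // Master mode: the camera emits a pulse on the sync pin (pin 5).
        if (sensor.supports(RS2_OPTION_INTER_CAM_SYNC_MODE))
            sensor.set_option(RS2_OPTION_INTER_CAM_SYNC_MODE, 1);

        // Disable auto-exposure priority (colour sensor only) and fix
        // exposure on both sensors; the exposure value is an example.
        if (sensor.supports(RS2_OPTION_AUTO_EXPOSURE_PRIORITY))
            sensor.set_option(RS2_OPTION_AUTO_EXPOSURE_PRIORITY, 0);
        if (sensor.supports(RS2_OPTION_ENABLE_AUTO_EXPOSURE))
            sensor.set_option(RS2_OPTION_ENABLE_AUTO_EXPOSURE, 0);
        if (sensor.supports(RS2_OPTION_EXPOSURE))
            sensor.set_option(RS2_OPTION_EXPOSURE, 8500);
    }
    return 0;
}
```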

Experimentally, we've found that the first infrared frame usually, but not always, lines up with the third pulse from the RealSense. When this works, it works perfectly; I'd estimate it happens 90% of the time. When it doesn't, everything is thrown off and the results are unusable for our purposes. So we've also experimented with setting RS2_OPTION_OUTPUT_TRIGGER_ENABLED to 1 instead. With this, we've been unable to find a "magic number" of frames with which to correlate encoder data with frames.

Breaking out the oscilloscope, here are the waveforms we've found.

RGB streaming, RS2_OPTION_INTER_CAM_SYNC_MODE = 1, then depth started

IMG_20210930_153434
(note the ~40ms gap between the first two pulses with the expected 33ms gap afterwards)

RGB streaming, RS2_OPTION_OUTPUT_TRIGGER_ENABLED = 1, then depth started

IMG_20210930_153539
(note the two pulses in quick succession at the start, then the usual 33ms gap)

RGB not streaming, RS2_OPTION_INTER_CAM_SYNC_MODE = 1, then depth started

IMG_20210930_153655
(note that this is the same as when RGB was streaming)

RGB not streaming, RS2_OPTION_OUTPUT_TRIGGER_ENABLED = 1, then depth started

IMG_20210930_153733
(note the even 33ms gap between all pulses)

(Side note - I've been unable to test this last situation in our application, as the RGB stream is heavily embedded in it and removing it to test just the infrared/depth streams will take a bit of work.)

So the questions boil down to:

  • Is there a method which can be used to correlate frame counters with output pulses exactly one-to-one?
  • What is the technical difference between RS2_OPTION_INTER_CAM_SYNC_MODE = 1 and RS2_OPTION_OUTPUT_TRIGGER_ENABLED = 1?
  • How does the RGB stream influence these two options?

MartyG-RealSense commented Oct 1, 2021

Hi @xc-racer99 RS2_OPTION_INTER_CAM_SYNC_MODE will set a camera to be a Master camera when set to 1 and the camera will therefore transmit a trigger pulse. A camera that is set to '2' will be a Slave that listens out for a trigger pulse on every frame and performs a depth-frame capture that is timestamp-synced with the Master camera if it recognizes the pulse.

When RS2_OPTION_OUTPUT_TRIGGER_ENABLED is set to '1', a trigger is output from the camera to any external device on every depth frame.
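As a minimal sketch (assuming a single connected camera, and looping over sensors since the option is exposed by the stereo module), enabling it looks like:

```cpp
#include <librealsense2/rs.hpp>

int main()
{
    rs2::context ctx;
    rs2::device dev = ctx.query_devices().front();

    // Enable the output trigger on whichever sensor exposes the option
    // (the stereo module on D400 Series cameras).
    for (rs2::sensor sensor : dev.query_sensors())
    {
        if (sensor.supports(RS2_OPTION_OUTPUT_TRIGGER_ENABLED))
            sensor.set_option(RS2_OPTION_OUTPUT_TRIGGER_ENABLED, 1);
    }
    return 0;
}
```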

When the 400 Series launched in late 2017 (some time before hardware sync and Inter Cam Sync Mode were introduced) there was a small amount of documentation about directly using the GPIO pins on the camera. The official version of that old documentation is now long-obsolete and unavailable, but I found it at another source and have recreated the page below in case it helps with your particular situation with RS2_OPTION_OUTPUT_TRIGGER_ENABLED.


Output Trigger
In addition to the USB3 connector, RealSense 400 Series cameras expose headers for interaction with external devices. The camera can provide a trigger every time a frame is captured (with the RealSense 400 Series as the master). To enable this mode, all you need to do is:

dev.set_option(RS2_OPTION_OUTPUT_TRIGGER, 1);

Timestamp from external sensors
If you wish to synchronize an external sensor (for example a compass) with RealSense, all you need to do is trigger one of the 4 GPIOs whenever you read data from the external sensor. The camera will generate a timestamp for each GPIO event, allowing you to synchronize the data streams in software:

rs2::util::config config;
config.enable_stream(RS2_STREAM_GPIO1, 0, 0, 0, RS2_FORMAT_GPIO_RAW);
auto stream = config.open(dev);

rs2::frame_queue events_queue;
stream.start(events_queue);

while (true)
{
    rs2::frame f = events_queue.wait_for_frame();
    std::cout << "External sensors connected to GPIO1 fired at " << f.get_timestamp() << "\n";
}

You can mix and match GPIO streams with other types of streams: you are free to use frame queues or callbacks, you can configure the stream directly or using the utils::config helper class, and you can use utils::syncer to group frames with the closest GPIO event.


The table below is taken from page 52 of the current edition of the data sheet document for the 400 Series cameras.

[image: sync pin table from the data sheet]

I would not expect a RealSense RGB stream to have an influence on either RS2_OPTION_INTER_CAM_SYNC_MODE or RS2_OPTION_OUTPUT_TRIGGER_ENABLED, except when using RS2_OPTION_INTER_CAM_SYNC_MODE with a slave camera that has been set to Inter Cam Sync Mode '3' (Full Slave).


An alternative method of syncing the depth and RGB of two cameras, which has been discussed in the past and does not involve a sync trigger, is to use frame metadata, as described in #2186.

@xc-racer99

Hi @xc-racer99 RS2_OPTION_INTER_CAM_SYNC_MODE will set a camera to be a Master camera when set to 1 and the camera will therefore transmit a trigger pulse. A camera that is set to '2' will be a Slave that listens out for a trigger pulse on every frame and performs a depth-frame capture that is timestamp-synced with the Master camera if it recognizes the pulse.

But when exactly does this trigger pulse start? Does it start simultaneously with the very first frame, or does it start before it? Is it synced with RGB or depth? Is it semi-random?

The slave option isn't really applicable to our use case.

When RS2_OPTION_OUTPUT_TRIGGER_ENABLED is set to '1', a trigger is output from the camera to any external device on every depth frame.

Output Trigger In addition to the USB3 connector, RealSense 400 Series cameras expose headers for interaction with external devices. The camera can provide a trigger every time a frame is captured (with the RealSense 400 Series as the master). To enable this mode, all you need to do is:

dev.set_option(RS2_OPTION_OUTPUT_TRIGGER, 1);

Yes, this appears to be exactly what we want, except the first frame doesn't appear to line up with the first pulse. I've found #8171 with similar issues.

Timestamp from external sensors
If you wish to synchronize an external sensor (for example a compass) with RealSense, all you need to do is trigger one of the 4 GPIOs whenever you read data from the external sensor. The camera will generate a timestamp for each GPIO event, allowing you to synchronize the data streams in software:

Yes, I've investigated a variant of this as well (note that the stream name is now RS2_STREAM_GPIO and there's only the one stream). There's also the metadata option RS2_FRAME_METADATA_GPIO_INPUT_DATA, which lets you tell per frame whether GPIO1 is high or low. Unfortunately, just toggling this when we actually want to keep a frame is too low resolution (i.e. too big a time gap between the encoder data and the frame), even at 60 fps.
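For reference, reading that metadata attribute per frame looks roughly like the sketch below (assuming the depth frames carry the attribute; the frame count is arbitrary). The comment spells out the resolution limitation:

```cpp
#include <librealsense2/rs.hpp>
#include <iostream>

int main()
{
    rs2::pipeline pipe;
    pipe.start(); // default configuration; a real app would request streams

    for (int i = 0; i < 60; ++i)
    {
        rs2::frameset frames = pipe.wait_for_frames();
        rs2::depth_frame depth = frames.get_depth_frame();
        if (depth && depth.supports_frame_metadata(RS2_FRAME_METADATA_GPIO_INPUT_DATA))
        {
            // A per-frame snapshot of the GPIO input level: even at 60 fps
            // this only localizes an edge to within one frame period, which
            // is the resolution problem described above.
            rs2_metadata_type gpio =
                depth.get_frame_metadata(RS2_FRAME_METADATA_GPIO_INPUT_DATA);
            std::cout << "frame " << depth.get_frame_number()
                      << " GPIO input: " << gpio << "\n";
        }
    }
    return 0;
}
```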

I would not expect a RealSense RGB stream to have an influence on either RS2_OPTION_INTER_CAM_SYNC_MODE or RS2_OPTION_OUTPUT_TRIGGER_ENABLED, except when using RS2_OPTION_INTER_CAM_SYNC_MODE with a slave camera that has been set to Inter Cam Sync Mode '3' (Full Slave).

Well, based on the oscilloscope, it definitely does have an effect :) The above images are not one-offs, they are fairly repeatable.

An alternative method of syncing depth and RGB of two cameras that has been discussed in the past and does not involve a sync trigger has been to use frame metadata, as described in #2186

Unfortunately, this is too low resolution for us - even 2ms of skew between frame and encoder data is not good enough.

Other issues that I've found and read through here on GitHub are:
#2637
#2610
#2474
#2512
#2281
#1212
#4574
#5839


MartyG-RealSense commented Oct 1, 2021

In regard to your question But when exactly does this trigger pulse start? - in #2474 a RealSense team member advises that the trigger is generated at the end of exposure in the original mode 1 and 2 hardware sync system. They add though that "when triggering externally, the trigger pulse aligns with the frame start".

@xc-racer99

In regard to your question But when exactly does this trigger pulse start? - in #2474 a RealSense team member advises that the trigger is generated at the end of exposure in the original mode 1 and 2 hardware sync system. They add though that "when triggering externally, the trigger pulse aligns with the frame start".

Correct - I understand where the pulse occurs relative to the exposure. However, at what point in the streaming process do these pulses start occurring? Experimentally, there appear to be a few pulses emitted with no corresponding frame, i.e.:

  • Pulse 1 - no frame
  • Pulse 2 - no frame
  • Pulse 3 - depth frame with frame number 0
  • Pulse 4 - depth frame with frame number 1

And unfortunately, it doesn't appear that there are always 2 extra pulses at the start.

@MartyG-RealSense

As you noted earlier in this case, an apparently similar phenomenon is discussed at #8171, where it is mentioned that the trigger behaves as expected in the RealSense Viewer but not when using their own code.

You can test this yourself in the Viewer by enabling an option called Output Trigger Enabled that is disabled by default, and setting Inter Cam Sync Mode to '1'. The 'Output Trigger Enabled' option, when enabled, should generate a trigger from the camera to an external device once per frame.

[image: RealSense Viewer stereo module controls showing Output Trigger Enabled and Inter Cam Sync Mode]

@xc-racer99

I'm trying to use the GPIO stream, but I can't seem to figure out how to read it. I always seem to get

RealSense error calling rs2_pipeline_start_with_config(pipe:00000128153B6660, config:00000128153B6840):
    Couldn't resolve requests

When using the following code

// License: Apache 2.0. See LICENSE file in root directory.
// Copyright(c) 2015-2017 Intel Corporation. All Rights Reserved.

#include <librealsense2/rs.hpp> // Include RealSense Cross Platform API

#include <fstream>              // File IO
#include <iostream>             // Terminal IO
#include <sstream>              // Stringstreams

// 3rd party header for writing png files
#define STB_IMAGE_WRITE_IMPLEMENTATION
#include "stb_image_write.h"

// Helper function for writing metadata to disk as a csv file
void metadata_to_csv(const rs2::frame& frm, const std::string& filename);

// This sample captures 30 frames and writes the last frame to disk.
// It can be useful for debugging an embedded system with no display.
int main(int argc, char * argv[]) try
{
    // Declare depth colorizer for pretty visualization of depth data
    rs2::colorizer color_map;

    // Declare RealSense pipeline, encapsulating the actual device and sensors
    rs2::pipeline pipe;
    rs2::config config;
    rs2::context ctx;

    config.enable_stream(RS2_STREAM_GPIO, 0, 0, 0, RS2_FORMAT_GPIO_RAW);
    //config.enable_stream(RS2_STREAM_COLOR, 848, 480, rs2_format::RS2_FORMAT_BGR8, 60);
    config.enable_device(ctx.query_devices().front().get_info(RS2_CAMERA_INFO_SERIAL_NUMBER));

    // Start streaming with default recommended configuration
    pipe.start(config);

    // Capture 30 frames to give autoexposure, etc. a chance to settle
    //for (auto i = 0; i < 30; ++i) pipe.wait_for_frames();

    // Wait for the next set of frames from the camera. Now that autoexposure, etc.
    // has settled, we will write these to disk
    for (auto&& frame : pipe.wait_for_frames())
    {
        // We can only save video frames as pngs, so we skip the rest
        if (auto vf = frame.as<rs2::video_frame>())
        {
            auto stream = frame.get_profile().stream_type();
            // Use the colorizer to get an rgb image for the depth stream
            if (vf.is<rs2::depth_frame>()) vf = color_map.process(frame);

            // Write images to disk
            std::stringstream png_file;
            png_file << "rs-save-to-disk-output-" << vf.get_profile().stream_name() << ".png";
            stbi_write_png(png_file.str().c_str(), vf.get_width(), vf.get_height(),
                           vf.get_bytes_per_pixel(), vf.get_data(), vf.get_stride_in_bytes());
            std::cout << "Saved " << png_file.str() << std::endl;

            // Record per-frame metadata for UVC streams
            std::stringstream csv_file;
            csv_file << "rs-save-to-disk-output-" << vf.get_profile().stream_name()
                     << "-metadata.csv";
            metadata_to_csv(vf, csv_file.str());
        }
        else if (auto fp = frame.as<rs2::pose_frame>()) {
            std::stringstream csv_file;
            csv_file << "rs-save-to-disk-output-" << "gpio"
                << "-metadata.csv";
            metadata_to_csv(frame, csv_file.str());

            system("pause");
        }
    }

    return EXIT_SUCCESS;
}
catch(const rs2::error & e)
{
    std::cerr << "RealSense error calling " << e.get_failed_function() << "(" << e.get_failed_args() << "):\n    " << e.what() << std::endl;
    return EXIT_FAILURE;
}
catch(const std::exception & e)
{
    std::cerr << e.what() << std::endl;
    return EXIT_FAILURE;
}

void metadata_to_csv(const rs2::frame& frm, const std::string& filename)
{
    std::ofstream csv;

    csv.open(filename);

    //    std::cout << "Writing metadata to " << filename << endl;
    csv << "Stream," << rs2_stream_to_string(frm.get_profile().stream_type()) << "\nMetadata Attribute,Value\n";

    // Record all the available metadata attributes
    for (size_t i = 0; i < RS2_FRAME_METADATA_COUNT; i++)
    {
        if (frm.supports_frame_metadata((rs2_frame_metadata_value)i))
        {
            csv << rs2_frame_metadata_to_string((rs2_frame_metadata_value)i) << ","
                << frm.get_frame_metadata((rs2_frame_metadata_value)i) << "\n";
        }
    }

    csv.close();
}

or variants thereof with different width/height or stream indexes.

Based on 28371bc, it looks like this is only possible when using the tracking add-on on a bare board? I don't believe an HID device is being created for the raw D435 (i.e. not the D435i).

The only place in the code where I can see the GPIO stream referenced for the HID device is at https://github.com/IntelRealSense/librealsense/blob/master/src/sensor.cpp#L788, but this is within a check for the sensor name being "custom". So it looks like this feature wouldn't work even on the D435i, although I don't have one to test with.

@MartyG-RealSense

The Couldn't resolve requests message suggests that the problem is with the config instruction for GPIO. Have you tried it with RS2_STREAM_GPIO1, 0, 0, 0 like the obsolete documentation suggested, with a '1' after RS2_STREAM_GPIO?

You could also try a simpler form of the instruction:

config.enable_stream(RS2_STREAM_GPIO);

Given that RS2_STREAM_GPIO is listed in the SDK documentation but the script comes from obsolete documentation, I would also consider the possibility that this format is supported internally in the SDK but is no longer exposed for end users to access through SDK scripting.

Yes, D435i (as an IMU-equipped camera) is classified as a HID device whilst D435 would not be.

I believe the aspect of 28371bc that you are quoting refers to early plans at the late-2017 launch of RealSense to allow a Tracking Module board to be attached. These plans were dropped, and the D435i model with an included IMU was launched in late 2018. So the reference in the document is obsolete.

@xc-racer99

The Couldn't resolve requests message suggests that the problem is with the config instruction for GPIO. Have you tried it with RS2_STREAM_GPIO1, 0, 0, 0 like the obsolete documentation suggested, with a '1' after RS2_STREAM_GPIO?

Yes, I've tried all sorts of variants of 1's and 0's here along with the simpler form. They all ended up with the same error. I suspect this just isn't supported any more.

@xc-racer99

And to clarify - RS2_STREAM_GPIO1 no longer exists, so code using it won't build. I traced back the git history, and it was renamed somewhere in 2017, IIRC.

@MartyG-RealSense

Thanks very much for the Git history information.

@MartyG-RealSense

I apologize for the delay in responding further. Do you require further assistance with this case, please? Thanks!

@xc-racer99

The issue hasn't been resolved, but it looks like this has reached a dead end: the sensor outputs an indeterminate number of pulses before actually capturing, and the GPIO stream isn't functional on the D435. So I'll close this issue.

@MartyG-RealSense

Thanks very much @xc-racer99 for the update.
