The issue of "No stream match for pointcloud chosen texture Process" #2466
Hi @Pran-Seven The warning No stream match for pointcloud chosen texture process - Color can appear every time that there is an unavailable texture, likely because of a frame drop. The point cloud will not be published in the frame that the message occurs. In a previous case of this error, a RealSense ROS user was using a laptop running off the battery only and found that the error disappeared if the laptop was connected to a mains power socket. Are you using a battery-driven laptop, please?
Hey @MartyG-RealSense, thanks for the prompt response. Yes, I could gather that from the previous issues posted, but I am facing an issue where the error appears and the stream does not even start. In some cases, like you mentioned, the stream is running, I get the warning once in a while and I can see the frame drop in rviz, but I encounter the stream-not-starting issue more often than this. The laptop is connected to a mains socket during most of my testing, and I have also tested on a Beelink PC which runs only on mains power and still face the same issue. Update: the camera works fine at 15 FPS when connected to a USB 2.0 interface but does not work on a 3.0 interface. Unfortunately I could not test with both cameras connected to USB 2.0 interfaces due to unavailability of ports.
Have you tried launching the two cameras with a single roslaunch command using rs_multiple_devices.launch? When launching with rs_camera.launch, are you launching the two cameras in separate ROS terminals, as recommended by the multiple-camera section of the ROS wrapper documentation? https://github.com/IntelRealSense/realsense-ros#work-with-multiple-cameras Also, does the camera work on a USB 3.0 interface if you use the custom launch command below to set both depth and color to 640x480 and 30 FPS? roslaunch realsense2_camera rs_camera.launch depth_width:=640 depth_height:=480 depth_fps:=30 color_width:=640 color_height:=480 color_fps:=30
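For reference, a single-command two-camera launch can be sketched roughly like this (a hedged example only: SERIAL_1 and SERIAL_2 are placeholders for the cameras' serial numbers, which are visible in the RealSense Viewer):

```xml
<!-- Minimal sketch of one roslaunch starting both D415s.
     SERIAL_1 / SERIAL_2 are placeholders, not real values. -->
<launch>
  <include file="$(find realsense2_camera)/launch/rs_multiple_devices.launch">
    <arg name="serial_no_camera1" value="SERIAL_1"/>
    <arg name="serial_no_camera2" value="SERIAL_2"/>
  </include>
</launch>
```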
Yeah, I am launching it as per the documentation.
Are you using the official 1 meter cable supplied with the camera, please? There have been cases where a USB 2 cable has unknowingly been used instead of a USB 3 one when selecting an own choice of cable. As you have tested with multiple cameras and multiple computers and still have the problem, that would suggest that the camera hardware, the computer and the ROS wrapper are not at fault (as you would have had to install the wrapper individually on each computer). The cameras also work correctly in the RealSense Viewer tool, indicating that the librealsense SDK installation is okay too. What method do you use to install librealsense and the ROS wrapper on your computers, please? And which Ubuntu kernel version are you using?
Hi @Pran-Seven Do you require further assistance with this case, please? Thanks!
Will get back shortly regarding this.
Thanks very much @Pran-Seven for the update. I look forward to your next report. Good luck!
Hi @Pran-Seven Do you have an update about this case that you can provide, please? Thanks!
Yeah, after multiple trials, somehow changing the realsense-ros and librealsense versions and the cables seemed to fix the issue. There is a new issue I am facing now: when I try to launch two D415s using rs_rgbd.launch and pass an external manager, only one camera gets launched. Any possible fixes for this? Or any insight into what lets realsense_camera_manager launch both cameras and run the nodelet thread without any issues?
Have you tried launching the two cameras with rs_rgbd.launch in separate ROS terminals as suggested in the ROS wrapper documentation, please? https://github.com/IntelRealSense/realsense-ros#work-with-multiple-cameras
That would again launch with the realsense_camera_manager. The issue arises when we try to pass an external manager while launching from our own launch file, which includes rs_rgbd.launch. The launch file is as attached.
When I launch this with the external manager parameter, only one camera gets launched.
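For readers without access to the attachment, the setup described can be sketched roughly as below. This is an assumption-laden reconstruction, not the actual attached file: the manager name, camera names and serial placeholders are all invented for illustration.

```xml
<!-- Rough sketch: two rs_rgbd.launch includes pointed at one shared,
     externally started nodelet manager. All names here are hypothetical. -->
<launch>
  <node pkg="nodelet" type="nodelet" name="shared_manager"
        args="manager" output="screen"/>
  <include file="$(find realsense2_camera)/launch/rs_rgbd.launch">
    <arg name="camera" value="cam_1"/>
    <arg name="serial_no" value="SERIAL_1"/>
    <arg name="external_manager" value="true"/>
    <arg name="manager" value="/shared_manager"/>
  </include>
  <include file="$(find realsense2_camera)/launch/rs_rgbd.launch">
    <arg name="camera" value="cam_2"/>
    <arg name="serial_no" value="SERIAL_2"/>
    <arg name="external_manager" value="true"/>
    <arg name="manager" value="/shared_manager"/>
  </include>
</launch>
```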
I note that you define the two cameras with two sets of
But the same lines, with both includes right after each other, seem to work when realsense_camera_manager is the nodelet manager.
Hey, I was able to fix it; I think this issue can be closed, thank you. But I am getting a bond-broken error between the two camera topics and a different nodelet subscribing to them, but I don't suppose that would be a question for this forum, right?
Thanks very much, I was just conducting further research on your case. It's great to hear that you were successful. :) I would recommend defining depth_fps and color_fps too in your external manager. From wrapper 2.2.22 onwards, if you do not provide all 3 factors (width, height and FPS) in a custom stream configuration then the ROS launch deems the custom configuration invalid and instead applies the default stream profile of the particular RealSense camera model being used.
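As an illustration of the three-factor rule described above, a custom stream profile could be sketched like this (the values and camera name are examples, not values taken from this thread):

```xml
<!-- All three factors (width, height, FPS) given for each stream, so the
     wrapper does not fall back to the camera's default stream profile. -->
<include file="$(find realsense2_camera)/launch/rs_rgbd.launch">
  <arg name="camera" value="cam_1"/>
  <arg name="depth_width" value="640"/>
  <arg name="depth_height" value="480"/>
  <arg name="depth_fps" value="30"/>
  <arg name="color_width" value="640"/>
  <arg name="color_height" value="480"/>
  <arg name="color_fps" value="30"/>
</include>
```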
Well, thank you! I did try setting the FPS at launch, but I am still plagued by the same bond-broken issue after a few seconds.
There was a bond broken case with the ROS1 wrapper where decreasing the resolution resolved the error in that particular case, as described at #1619 (comment) |
Tried that as well, but the results are the same.
Someone else with the error used the workaround of a script at #934 (comment) that automatically repeated the roslaunch if the launch exited because of bond broken. Have you tested whether the error still occurs if you add initial_reset:=true to your roslaunch instruction to reset the camera at launch? In your case it may be best to add it as an arg to your external manager.
Hi @Pran-Seven Do you require further assistance with this case, please? Thanks!
Yeah, I was able to figure out that the issue lies in the different frame rates being published by the two cameras, and I am looking for a hardware sync solution, but even after connecting the cameras I still can't get them to sync. Looking at the hardware_sync thread and other related ones for solutions.
Hardware sync will not synchronize the frame rate (FPS) of cameras, but instead synchronizes their timestamps with a master camera or trigger pulse. Hardware sync would therefore not be a solution for differences in FPS between one camera and another. Normally, if auto_exposure is set to true and auto_exposure_priority is set to false then the FPS rates of depth and color should be forced to try to maintain the same FPS rate instead of being permitted to vary the FPS. I note that in your launch file above you are applying these two settings. However, these particular settings should be specified for the particular type of camera sensor (stereo_module for the depth sensor and rgb_camera for the color sensor), and auto_exposure_priority is a color sensor option. So I wonder whether the settings should be defined like this:
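The exact snippet from the original reply is not preserved here. A plausible sketch, assuming the ROS1 wrapper's per-sensor parameter namespaces (stereo_module for depth, rgb_camera for color) and a hypothetical camera name cam_1, would be:

```xml
<!-- Hedged sketch only: per-sensor auto-exposure settings placed under
     the wrapper's stereo_module / rgb_camera namespaces. Names are
     assumed, not verbatim from the original reply. -->
<rosparam>
  cam_1:
    stereo_module:
      enable_auto_exposure: true
    rgb_camera:
      enable_auto_exposure: true
      auto_exposure_priority: false
</rosparam>
```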
The reason I need an FPS sync is that I am trying to stitch two pointclouds together. Since they are coming in at different frequencies, the code I am using keeps breaking with the bond-broken error. The Approximate Time Sync methods also don't seem to be working, which is why we went for a hardware sync. I did change the settings in the launch file as you mentioned, but no luck. Would you happen to know any robust methods/codebases for pointcloud stitching in ROS that do not pertain to the conix solutions or others mentioned in the common threads?
Intel have a RealSense ROS guide for stitching together pointclouds from two cameras attached to the same computer at the link below. https://github.com/IntelRealSense/realsense-ros/wiki/Showcase-of-using-2-cameras There is also a guide for three cameras spread across two computers.
Yeah, I have gone through those and have implemented them, and they work well, but I can't incorporate those into my ROS project, therefore I am looking for other alternatives.
Some ROS users have taken the approach of combining laser scans into a single pointcloud. There is a ROS package that has been used by some RealSense ROS users called pointcloud_to_laserscan http://wiki.ros.org/pointcloud_to_laserscan A guide at the link below discusses using pointcloud_to_laserscan to combine laserscans into a single pointcloud. https://medium.com/@amritgupta1999/merging-data-from-multiple-lidar-s-in-ros-e890fb60cbbf
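Independent of the ROS plumbing, the core stitching step itself is an extrinsic transform plus concatenation. A minimal NumPy sketch, assuming each cloud is already an (N, 3) array of points and the 4x4 extrinsic matrix of camera 2 in camera 1's frame is known (e.g. from a calibration step; the values below are purely hypothetical):

```python
import numpy as np

def transform_points(points, T):
    """Apply a 4x4 homogeneous transform T to an (N, 3) point array."""
    ones = np.ones((points.shape[0], 1))
    homogeneous = np.hstack([points, ones])   # (N, 4) homogeneous coords
    return (T @ homogeneous.T).T[:, :3]       # back to (N, 3)

def stitch_clouds(cloud1, cloud2, T_1_2):
    """Express cloud2 in cloud1's frame and concatenate both clouds."""
    return np.vstack([cloud1, transform_points(cloud2, T_1_2)])

# Toy example: camera 2 sits 0.5 m to the right of camera 1
# (made-up extrinsics; real values come from calibration).
cloud1 = np.array([[0.0, 0.0, 1.0]])
cloud2 = np.array([[0.0, 0.0, 1.0]])
T_1_2 = np.eye(4)
T_1_2[0, 3] = 0.5
merged = stitch_clouds(cloud1, cloud2, T_1_2)
print(merged.shape)  # (2, 3)
```

In a live system the two input arrays would come from the two camera topics (which is exactly where the time synchronization problem above matters), but the geometry itself does not depend on ROS.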
Hi @Pran-Seven Do you require further assistance with this case, please? Thanks!
Nope, thanks a lot. You have been very helpful.
You are very welcome, @Pran-Seven - thanks very much for the update!
This issue has been posted multiple times and I have tried most of the fixes suggested, but none seem to work.
System specs: RealSense D415 cameras (2), Ubuntu 18.04, ROS Melodic.
Everything works great in the RealSense Viewer, but when I try to launch 2 cameras via rs_rgbd or rs_camera it either fails to start the depth stream (hardware error), or when I try to select the depth point cloud in rviz I get the no-stream-match error and nothing is visualized. Sometimes one of the cameras publishes, but that degrades as well within a few minutes. Tried with multiple computers and multiple cameras, but the issue persists. How do I go about fixing this issue?