Design a MoveIt2 demo for PI16 review and prepare for it #13
Comments
To be discussed w/ @maggia80. |
@martinaxgloria good point. We should ask Davide De Tommaso and/or Francesco Rea, who are responsible for the iCubs @ Erzelli, whether there is any robot available. I would avoid using the old robots that still have the PC104. |
In case we use one of Erzelli's iCubs, we could schedule a visit there to install what we need on that setup, after agreeing w/ Rea/De Tommaso |
About the use of the iCub @ Erzelli: on icub-head they have Ubuntu 20.04, but ROS Humble requires 22.04. For this reason, we could think of moving the demo to ergoCub. I will look at it starting from the next sprint. |
As proposed in the previous comment, today I worked on generalizing the ros2_control framework so that it can be used with both iCub and ergoCub. In particular, I:
ergocub.webm Probably there is a problem in the ergoCub URDF, since it's clearly not stable, but on the MoveIt/ROS side everything seems to work. |
Super! The arm doesn't oscillate; only the body itself is wobbling. |
You can use the |
Today, @Nicogene and I investigated the wobbling of the ergoCubGazeboV1 model and we found out that it comes from icub-tech-iit/ergocub-software@aae9cd6 (I opened an issue for this problem, but it's outside the scope of this one). Moreover, regarding the MoveIt2 demo, we could anyway switch to the |
After this fix (icub-tech-iit/ergocub-software#176), the contacts on the soles are more stable and the robot is not wobbling anymore. For the time being, I'd stay with the standard |
Ok! |
Yesterday I started playing with MoveIt Task Constructor to understand if it could be a useful tool for our demo. In particular, it uses the poses of a group of joints defined inside the .srdf file and breaks down a complex task into a set of steps. So, one con is that the poses to be reached have to be defined beforehand, but if we are not planning to do really complex movements, that would not be a problem. What is your opinion about it? |
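For reference, the named poses mentioned above live in the SRDF as `group_state` entries that MoveIt (and Task Constructor stages) can refer to by name. A minimal sketch; the group and joint names are placeholders for illustration, not taken from the actual iCub/ergoCub SRDF:

```xml
<!-- Hypothetical example of a named pose defined in the .srdf -->
<group_state name="hand_down" group="left_arm_torso">
  <joint name="torso_pitch"      value="0.0"/>
  <joint name="l_shoulder_pitch" value="-0.5"/>
  <joint name="l_elbow"          value="0.8"/>
</group_state>
```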
We're not going to use the hands, just the torso and the arm. With the latter kinematic chain, it'd be nice to showcase some choreography:
If the tool above can be helpful for this, let's go with it, otherwise it's mere overkill. During the demo, I'd expect the following questions to be raised (appetite comes with eating):
Let's just gear up for them by preparing sensible answers in advance. |
Thanks for the tips @pattacini! Ok, so I'm going to think about that and understand which could be the best solution.
Sure, probably the hands question would be the most popular one |
You may find the grasping sandbox relevant to this. You could record the reaching trajectory of the end-effector's pose in Cartesian space and replicate it with MoveIt2. |
A few updates on this activity. In the end, I gave up on MoveIt Task Constructor since it was not so essential in our case, and I tried to plan a circular trajectory by defining a set of points and computing the Cartesian path that follows them. Here is the result: circular_path.webm |
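As context, this kind of planning can be requested through the MoveGroupInterface's Cartesian path API. A minimal sketch under assumed names (group, radius, and circle plane are made up, not the actual demo code):

```cpp
#include <moveit/move_group_interface/move_group_interface.h>
#include <moveit_msgs/msg/robot_trajectory.hpp>
#include <geometry_msgs/msg/pose.hpp>
#include <cmath>
#include <vector>

// Sketch: sample waypoints on a circle in the y-z plane around the current
// end-effector pose and ask MoveIt to interpolate a Cartesian path through them.
void planCircle(moveit::planning_interface::MoveGroupInterface& group)
{
  const geometry_msgs::msg::Pose start = group.getCurrentPose().pose;
  std::vector<geometry_msgs::msg::Pose> waypoints;
  const double radius = 0.05;  // assumed radius [m]
  for (int i = 0; i <= 36; ++i)
  {
    const double th = 2.0 * M_PI * i / 36.0;
    geometry_msgs::msg::Pose wp = start;
    wp.position.y += radius * std::cos(th) - radius;  // start on the circle at th = 0
    wp.position.z += radius * std::sin(th);
    waypoints.push_back(wp);
  }

  moveit_msgs::msg::RobotTrajectory trajectory;
  // Returns the fraction (0..1) of the requested path that could be computed.
  const double fraction = group.computeCartesianPath(waypoints, 0.01, 0.0, trajectory);
  if (fraction > 0.99)
    group.execute(trajectory);
}
```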
Super nice! I guess you got inspired by our tutorial 😄 In this case, the process of designing the trajectory in MoveIt2 is also important to showcase, I'd say. Perhaps we can make the circle wider, and we can also control the orientation of the hand so that the palm always points down. |
Hi @pattacini, I tried with a wider circular path and with the palm down, as you suggested in the previous comment, and the result was the following: incomplete_path.webm As you can see, the solver wasn't able to compute the complete trajectory; printing the percentage of the computed trajectory, I obtained:
After the f2f chat we had, I checked the TRAC-IK parameters, trying to tune them to obtain the result we aim for, but the set of parameters actually exposed for tuning is limited. Moreover, I realized that in the .srdf file there's a check of the collisions between adjacent links, and during the trajectory planning stage I had set the parameter that avoids collisions to true. Just to test, I disabled those collision checks, forcing the kinematics to be solved, and the result was: forced_complete_path.webm In the video, it's possible to see that the wrist becomes red, meaning that it comes into collision with some other part (from the visualization I'm not able to see where or which links are in contact). This is why the solver didn't compute 100% of the execution. |
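For reference, disabling a specific pair in the SRDF collision check is done with `disable_collisions` entries; a hypothetical sketch (link names are placeholders, not the actual iCub link names), to be used only for pairs known to be false positives:

```xml
<!-- Hypothetical: tell MoveIt's collision checker to ignore this link pair -->
<disable_collisions link1="l_wrist_1" link2="l_hand" reason="Adjacent"/>
```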
Great debugging @martinaxgloria 🚀
Can you check that the collisions among parts take place even if the joints remain within their bounds? If this is the case, then it's not a real problem for us and we can keep the collision flag disabled. |
I don't know if I got the question, but I checked the joints' positions from the yarpmotorgui during the entire movement and they remained within the bounds. |
Correct 👍🏻 |
Today I worked on this activity. In particular, I scaled the trajectory already implemented to work with iCub, whose root_link reference frame has the x and y axes oriented opposite to ergoCub's, see below: After that, I tried to include a reaching-like trajectory and in the end I obtained this: demo_first_attempt.webm Obviously, this is not the final result, but an attempt to resize the movements (especially for the final part); I still have to log the data in order to see whether the joints remain within their bounds during the execution. Moreover, this afternoon @Nicogene and I tried to install ros_humble on the iCubGenova11 head. Firstly, we tried to install |
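As a note on the rescaling: conceptually it boils down to mirroring the x and y components of each waypoint when moving from ergoCub's root_link convention to iCub's. A tiny sketch, assuming positions only (the corresponding 180° rotation of the orientation about z is omitted here):

```cpp
#include <geometry_msgs/msg/pose.hpp>

// Sketch: re-express an ergoCub root_link waypoint in iCub's root_link,
// whose x and y axes point the opposite way (z assumed unchanged).
geometry_msgs::msg::Pose toICubRootLink(const geometry_msgs::msg::Pose& ergocub_wp)
{
  geometry_msgs::msg::Pose icub_wp = ergocub_wp;
  icub_wp.position.x = -ergocub_wp.position.x;
  icub_wp.position.y = -ergocub_wp.position.y;
  return icub_wp;
}
```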
Today, @Nicogene and I tried to find a solution to have ros2 installed on icub-head in order to run the demo. Firstly, I installed within the conda environment
this was the error I obtained. For this reason, we decided to try installing ros-foxy and compiling the repo with the MoveIt demo on the iCubGenova11 laptop, and to use the head (on which this distro was already installed with apt dependencies) to launch the yarprobotinterface and expose ros2 topics. I started this activity, but I had to change some methods and attributes in my code since they differ between the ros-foxy and ros-humble distros. Tomorrow I'll test the changes with this new configuration and I'll let you know. cc @pattacini |
Some updates: yesterday, with the help of @Nicogene, I tried to solve some problems related to this activity. First of all, we wrote a custom "all joints remapper" in which we excluded the joints that are not present at URDF level (i.e. eyes and fingers): this was because on the /joint_state topic they were all published in terms of position, velocity, and effort, but when the robot_state_publisher node tried to publish on the /tf topic the poses of the joints it read from the URDF model, it didn't find a perfect match and raised an error. After solving this issue, we successfully visualized the model on rviz2, with the poses properly published on the /tf topic by the robot_state_publisher node. But other problems arose with other nodes (in particular with the ros2_control_node). We tried to patch them, but in the end, after asking the HSP people, we decided to update the icub-head OS to Ubuntu 22.04 and install ros_humble, to have compatibility with what I have done so far in simulation. cc @pattacini |
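This is not the actual remapper used here (which sits on the YARP side), but as an illustration of the same idea on the ROS 2 side, a minimal node that republishes joint states keeping only the joints present in the URDF; the topic names and joint list are assumptions:

```cpp
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/joint_state.hpp>
#include <set>
#include <string>

// Sketch: drop joints (e.g. eyes, fingers) that robot_state_publisher
// cannot match against the URDF before they reach /joint_states.
class JointStateFilter : public rclcpp::Node
{
public:
  JointStateFilter() : Node("joint_state_filter")
  {
    pub_ = create_publisher<sensor_msgs::msg::JointState>("/joint_states", 10);
    sub_ = create_subscription<sensor_msgs::msg::JointState>(
      "/joint_states_raw", 10,
      [this](const sensor_msgs::msg::JointState& msg) {
        sensor_msgs::msg::JointState out;
        out.header = msg.header;
        for (size_t i = 0; i < msg.name.size(); ++i)
        {
          if (urdf_joints_.count(msg.name[i]) == 0)
            continue;  // skip joints not modelled in the URDF
          out.name.push_back(msg.name[i]);
          if (i < msg.position.size()) out.position.push_back(msg.position[i]);
          if (i < msg.velocity.size()) out.velocity.push_back(msg.velocity[i]);
          if (i < msg.effort.size())   out.effort.push_back(msg.effort[i]);
        }
        pub_->publish(out);
      });
  }

private:
  // Placeholder list: in practice this would come from the robot model.
  std::set<std::string> urdf_joints_{"torso_pitch", "l_shoulder_pitch", "l_elbow"};
  rclcpp::Publisher<sensor_msgs::msg::JointState>::SharedPtr pub_;
  rclcpp::Subscription<sensor_msgs::msg::JointState>::SharedPtr sub_;
};

int main(int argc, char** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<JointStateFilter>());
  rclcpp::shutdown();
  return 0;
}
```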
After the upgrade of the icub-head OS to Ubuntu 22.04 and the installation of ros-humble on that machine, @Nicogene and I succeeded in making my laptop and icub-head communicate with each other by passing Cyclone DDS a configuration file in which the IP addresses of the two machines are specified as a closed network. Today I tried to use ros2_control on iCubGenova11 and this was the result, both in simulation and on the real robot: demo_attempt.webm IMG_1621.mp4 Now I have to refine the trajectories and improve a few things (e.g. the possibility of coming back to the starting position after all the movements, in order to restart the demo without homing all the joints from the yarpmotorgui). |
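For reference, a Cyclone DDS configuration of this kind typically looks like the sketch below, pointed to on both machines via the CYCLONEDDS_URI environment variable. The IP addresses and interface name are placeholders, and exact element names can vary with the Cyclone DDS version:

```xml
<?xml version="1.0" encoding="UTF-8" ?>
<CycloneDDS xmlns="https://cdds.io/config">
  <Domain Id="any">
    <General>
      <Interfaces>
        <NetworkInterface name="eth0"/>  <!-- placeholder interface -->
      </Interfaces>
      <AllowMulticast>false</AllowMulticast>
    </General>
    <Discovery>
      <ParticipantIndex>auto</ParticipantIndex>
      <Peers>
        <Peer address="10.0.0.1"/>  <!-- placeholder: laptop -->
        <Peer address="10.0.0.2"/>  <!-- placeholder: icub-head -->
      </Peers>
    </Discovery>
  </Domain>
</CycloneDDS>
```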
Voilà! Superb! 🚀 |
Amazing!!! |
Hi @pattacini, IMG_1679.mp4 Maybe it's just me, but it seems different with respect to the one seen in the sandbox. Do I have to rescale it? Or maybe do some other acquisitions and try to replicate them within the MoveIt environment? cc @Nicogene |
Update F2F. @martinaxgloria could you quickly summarize the action points? |
After an f2f chat with @pattacini, we came to the conclusion that:
With the help of @Nicogene, we tried to set the correct end-effector orientation for the grasping task, starting from the . Since it is in axis-angle convention and MoveIt works with quaternions, we used the tf2::Quaternion axis-angle constructor to obtain the corresponding quaternion. We tried to move the eef to that pose, but it wasn't able to reach it. We visualized in rviz the pose we set and obtained the circled tf: if you compare this last frame with the reference one in Gazebo, you can see that they are very different. It seems that we missed a transformation between the yarp and ros2 conventions. So far, we have two movements that we can show (hand down and circle trajectory), and we can also show how it's possible to set poses and compute the Cartesian trajectory from the GUI. |
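As a sketch of the conversion step mentioned above (the variable names are illustrative; the axis-angle values are assumed to come from the YARP pipeline):

```cpp
#include <tf2/LinearMath/Quaternion.h>
#include <tf2/LinearMath/Vector3.h>

// Sketch: convert a YARP-style axis-angle orientation (ax, ay, az, angle)
// into the quaternion MoveIt expects.
tf2::Quaternion axisAngleToQuaternion(double ax, double ay, double az, double angle)
{
  tf2::Vector3 axis(ax, ay, az);
  axis.normalize();                 // tf2 expects a unit axis
  tf2::Quaternion q(axis, angle);   // axis-angle constructor
  q.normalize();
  return q;
}
```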
We know that the pose retrieved from our pipeline is expressed with respect to the root_link; we have to check whether the pose we are giving to MoveIt is expressed with respect to the world frame. In that case, premultiplying by root_link_H_world should be sufficient. cc @traversaro |
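In code, that fix would amount to something like the following Eigen sketch; the frame names are assumptions, and which transform to premultiply by depends on which frame MoveIt actually expects:

```cpp
#include <Eigen/Geometry>

// Sketch: if the target pose comes from the pipeline expressed in root_link
// but MoveIt expects it in world, re-express it by premultiplication.
Eigen::Isometry3d toWorldFrame(const Eigen::Isometry3d& world_H_root_link,
                               const Eigen::Isometry3d& root_link_H_target)
{
  return world_H_root_link * root_link_H_target;
}
```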
Nonetheless, the reference orientations we used to command the circular path seemed to be ok. |
It's ok because I manually computed the transformation between the hand and the root link in rpy angles and then I transformed them into a quaternion to get the final pose (which is the hand oriented downward). |
BTW, the reference frame attached to the wrist visible in the figure above is not the one used for reaching. The correct one is documented. |
Importing the iCubGazeboV2_5 model (both with and without hands) into Gazebo, I noticed that the hand reference frame located on the palm is not visible from the URDF: cc @Nicogene |
Sorry, I am not sure I got the problem from the issue, can we have a quick chat on it? |
The most important point is that we have to find a way to express within the URDF the standard End-Effector used in reaching as defined in https://icub-tech-iit.github.io/documentation/icub_kinematics/icub-forward-kinematics/icub-forward-kinematics-arms/. As of now, it seems that the last frame available from URDF is the one in #13 (comment), which is attached to the wrist though and not to the palm. |
Those frames are |
Gazebo lumps the fixed joints of the model, so all the "additional frames" present in the URDF are ignored and only the frames of actual links with mass are displayed; hence it is not a good way of visualizing all the frames of the model. To visualize all the frames, you can either use RViz or (harder solution) write a custom visualizer combining iDynTree's Visualizer API and KinDynComputations; see for example (for the Visualizer API) https://github.com/ami-iit/yarp-device-openxrheadset/blob/c81176c3a5b535876f611ab490e8bbb09f0ffe64/src/utils/OpenXrFrameViz/main.cpp#L165 . |
Thank you @traversaro. I noticed that the |
After fixing the end-effector reference frame from , the bold rf is the hand's one, while the other is the reference pose we want to reach. As you can see, the two are now oriented in the same way (so the hand reaches the desired orientation), but this is still the wrong one with respect to the pose seen in the sandbox (and from which I logged it). With @Nicogene, we made a comparison between the and the one of , and they are the same. Moreover, in order to see which reference frame MoveIt uses, we logged the pose reference frame:
so MoveIt is aligned with our pipeline. |
Today I did some more tests in order to understand whether the conversion between the axis-angle retrieved by yarp and the quaternion transformation done inside the ros2 environment was correct (I gave MoveIt an RPY rotation and I transformed it into a quaternion): it seems to be all coherent with the RPY rotation I imposed. For this reason, I launched the icub grasping sandbox simulation again and logged the poses for the pre-grasp and grasp phases in a side-grasping condition (like the one I used from the beginning). I gave those poses to MoveIt and launched the simulation first, and then the demo on the real robot: Screencast.from.10-19-2023.01.56.41.PM.webm IMG_1696.mp4 Now the poses are consistent with the ideal movement! Probably, I was supposed to log those values more than once (like with |
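The check described above boils down to something like the sketch below (values and names are illustrative): build the quaternion directly from the imposed RPY rotation and compare it with the one obtained through the axis-angle path.

```cpp
#include <tf2/LinearMath/Quaternion.h>
#include <tf2_geometry_msgs/tf2_geometry_msgs.hpp>
#include <geometry_msgs/msg/quaternion.hpp>

// Sketch: quaternion corresponding to an imposed RPY rotation,
// in the message form MoveIt consumes.
geometry_msgs::msg::Quaternion rpyToQuaternionMsg(double roll, double pitch, double yaw)
{
  tf2::Quaternion q;
  q.setRPY(roll, pitch, yaw);
  q.normalize();
  return tf2::toMsg(q);
}
```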
It's not clear to me what made everything work now, but cool that you managed to sort it out 🚀 |
The poses I gave this time were not the same as before, I don't know why.
Did you mean sampling the poses during the entire movement? |
Yep 👍🏻 To present this, we can show a video of the sandbox doing the movement (using the Cartesian Control) and then we can run it on the robot using MoveIt2. The robot will be doing slightly different movements (because of different IK's and problem formulations). Also, the robot won't gaze. This is ok, but perhaps we could close the hand in open-loop to signal that we reached the endpoint. |
Hi @pattacini, I implemented the closing of the hand during the grasping-like task, and this is the entire demo with the hand-down, the circle trajectory, and the reaching (with only 3 poses given to the solver): demo.mp4 As you can see, the torso is really involved in the movement, but we knew about this "problem". What do you think about the tasks now? cc @Nicogene |
Superb! 🚀
Awesome work indeed! 🎖️ |
Demo done! |
Task description
It would be nice to have a MoveIt2 demo for the next PI16 review with simulated and/or real robot. In particular, we should understand which robot is available and what to show.
Definition of done
The MoveIt controller works on ergoCub in simulation, so that it can then be moved to the real robot.