Getting 3D skin data files #93
The skinManager uses taxel positions from https://github.com/robotology/icub-main/tree/master/app/skinGui/conf/positions .
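As a starting point, the positions files can be loaded with a few lines of Python. This is only a sketch under an assumed layout (one taxel per line, the first three numbers being x, y, z in metres, with any extra columns such as a surface normal ignored); the actual files in `icub-main/app/skinGui/conf/positions` may carry headers or additional fields, so check the real format before relying on it.

```python
def load_taxel_positions(text):
    """Parse taxel positions from positions-file-like text.

    Assumed layout (not verified against the real files): one taxel per
    line, first three whitespace-separated numbers are x y z in metres;
    extra columns are ignored; blank lines and comment lines are skipped.
    """
    taxels = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(('#', '//')):
            continue
        parts = line.split()
        try:
            nums = [float(v) for v in parts[:3]]
        except ValueError:
            continue  # skip header or metadata lines that are not numeric
        if len(nums) == 3:
            taxels.append(tuple(nums))
    return taxels
```

A parser like this gives a plain list of (x, y, z) tuples that can then be transformed into whatever frame the simulator expects.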
@MoritzN89 please consider that those points are expressed in the reference frames defined by the iKin library. Do you plan to use them for something related to this gazebo-yarp-plugins issue: robotology/gazebo-yarp-plugins#168 ?
I want to improve the icub.sdf file by adding cubes to the body parts representing the individual sensors. Every sensor sends its collision data to YARP, where it can be evaluated. I have already written a script that creates the cubes, but, as expected, the coordinates are not compatible with the ones from the reference frame. I looked at the iKin page but could not find anything that would help me convert the provided skin data into the pose format of the SDF file.
iKin convention is described here.
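Conceptually, the conversion is a change of reference frame: each taxel position expressed in the iKin frame has to be mapped through a homogeneous transform into the SDF link frame before being written into a `<pose>` element. The sketch below illustrates this with a placeholder 4x4 transform; the real matrix depends on the body part and on the iKin conventions, so treat the numbers as invented.

```python
# Hypothetical 4x4 homogeneous transform from the iKin reference frame
# (in which the skin position files are expressed) to the SDF link frame
# of the matching body part. These values are placeholders only; the real
# transform must be derived from the iKin conventions for each body part.
T_SDF_FROM_IKIN = [
    [0.0, -1.0, 0.0, 0.00],
    [1.0,  0.0, 0.0, 0.00],
    [0.0,  0.0, 1.0, 0.10],
    [0.0,  0.0, 0.0, 1.00],
]

def taxel_to_sdf_pose(p_ikin):
    """Map a taxel position (metres, iKin frame) to an SDF 'x y z r p y' pose string."""
    ph = list(p_ikin) + [1.0]  # homogeneous coordinates
    x, y, z = (sum(T_SDF_FROM_IKIN[i][j] * ph[j] for j in range(4))
               for i in range(3))
    # Orientation is left as identity here; a small cube marker rarely needs one.
    return f"{x:.6f} {y:.6f} {z:.6f} 0 0 0"
```

The returned string can be dropped directly into a `<pose>` tag of the cube's `<link>` in the SDF file.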
Something in the data must be wrong, or it is not compatible with Gazebo. In the attached image you can see the cubes acting as sensors. Most of them are placed correctly on the skin and you can see a triangle pattern. Nevertheless it looks strange https://www.dropbox.com/s/nean48k6suq19f5/icub.sdf?dl=0 and some sensors end up under the skin. I took the reference frame into account too.
I guess there are two main reasons behind these discrepancies:
@MoritzN89 furthermore, the skin data (point 1 listed by @traversaro) is supposedly better on the forearm. Try to visualize that instead of the upper arm.
Hi there. As an alternative to the kind of calibration Andrea performed (which becomes more difficult and error-prone for some body parts due to the positioning of the force/torque sensors), one can use data available from the robot's CAD models. For the palm, I used the coordinates I got from our mechanical guys to create the positions file; assuming the palm is 2D, that was easy. For the other skin parts, which are 3D, the 2D geometry of the individual skin patches that cover the robot is known. In addition, certain attachment points of these patches in the 3D CAD model of the robot are also known (the midpoints of the triangular modules, i.e. the middle taxel of every triangle). Taken together, these data form constraints that an optimization algorithm can use to find the most likely 3D position of every tactile sensor. However, this is quite laborious. @MoritzN89, can we ask for more information about you and your project? Perhaps we could see how to jointly proceed forward in this direction. Cheers,
I agree with @matejhof. On the other hand, if we distribute the same number of taxels as on the real robot over the surfaces of the covers, we can still get a very useful simulation even if the triangles are not precise... so I would prioritize getting a realistic software interface rather than striving for exactly the same positioning as on the real robot.
@matejhof has a point when saying that the procedure in http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6094896 is not accurate enough. I have been playing with that procedure while calibrating both feet (see results in http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6943124). It is quite an elaborate procedure with poor results. As pointed out again by @matejhof, we should extract as much data as possible from the CAD. Then an adjustment (i.e. calibration) can be performed on this preliminary positioning, trying to be realistic about the variables to be optimised (e.g. rotation and displacement of the skin patches). In my opinion, this can be done in two different ways.
Regarding the manual extraction, we (@matejhof and I) talked with Alberto Parmiggiani, and he said that for a skilled CAD user it would have these costs:
Two days?! Oh my goodness! :)
Probably :)
Let me add that there is no conflict between manually and automatically extracting this skin information.
I just re-stumbled upon this abandoned iCubSkinGUI3D: http://eris.liralab.it/iCub/contrib/dox/html/group__icub__iCubSkinGUI3D.html by Eric Sauser from EPFL. Has anybody seen it working, or does anyone know how to operate it? Perhaps this is relevant (perhaps not).
While playing with the positions files we have from Andrea Del Prete's method for forearms and upper arms, I noticed some new facts:
So, the positions files should be used with care - the taxels of each triangle are more like a scrambled point cloud that may also be systematically shifted (corresponding to the picture posted by @MoritzN89 above). So my estimate - assuming the CAD data is correct - is that individual taxels can be off by 1, sometimes up to 2 cm. At the same time, we now have some data from the CAD and some knowledge on how to formulate an optimization problem that could be used to recalibrate the skin. So, volunteer needed! :)
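One way to make the recalibration idea discussed above concrete: treat each skin patch as rigid and fit only its rotation and displacement so that its nominal (CAD-derived) triangle midpoints land on the measured ones, as suggested earlier in this thread. The sketch below shows a closed-form least-squares (Procrustes) fit in 2D for simplicity; the real problem is 3D and constrained to the cover surface, and all point data here is invented for illustration.

```python
import math

def fit_rigid_2d(src, dst):
    """Least-squares 2D rotation + translation mapping points src onto dst.

    Closed-form Procrustes solution: centre both point sets, recover the
    rotation angle from the summed cross/dot products, then solve for the
    translation that aligns the centroids.
    """
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    s_dot = s_cross = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy      # centred source point
        bx, by = dx - cdx, dy - cdy      # centred target point
        s_dot += ax * bx + ay * by
        s_cross += ax * by - ay * bx
    theta = math.atan2(s_cross, s_dot)   # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)       # translation applied after rotation
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty

def apply_rigid(theta, tx, ty, p):
    """Apply the fitted rotation + translation to a single point."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)
```

Fitting only a per-patch rotation and displacement, rather than every taxel independently, keeps the optimization well-posed while matching the variables suggested as realistic above.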
Hi @matejhof, after a year, has there been any progress on this?
Actually, yes. :D
Yes, in fact the progress is happening at this very moment with @traversaro and @fiorisi
New forearm V2 files coming out of the new pipeline using CAD (thanks to @traversaro and @fiorisi) have been committed here: 7d81ded |
I can't find data files with the positions and orientations of each sensor. I used Google and searched the iCub contrib page, but I only found a page referring to the iCub 3D skin GUI, which points to such files (http://eris.liralab.it/iCub/contrib/dox/html/group__icub__iCubSkinGUI3D.html), and I could not find those either (maybe because iCubSkinGUI3D does not exist anymore?).