
Getting 3D skin data files #93

Closed
BlendingInfinite opened this issue Jan 28, 2015 · 20 comments

@BlendingInfinite

I can't find the data files with the positions and orientations of each sensor. I searched Google and the iCub contribution page, but only found a page referring to the iCub 3D skin GUI that points to such files (http://eris.liralab.it/iCub/contrib/dox/html/group__icub__iCubSkinGUI3D.html), which I couldn't find either (maybe because the iCub 3D skin GUI doesn't exist anymore?).

@BlendingInfinite BlendingInfinite changed the title Getting 3D skin data Getting 3D skin data files Jan 28, 2015
@traversaro
Member

The skinManager uses taxel positions from https://github.com/robotology/icub-main/tree/master/app/skinGui/conf/positions .
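For anyone who wants to inspect those files offline, here is a minimal Python sketch. It assumes, as a hypothetical layout, that each data row contains six whitespace-separated values (taxel position x y z followed by its normal) and that all-zero rows mark unused taxel indices; the exact format should be verified against the files in that directory.

```python
import numpy as np

def load_taxel_positions(path):
    """Parse a skinGui positions file into (positions, normals) arrays.

    Assumed layout (to be verified against the real files): header or
    section lines are non-numeric, and each data row holds six values
    "x y z nx ny nz" per taxel; all-zero rows denote unused taxel indices.
    """
    rows = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) != 6:
                continue  # skip headers, section markers, blank lines
            try:
                rows.append([float(v) for v in parts])
            except ValueError:
                continue  # not a numeric data row
    data = np.asarray(rows)
    positions, normals = data[:, :3], data[:, 3:]
    valid = ~np.all(data == 0.0, axis=1)  # drop placeholder taxels
    return positions[valid], normals[valid]

# e.g. positions, normals = load_taxel_positions("left_forearm.txt")
```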

@traversaro
Member

@MoritzN89 please consider that those points are expressed in the reference frames defined by the iKin library.

Do you plan to use them for something related to this gazebo-yarp-plugins issue: robotology/gazebo-yarp-plugins#168 ?
In that case I suggest that you clearly state what you want to implement and the amount of work you can commit to it, so we can provide all the support needed.

@BlendingInfinite
Author

I want to improve the icub.sdf file by adding cubes to the body parts to represent the individual sensors. Every sensor would send its collision data to YARP, where it can be evaluated. I've already written a script that creates the cubes, but, as expected, the coordinates are not compatible with those of the reference frame. I looked at the iKin page but couldn't find anything that helps convert the provided skin data into the pose format of the SDF file.
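One way to bridge the two conventions, sketched below under assumptions: `T_sdf_ikin` is a hypothetical 4x4 homogeneous transform from the iKin link frame to the corresponding SDF link frame (it has to be worked out from the iKin DH convention and the frames used in icub.sdf), and the helper names are made up for illustration. The sketch converts a taxel position into an SDF `<pose>` string and wraps it in a small cube `<visual>` element.

```python
import numpy as np

def taxel_to_sdf_pose(p_ikin, T_sdf_ikin):
    """Map a taxel position expressed in an iKin link frame into an SDF
    <pose> string (x y z roll pitch yaw) relative to the SDF link frame.

    T_sdf_ikin is a hypothetical 4x4 homogeneous transform from the iKin
    link frame to the SDF link frame; it has to be derived from the iKin
    DH convention and the frames used in icub.sdf.
    """
    p = T_sdf_ikin @ np.append(np.asarray(p_ikin, dtype=float), 1.0)
    x, y, z = p[:3]
    # Orientation left as identity: a plain cube marker does not need it.
    return f"{x:.6f} {y:.6f} {z:.6f} 0 0 0"

def cube_sdf(name, pose, size=0.005):
    """Emit a small <visual> cube element that can be pasted into a link."""
    return (f'<visual name="{name}">'
            f'<pose>{pose}</pose>'
            f'<geometry><box><size>{size} {size} {size}</size></box></geometry>'
            f'</visual>')

# e.g. print(cube_sdf("taxel_0", taxel_to_sdf_pose([0.01, 0.02, 0.03], np.eye(4))))
```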

@pattacini
Member

iKin convention is described here.

@BlendingInfinite
Author

Something about the data must be wrong, or they are not compatible with Gazebo. In the attached image you can see the cubes acting as sensors. For the most part they are placed correctly on the skin and you can see a triangle pattern, but it still looks strange
[screenshot from 2015-02-01 attached]

https://www.dropbox.com/s/nean48k6suq19f5/icub.sdf?dl=0

and some sensors are under the skin. I did take the reference frame into account.

@traversaro
Member

I guess there are two main reasons behind these discrepancies:

@alecive
Member

alecive commented Feb 2, 2015

@MoritzN89 furthermore, the skin data (point 1 listed by @traversaro) are supposedly better on the forearm. Try visualizing those instead of the upper arm.

@matejhof
Contributor

matejhof commented Feb 2, 2015

Hi there,
I cannot judge the accuracy of the Gazebo mesh models. I will speak to the accuracy of the taxel positions in the positions files, which are a result of the calibration using force/torque sensing performed by Andrea Del Prete.
I plotted some of them (forearm) in MATLAB and they are indeed not perfect triangles - the positions differ from what you would get by projecting the triangular modules, whose 2D geometry is known; the error is perhaps up to 1 cm per taxel.
The other problem is that this data is not available for other body parts (torso, legs).

As an alternative to the calibration of the type Andrea performed (which becomes more difficult and error-prone for some of the body parts due to the positioning of force/torque sensors), one can use data available from the robot's CAD models. For the palm data, I used the coordinates I got from our mechanical guys to create the positions file - assuming the palm is 2D - that was easy. For the other skin parts that are 3D, the 2D geometry of individual skin patches that cover the robot is known. In addition, certain attachment points of these patches in the 3D CAD model of the robot are also known (midpoints of triangular modules - middle taxel of every triangle). Taken together, these data form constraints that can be used for an optimization algorithm to find the most likely 3D positions of every tactile sensor. However, this is quite laborious.
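To make that idea concrete, here is a minimal Python sketch of the lifting step, under stated assumptions: MODULE_LAYOUT_2D holds placeholder in-plane taxel coordinates (the real values come from the module drawings), the CAD provides the 3D centre point and an outward normal for each module, and the in-plane rotation is the unknown a full optimization would estimate per module.

```python
import numpy as np

# Hypothetical 2D layout (metres) of the taxels of one triangular module,
# in the module's own plane, centre taxel at the origin. The real layout
# must be taken from the skin module drawings.
MODULE_LAYOUT_2D = np.array([
    [0.000,  0.000],    # centre taxel
    [0.006,  0.000],
    [-0.003, 0.0052],
    # ... remaining taxels of the module
])

def place_module(center_3d, normal, in_plane_angle=0.0):
    """Lift the 2D module layout into 3D around a CAD attachment point.

    center_3d      : CAD position of the module's centre taxel.
    normal         : outward surface normal at that point (from CAD).
    in_plane_angle : rotation of the module about its normal; this is one
        of the unknowns an optimization would have to estimate.
    """
    n = np.asarray(normal, float)
    n /= np.linalg.norm(n)
    # Build an orthonormal tangent basis (u, v) for the module plane.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(n @ helper) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(n, helper); u /= np.linalg.norm(u)
    v = np.cross(n, u)
    c, s = np.cos(in_plane_angle), np.sin(in_plane_angle)
    xy = MODULE_LAYOUT_2D @ np.array([[c, s], [-s, c]])  # rotate in-plane
    return np.asarray(center_3d) + xy[:, :1] * u + xy[:, 1:] * v
```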

@MoritzN89 , can we ask for more information about you and your project? Perhaps we could see how we could jointly proceed forward in this direction.
Equipping the Gazebo simulator with the skin would also be of interest to us.

cheers,
Matej Hoffmann

@lornat75
Member

lornat75 commented Feb 3, 2015

I agree with @matejhof. On the other hand, if we distribute the same number of taxels as on the real robot over the surfaces of the covers, we can still get a very useful simulation even if the triangles are not precise... so I would prioritize getting a realistic software interface rather than striving to get exactly the same positioning as the real robot.

@iron76
Contributor

iron76 commented Feb 3, 2015

@matejhof has a point when saying that the procedure in http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6094896 is not accurate enough. I have played with that procedure when calibrating both feet (see results in http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6943124). It is quite an elaborate procedure with poor results. As pointed out again by @matejhof, we should extract as much data as possible from the CAD. An adjustment (i.e. calibration) can then be performed on this preliminary positioning, trying to be realistic about the variables to be optimised (e.g. rotation and displacement of the skin patches).

In my opinion, this can be done in two different ways.

  • manually: volunteers?
  • automatically: with @traversaro we are considering automatic procedures to extract URDF models from CAD, including sensors. At present we are only considering accelerometers, gyros and force/torque sensors. Skin is not a priority, but we might include it as well.

@alecive
Member

alecive commented Feb 3, 2015

Regarding the manual option, we (@matejhof and I) talked with Alberto Parmiggiani, and he said that for a skilled CAD user it would have these costs:

  • a couple of hours to retrieve the taxel positions for the centre of each triangle (the other 9 taxels would still be left out, but it would still be a noticeable improvement)
  • a couple of days to retrieve their normals w.r.t. the covers. This might not seem that useful at first glance, but it is (e.g. it is something without which my software does not work 😜)

@pattacini
Member

Two days?! Oh my goodness! :)
Maybe they don't know all the magic behind the software :)

@alecive
Member

alecive commented Feb 3, 2015

Probably :)

@traversaro
Member

Let me add that there is no conflict between manually and automatically extracting this skin information.
Even if we extract this information manually, adding appropriate reference frames to the model (or perhaps the simplified model, see https://www.icub.org/wiki/Creo_Mechanism_to_URDF for more information) would also drastically simplify automatic extraction of this information. For example, having the iKin link reference frames in the CAD model would greatly simplify sensor extraction.

@matejhof
Contributor

matejhof commented Feb 4, 2015

I just re-stumbled over this abandoned iCubSkinGUI3D: http://eris.liralab.it/iCub/contrib/dox/html/group__icub__iCubSkinGUI3D.html by Eric Sauser from EPFL. Has anybody seen it working, or does anybody know how to operate it? Perhaps this is relevant (perhaps not).

@matejhof
Contributor

While playing with the positions files we have from Andrea Del Prete's method for forearms and upper arms, I noticed some new facts:

  1. They were done on an older version of the iCub skin - the one with 12-taxel triangles (confirmed by Andrea). The triangles there had the same size, but the individual taxel arrangement was different. At least the majority of triangles had the same orientation w.r.t. the new skin versions.
  2. For the left vs. right files, the positions and orientations are the same - just remapped into the different frame-of-reference orientation. So calibration was probably performed on only one half of the body.
  3. The triangle midpoints should roughly match the holes in the covers where they are attached. The coordinates of these are available in CAD. When plotted against the positions files, there are systematic offsets - e.g. for the small patch on top of the forearm, the CAD data has the triangle centres shifted approximately 1 cm distally.

So the positions files should be used with care - the taxels of each triangle are more like a scrambled point cloud that may also be systematically shifted (corresponding to the picture posted by @MoritzN89 above). My estimate - assuming the CAD data is correct - is that individual taxels can be off by 1, sometimes up to 2 cm.
(This is not true for the palm, which was extracted directly from CAD.)

At the same time, we now have some data from the CAD and some knowledge on how to formulate an optimization problem that could be used to recalibrate the skin. So, volunteer needed! :)
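As a first step towards such a recalibration, one could quantify the systematic shift by rigidly aligning the triangle centres from the positions files with their CAD counterparts. A minimal Python sketch (Kabsch alignment without scaling), assuming two matched (N, 3) arrays of centres are already available:

```python
import numpy as np

def systematic_offset(calibrated_centres, cad_centres):
    """Estimate the rigid transform (R, t) that best maps calibrated triangle
    centres onto their CAD counterparts (Kabsch / Procrustes, no scaling).

    Both inputs are (N, 3) arrays with matching row order; the residuals
    after alignment indicate how scrambled the individual positions are.
    """
    A = np.asarray(calibrated_centres, float)
    B = np.asarray(cad_centres, float)
    a0, b0 = A.mean(axis=0), B.mean(axis=0)
    H = (A - a0).T @ (B - b0)               # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = b0 - R @ a0
    residuals = np.linalg.norm((A @ R.T + t) - B, axis=1)
    return R, t, residuals
```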

@pattacini
Member

Hi @matejhof

After a year, has there been any progress on this?

@traversaro
Member

Actually, yes. :D
But we can open a new issue to track the progress of this activity.

@matejhof
Contributor

matejhof commented Sep 2, 2016

Yes, in fact the progress is happening at this very moment with @traversaro and @fiorisi.
We will update this very soon.

@matejhof
Contributor

New forearm V2 files coming out of the new pipeline using CAD (thanks to @traversaro and @fiorisi) have been committed here: 7d81ded
The optimization framework, thanks to @traversaro, is here: https://github.com/robotology-playground/icub-model-generator
