Thoughts about path-planning within roboticslab-uc3m #40
Comments
Regarding trajectories in Cartesian space, new related issues for keeping close integration: roboticslab-uc3m/kinematics-dynamics#134 and roboticslab-uc3m/kinematics-dynamics#135
Regarding trajectories in joint space, just a reminder that we have some tools in the, well, tools repository. Namely, as commented here:
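Purely as a generic illustration (this is not the tools repository's API, and the values are made up): a point-to-point joint-space move is typically built from a velocity profile, for instance Orocos KDL's trapezoidal profile, sampled at the control period.

```cpp
#include <iostream>
#include <kdl/velocityprofile_trap.hpp>

int main()
{
    // Hypothetical single-joint limits: 1.0 rad/s max velocity, 2.0 rad/s^2 max acceleration.
    KDL::VelocityProfile_Trap profile(1.0, 2.0);

    // Plan a point-to-point move from 0.0 rad to 1.5 rad.
    profile.SetProfile(0.0, 1.5);

    // Sample position and velocity along the resulting trajectory.
    const double duration = profile.Duration();

    for (double t = 0.0; t <= duration; t += 0.1)
    {
        std::cout << t << "\t" << profile.Pos(t) << "\t" << profile.Vel(t) << "\n";
    }

    return 0;
}
```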
See also: https://github.com/personalrobotics/aikido.
Not exactly path-planning, but kinda related: https://github.com/robotology/navigation.
I think the demo developed by @elisabeth-ms has some bits of path planning. It's a DL-based object detection app for grabbing stuff with one of TEO's arms.
Cool, nice catch! I'm totally seeing some OMPL at https://github.com/elisabeth-ms/teo-sharon/blob/cfc3a62270e130d0f3a8a8418c18b1a901508bea/programs/TrajectoryGeneration/TrajectoryGeneration.hpp#L17-L24 in addition to the KDL and FCL code. Thanks!
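For context, a stripped-down version of that kind of OMPL usage looks roughly like the sketch below. This is not the code from teo-sharon, just a minimal geometric-planning example with OMPL's SimpleSetup and RRTConnect over an assumed 6-DoF joint space; in the real demo the validity checker is where FCL would be queried for collisions against the arm model built from KDL.

```cpp
#include <iostream>
#include <memory>

#include <ompl/base/spaces/RealVectorStateSpace.h>
#include <ompl/geometric/SimpleSetup.h>
#include <ompl/geometric/planners/rrt/RRTConnect.h>

namespace ob = ompl::base;
namespace og = ompl::geometric;

// Stand-in validity checker: a real planner would call FCL here to test
// the arm (e.g. a model built from KDL) against the environment.
bool isStateValid(const ob::State * /*state*/)
{
    return true; // sketch only: treat every state as collision-free
}

int main()
{
    // Assumed 6-DoF joint space with symmetric, made-up joint limits.
    auto space = std::make_shared<ob::RealVectorStateSpace>(6);
    ob::RealVectorBounds bounds(6);
    bounds.setLow(-3.14);
    bounds.setHigh(3.14);
    space->setBounds(bounds);

    og::SimpleSetup ss(space);
    ss.setStateValidityChecker(isStateValid);

    // Made-up start and goal joint configurations.
    ob::ScopedState<> start(space), goal(space);
    for (int i = 0; i < 6; ++i) { start[i] = 0.0; goal[i] = 1.0; }
    ss.setStartAndGoalStates(start, goal);

    ss.setPlanner(std::make_shared<og::RRTConnect>(ss.getSpaceInformation()));

    if (ss.solve(1.0)) // one-second planning budget
    {
        ss.simplifySolution();
        ss.getSolutionPath().printAsMatrix(std::cout); // one state per row
    }

    return 0;
}
```

The interesting part in practice is the validity checker, which is where the collision-checking (FCL) and kinematic model (KDL) plug in.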
See also this grasping demo featuring the iCub: robotology/community#573.
Moar grasping straight from the ongoing Nvidia GTC AI conference (thanks @imontesino):
Thoughts about path-planning within roboticslab-uc3m, and considerations on creating a path-planning repository within the organization.
Considerations before blindly doing this:
As seen, the above candidates already have their place. Therefore, my recommendations are the following: