
Python bindings for MathematicalProgram #4771

Closed

rdeits opened this issue Jan 12, 2017 · 32 comments

Comments

@rdeits
Contributor

rdeits commented Jan 12, 2017

I was investigating today what it would take to make MathematicalProgram usable from Python/Matlab. So far, I haven't been able to wrap decision_variable.h because of its use of template alias (using) declarations (like template <int rows> using VectorDecisionVariable = MatrixDecisionVariable<rows, 1>;).

Mainline SWIG added support for these directives in 3.0.11, which came out in December 2016, but the SWIG + matlab fork we're using doesn't have those changes.

It might be possible to pull in the changes from mainline SWIG and get this working.

Of course, this is all moot if the SWIG code is going to be thrown away soon. What's the status of that?

@hongkai-dai
Contributor

hongkai-dai commented Jan 12, 2017

Does swig work with typedef instead of using? I can change it to

template<int rows>
typedef MatrixDecisionVariable<rows, 1> VectorDecisionVariable;

How does swig work with eigen_autodiff_types.h? It uses the same using directives.

@rdeits
Contributor Author

rdeits commented Jan 12, 2017

Making that change produces 35,841 lines of error messages, the first of which is:

/usr/local/bin/c++   -DEIGEN_MPL2_ONLY -DHAVE_SPDLOG -DdrakeOptimization_EXPORTS -I/home/rdeits/locomotion/drake-distro/drake/.. -I/home/rdeits/locomotion/drake-distro/build/install/lib/cmake/nlopt/../../../include/nlopt -isystem /home/rdeits/locomotion/drake-distro/build/install/include/eigen3 -isystem /home/rdeits/locomotion/drake-distro/build/install/include -isystem /home/rdeits/locomotion/drake-distro/build/install/include/ipopt/coin -isystem /home/rdeits/locomotion/drake-distro/build/install/include/ipopt/coin/ThirdParty -Werror=all -Werror=ignored-qualifiers -Werror=overloaded-virtual -Werror=extra -Werror=return-local-addr -Wno-unused-parameter -Wno-missing-field-initializers -O3 -DNDEBUG -fPIC   -std=gnu++14 -MMD -MT solvers/CMakeFiles/drakeOptimization.dir/decision_variable.cc.o -MF solvers/CMakeFiles/drakeOptimization.dir/decision_variable.cc.o.d -o solvers/CMakeFiles/drakeOptimization.dir/decision_variable.cc.o -c /home/rdeits/locomotion/drake-distro/drake/solvers/decision_variable.cc
In file included from /home/rdeits/locomotion/drake-distro/drake/solvers/decision_variable.cc:1:0:
/home/rdeits/locomotion/drake-distro/drake/../drake/solvers/decision_variable.h:18:1: error: template declaration of ‘typedef’
 typedef Eigen::Matrix<drake::symbolic::Variable, rows, cols> MatrixDecisionVariable;
 ^
/home/rdeits/locomotion/drake-distro/drake/../drake/solvers/decision_variable.h:20:1: error: template declaration of ‘typedef’
 typedef MatrixDecisionVariable<rows, 1> VectorDecisionVariable;
 ^
/home/rdeits/locomotion/drake-distro/drake/../drake/solvers/decision_variable.h:20:9: error: ‘MatrixDecisionVariable’ does not name a type
 typedef MatrixDecisionVariable<rows, 1> VectorDecisionVariable;
         ^

I think you can't typedef inside a template, at least not in that way 😉

We haven't seen this with eigen_autodiff_types.h only because we've never swig-wrapped that file.

@rdeits
Contributor Author

rdeits commented Jan 12, 2017

I've pulled in the 3.0.11 changes from upstream here: https://github.com/rdeits/swig/tree/rd-merge-upstream and nothing exploded. That might actually be a good solution, and doesn't require any changes to your code.

@rdeits
Contributor Author

rdeits commented Jan 12, 2017

The more fundamental issue remains, however: should I even bother? This will be a substantial effort, which might just get tossed out when we move away from SWIG.

@jamiesnape
Contributor

@mwoehlke-kitware should start working on the Boost::Python port next week.

@rdeits
Contributor Author

rdeits commented Jan 14, 2017

I guess I won't bother, then.

@RussTedrake
Contributor

Have been thinking a bit more, and it's becoming clear to me just how enabling it would be if we could get python bindings working for MathematicalProgram. In addition to @rdeits , I could see @patmarion and others using these for working with point cloud perception right in director. MathematicalProgram is at a point where having more (expert) users could make it get a lot better fast. @hongkai-dai and @soonho-tri are adding features fast, but they've got a long list!

@jwnimmer-tri , @david-german-tri , @jamiesnape , et al -- what's the best way to make this happen?

@RussTedrake RussTedrake reopened this Jan 21, 2017
@RussTedrake RussTedrake changed the title swig_matlab can't wrap MathematicalProgram Python bindings for MathematicalProgram Jan 21, 2017
@RussTedrake
Contributor

FWIW - there would be a ton of value in getting the bindings working for MATLAB, too. @mposa and others might start using it. That might be a strong argument for swig.

@jwnimmer-tri
Collaborator

Maybe the Boost::Python spike should be redirected to creating new python bindings for MathematicalProgram, instead of backend-swapping the current pydrake APIs? We could choose to wrap only the APIs that we feel are more stabilized, for now.

@david-german-tri
Contributor

Summary of conversation elsewhere:

The swig MATLAB bindings have no users and @RussTedrake has decided we don't need to continue support for them. @rdeits will spike-test Python bindings for MathematicalProgram using SWIG 3.0.11, and will also nuke the MATLAB swig outputs. If the spike test works well, we have the option to replace swig-matlab with modern upstream SWIG (most likely as a binary dependency), or to use Boost::Python.

@mwoehlke-kitware: As a user of MathematicalProgram, @rdeits knows exactly the API he wants to expose, and plans to have the SWIG bindings done ASAP. Once they're available, we should focus Boost::Python effort on replicating them. In the meantime, the existing bindings also matter, and will doubtless teach us something, so you can proceed with them as planned.

Sounds good? Consistent with everyone's understanding?

@jamiesnape
Contributor

we have the option to replace swig-matlab with modern upstream SWIG (most likely as a binary dependency)

SWIG 3.0.2 is in trusty-backports, SWIG 3.0.8 is in xenial, which makes life easier.

@rdeits
Contributor Author

rdeits commented Jan 23, 2017

Annoyingly, only SWIG 3.0.11 supports the using alias syntax, which is used in MathematicalProgram.

@jamiesnape
Contributor

That is very unfortunate. I retract my "makes life easier" comment.

@rdeits
Contributor Author

rdeits commented Jan 24, 2017

Brief update on this:

Basically, I'm stuck on how to handle the Eigen Vectors of decision variables (which, ironically, I was a strong proponent of in the first place). Nothing I've tried has gotten SWIG to even try to generate proper bindings for VectorXDecisionVariable. The best I can get is a bare pointer that just leaks memory.

I decided to also give pybind11 a try, just to see how it was. Handling Eigen matrices with numeric scalar types is amazingly easy in pybind11 (as in, took basically no effort instead of the days that were required to get SWIG working). Unfortunately, their support for Eigen types with non-numeric arguments seems to be weaker. So far, I haven't gotten it working.

I'm going to keep playing with both approaches.

For reference, the particular API I'm trying to wrap is basically what's used in solvers/test/mixed_integer_optimization_test.cc. That is, I'd like to be able to do something like:

import numpy as np
import pydrake.mathematicalprogram as mp

prog = mp.MathematicalProgram()
x = prog.NewBinaryVariables(3, "x")
c = np.array([-1.0, -1.0, -2.0])
prog.AddLinearCost(c)
a1 = np.array([[1.0, 2.0, 3.0]])
prog.AddLinearConstraint(a1, -np.inf, 4)

etc.

Anyway, not giving up quite yet, but wrapping Eigen with user-defined scalar types has turned out to be difficult.

@rdeits
Contributor Author

rdeits commented Jan 24, 2017

Follow-up: from discussion with one of the pybind11 devs, automagic wrapping of something like Eigen::Matrix<symbolic::Variable, -1, -1> is unlikely to work on its own. We can still return and use instances of that type in python, but they'll act as opaque objects rather than magically transforming into numpy arrays. Fortunately, we can implement our own methods for that opaque type and essentially re-create the subset of the matrix API that we need (realistically, not much more than basic math and indexing).

@rdeits
Contributor Author

rdeits commented Jan 24, 2017

The conclusion is probably similar in SWIG: magic wrapping of Matrix to and from Numpy arrays is unlikely to work, but we can create a wrapper and implement the API ourselves. This is, in fact, exactly what the swigmake autodiff helpers already do for Eigen autodiff types: https://github.com/rdeits/swigmake/blob/master/swig/common/autodiff.i

@rdeits
Contributor Author

rdeits commented Jan 24, 2017

Does anyone have input/suggestions on wrapping Eigen types like this? Going down the route that autodiff.i uses is possible, but is a pain to maintain and adds a lot of indirection.

@hongkai-dai
Contributor

Unfortunately, their support for Eigen types with non-numeric arguments seems to be weaker.

That is unfortunate; we are using other types as the scalar in Eigen::Matrix, such as Eigen::Matrix<symbolic::Expression, 2, 3>.

@rdeits
Contributor Author

rdeits commented Jan 25, 2017

Yeah, and I still think that's the right thing to do in C++. It just makes this step more challenging 😉

@rdeits
Contributor Author

rdeits commented Jan 25, 2017

Just to be clear, in a perfect world I would be able to take a C++ method that returns a Matrix<Variable...> and call it from python to get, ideally, a Numpy array. And then I would be able to multiply that Numpy array by a constant to get a Numpy array of Expressions, which I could pass to a C++ function expecting Matrix<Expression>. I'm not sure how close to that we can realistically get, but it's nice to have dreams.

@rdeits
Contributor Author

rdeits commented Jan 25, 2017

@patmarion had some useful advice. I had been trying to get the conversion from Matrix to numpy array to happen magically in the binding layer, but it's really not that hard to make that magic happen by just writing some python code. The idea would be:

  • wrap C++ functions that expect and return Eigen::Matrix<Variable> types
  • make Eigen::Matrix<Variable> an opaque type in the wrapper layer (that is, don't automatically try to convert it to a python type)
  • implement a couple of methods like length and indexing for the opaque wrapped type in python
  • in python, take a returned opaque Eigen::Matrix<Variable> and use its length and indexing methods to populate a numpy array with the same underlying data, then return that array

That leaves the option open for a future clever person to come along and figure out all the magic type conversion under the hood without changing the API.
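
To make that concrete, here is a minimal pybind11 sketch of the first three steps. It is illustrative only: the module name, class name, and include path are assumptions rather than the actual Drake binding code, and it assumes Variable itself is bound elsewhere.

#include <utility>

#include <Eigen/Core>
#include <pybind11/pybind11.h>

#include "drake/common/symbolic_variable.h"  // assumed include path

namespace py = pybind11;
using drake::symbolic::Variable;
using VariableMatrix =
    Eigen::Matrix<Variable, Eigen::Dynamic, Eigen::Dynamic>;

PYBIND11_MODULE(_mathematicalprogram, m) {
  // Opaque handle: no automatic conversion to a Python/numpy type.
  // (Assumes the Variable class is registered elsewhere so elements can
  // be returned to Python.)
  py::class_<VariableMatrix>(m, "VariableMatrix")
      .def("rows", [](const VariableMatrix& self) { return self.rows(); })
      .def("cols", [](const VariableMatrix& self) { return self.cols(); })
      .def("__getitem__",
           [](const VariableMatrix& self, std::pair<int, int> ij) {
             // Copy out a single element at (row, col).
             return self(ij.first, ij.second);
           });
}

On the Python side, a small helper would then loop over rows()/cols() and copy the elements into a numpy array with dtype=object; that helper is the piece that a future clever person could replace with automatic conversion without changing the API.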

@rdeits
Contributor Author

rdeits commented Jan 25, 2017

I'm going to try to make a quick prototype of this idea in SWIG and pybind, outside of Drake.

@rdeits
Contributor Author

rdeits commented Jan 26, 2017

Prototype is here: https://github.com/rdeits/mathprog-wrapper-exploration

Pybind11

The summary is that implementing @patmarion's suggestion was really easy in pybind11. A surprising number of things just work. For example, if I overload + for Variables in C++, then I can do (in python):

x = Variable("x")
y = Variable("y")
xy = np.array([x, y])
np.sum(xy)

and get an Expression out, because numpy will internally call the appropriate C++ addition function.
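
For reference, a rough sketch of the binding side that enables this; the module name and include path here are assumptions, but the key point is just exposing the existing C++ operator+ so that numpy's element-wise dispatch lands back in C++:

#include <string>

#include <pybind11/operators.h>
#include <pybind11/pybind11.h>

#include "drake/common/symbolic_expression.h"  // assumed include path

namespace py = pybind11;
using drake::symbolic::Expression;
using drake::symbolic::Variable;

PYBIND11_MODULE(_symbolic, m) {
  // Expression only needs to be registered so it can be returned to Python.
  py::class_<Expression>(m, "Expression");

  py::class_<Variable>(m, "Variable")
      .def(py::init<const std::string&>())
      // Variable + Variable yields an Expression in C++; exposing the
      // operator as __add__ is what lets numpy object arrays (and np.sum)
      // dispatch element-wise addition back into C++.
      .def(py::self + py::self);
}

A complete binding would also cover the mixed Variable/Expression and scalar overloads, but the pattern is the same.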

Integrating it with cmake was also amazingly easy: https://github.com/rdeits/mathprog-wrapper-exploration/blob/master/pybind/CMakeLists.txt#L24

Swig

Implementing it with SWIG was...well...I haven't actually gotten it to work yet. The problem is that in order to generate a wrapper for Matrix<Variable> in SWIG, I have to convince SWIG that Eigen::Matrix is really a thing. But SWIG can't actually parse the Eigen headers. So there's probably some manual type conversion that I'll have to write. Suffice it to say, this is much harder. The cmake configuration is also much more complicated: https://github.com/rdeits/mathprog-wrapper-exploration/blob/master/swig/CMakeLists.txt#L25

In SWIG's defense, the parts of the API that it could wrap, it did automatically, while in pybind11 I had to declare each method that I wanted to be wrapped.

By the way, I haven't been trying Boost::Python simply because I have no experience with it at all. I'm sure it's also a good option, but right now I'm more interested in the general comparison of SWIG vs not-SWIG, with pybind11 being my not-SWIG of choice.

@RussTedrake
Contributor

hoorah!

@david-german-tri
Contributor

david-german-tri commented Jan 26, 2017

This is super exciting, kudos @rdeits! @mwoehlke-kitware, you should read recent developments on this thread if you haven't already.

I decided to also give pybind11 a try, just to see how it was.

Are you planning to continue exploring the SWIG avenue, or is not-SWIG the approach of choice at this point?

Integrating it with cmake was also amazingly easy: https://github.com/rdeits/mathprog-wrapper-exploration/blob/master/pybind/CMakeLists.txt#L24

It looks like others have already worked out how to do it in Bazel as well: bazelbuild/bazel#1475

By the way, I haven't been trying Boost::Python simply because I have no experience with it at all. I'm sure it's also a good option, but right now I'm more interested in the general comparison of SWIG vs not-SWIG, with pybind11 being my not-SWIG of choice.

@mwoehlke-kitware From my notes, you thought pybind11 might have some advantages over Boost::Python. It certainly looks like a clean and simple external, and it avoids even thinking about the Boost battling-versions issues. On the other hand, TRI has internal commitments to Boost::Python, so if we're going to use something else in Drake, it's important to figure out how to make it interoperate.

EDIT: Thinking about it for a moment, though, why would interop be hard? There are Python functions, produced by whatever underlying wrapper technology, and Python code calls them. Right?

@jamiesnape
Contributor

Personally, I like pybind11 best. I would be surprised if there were issues interoperating.

@mwoehlke-kitware
Contributor

By the way, I haven't been trying Boost::Python simply because I have no experience with it at all. I'm sure it's also a good option, but right now I'm more interested in the general comparison of SWIG vs not-SWIG, with pybind11 being my not-SWIG of choice.

From what I can tell, pybind11 is a fork of Boost::Python that has been extracted from the rest of Boost and updated to use modern C++... with one important difference: it has some Eigen wrapping bits built in. We'd apparently need an extra external to get that with Boost::Python.

@mwoehlke-kitware From my notes, you thought pybind11 might have some advantages over Boost::Python.

Due to requiring a C++11 compiler (and C++14 awareness), it does some things much more "cleanly" than Boost::Python (less use of macros, particularly). I'm not sure if Boost::Python supports exporting lambdas as Python functions; pybind11 definitely does. CMake integration with pybind11 is really easy.
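
As a tiny illustration of the lambda point (the module and function names are invented for this example), with a current pybind11 this is all it takes:

#include <pybind11/pybind11.h>

namespace py = pybind11;

PYBIND11_MODULE(example, m) {
  // The C++ lambda becomes a plain Python function: example.add(1, 2) == 3.
  m.def("add", [](int a, int b) { return a + b; },
        "Add two integers using a C++ lambda.");
}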

As discussed in today's meeting, I would guess that Boost::Python code can be converted fairly easily to pybind11. The APIs are very similar, so it "ought" to be a matter of switching includes and changing the namespace prefix on Boost::Python calls. (Possibly even just changing a namespace alias.)

@jwnimmer-tri
Collaborator

FTR all of the above sounds great with me. (I had raised the concern with David about making sure we inter-operate with Boost::Python; I think the above plan is solid and I'm not worried anymore.)

@rdeits
Contributor Author

rdeits commented Jan 26, 2017

I am currently rewriting the IRIS python bindings in pybind11, both in order to fix the build issues like #4914 and to make sure I'm happy with the tool. So far, it's great.

@rdeits
Contributor Author

rdeits commented Jan 27, 2017

https://github.com/rdeits/drake/tree/pybind-rbtree now has all python tests passing using pybind11 bindings instead of SWIG. There's some more work to be done to expose some IK constraints that aren't being tested but that Director probably needs, and to clean up the build, but overall this is really nice.

@rdeits
Contributor Author

rdeits commented Jan 30, 2017

https://github.com/rdeits/drake/tree/pybind-mathprog can now run a very basic mathematicalprogram example from python: https://github.com/rdeits/drake/blob/7bffb37628e62e59ea4181963e70a6bd2bdc5a2b/drake/bindings/python/pydrake/test/testMathematicalProgram.py#L15

prog = mathematicalprogram.MathematicalProgram()
x = prog.NewBinaryVariables(3, "x")
c = np.array([-1.0, -1.0, -2.0])
prog.AddLinearCost(c, x)
a = np.array([[1.0, 2.0, 3.0]])
prog.AddLinearConstraint(a, -np.inf, 4, x)
a2 = np.array([1.0, 1.0])
prog.AddLinearConstraint(a2, 1, np.inf, x[:2])
result = prog.Solve()
self.assertEqual(result, mathematicalprogram.SolutionResult.kSolutionFound)

# Test that we got the right solution for all x
x_expected = np.array([1.0, 0.0, 1.0])
self.assertTrue(np.all(np.isclose(prog.GetSolution(x), x_expected)))

@RussTedrake
Contributor

hoorah!
