issue with pmap: document? #146

Closed
cecileane opened this issue May 20, 2020 · 7 comments
@cecileane

This issue seems related to #12, but here is an example. The NLopt object is part of a type, to be reused multiple times. All is well, unless we want to distribute the work across processors.

using NLopt

struct Foo
  data::Vector{Float64}
  guess::Vector{Float64}
  opt::NLopt.Opt
end
function Foo(data, frel,fabs,maxeval)
  guess = similar(data)
  opt = NLopt.Opt(:LD_SLSQP, 1)
  NLopt.ftol_rel!(opt,frel)
  NLopt.ftol_abs!(opt,fabs)
  NLopt.maxeval!(opt, maxeval)
  return Foo(data,guess,opt)
end

foo = Foo([4.0,1.0,-5.0], .01,.01,5)

function optimizeone!(obj::Foo,i)
    function f(x, g)
        d = (x[1]-obj.data[i])
        objective = d*d
        if length(g) > 0
            g[1] = 2d
        end
        return objective
    end
    opt = obj.opt
    min_objective!(opt, f)
    optf, optx, ret = NLopt.optimize(opt, [obj.guess[i]])
    obj.guess[i] = optx[1]
    return obj.guess[i]
end

for i in 1:3
  optimizeone!(foo,i)
end
foo.guess ≈ foo.data # true

so all is well here. The problem appears when we use multiple processors:

foo = Foo([4.0,1.0,-5.0], .01,.01,5)
using Distributed
addprocs(1) # 1 worker
@everywhere using NLopt
bar = Distributed.pmap(1:3) do i
  optimizeone!(foo,i)
end

I get an error starting with:

ERROR: On worker 2:
UndefVarError: #optimizeone! not defined
deserialize_datatype at /Users/julia/buildbot/worker/package_macos64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:1211

As said in #12, I can serialize the optimization parameters and have each worker create its own Opt object. Still, it would be a nice feature to have, and if not, it would be nice to have it documented.
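The workaround I mean would look something like this: store only plain, serializable parameters in the struct and build the Opt on whichever process runs the optimization. A sketch (`FooParams` and `make_opt` are just illustrative names, not part of NLopt; the starting point `0.0` is also an assumption):

```julia
using Distributed
addprocs(1)
@everywhere using NLopt

# Only plain, serializable fields; never ship an NLopt.Opt between processes.
@everywhere struct FooParams
    data::Vector{Float64}
    frel::Float64
    fabs::Float64
    maxeval::Int
end

@everywhere function make_opt(p::FooParams)
    opt = NLopt.Opt(:LD_SLSQP, 1)
    NLopt.ftol_rel!(opt, p.frel)
    NLopt.ftol_abs!(opt, p.fabs)
    NLopt.maxeval!(opt, p.maxeval)
    return opt
end

@everywhere function optimizeone(p::FooParams, i)
    opt = make_opt(p)  # fresh Opt on the worker that runs this call
    function f(x, g)
        d = x[1] - p.data[i]
        length(g) > 0 && (g[1] = 2d)
        return d * d
    end
    min_objective!(opt, f)
    optf, optx, ret = NLopt.optimize(opt, [0.0])
    return optx[1]
end

p = FooParams([4.0, 1.0, -5.0], 0.01, 0.01, 5)
pmap(i -> optimizeone(p, i), 1:3)  # each entry close to p.data[i]
```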

@odow
Member

odow commented May 21, 2020

The error is not related to NLopt. Here is the relevant section of the Julia manual: https://docs.julialang.org/en/v1/manual/parallel-computing/#code-availability-1
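The manual's point can be seen without NLopt at all; a minimal sketch using only the Distributed standard library:

```julia
using Distributed
addprocs(1)  # one worker

square(x) = x^2      # defined only on the master process
# pmap(square, 1:3)  # would fail on the worker with an UndefVarError

@everywhere cube(x) = x^3  # defined on every process
pmap(cube, 1:3)            # [1, 8, 27]
```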

To avoid issues with serializing the NLopt.Opt, you probably want to construct it on each process.

@everywhere function optimizeone!(i)
    foo = Foo([4.0,1.0,-5.0], .01,.01,5)
    # ...
    return obj.guess
end
pmap(optimizeone!, 1:3)
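One way to complete that sketch without the `Foo` struct at all (the flat data and starting point here are assumptions, just to make it self-contained):

```julia
using Distributed
addprocs(1)
@everywhere using NLopt

@everywhere function optimizeone(i)
    # Build the Opt on whichever process runs this call, so the opaque
    # NLopt handle is never serialized between processes.
    data = [4.0, 1.0, -5.0]
    opt = NLopt.Opt(:LD_SLSQP, 1)
    NLopt.ftol_rel!(opt, 0.01)
    NLopt.ftol_abs!(opt, 0.01)
    NLopt.maxeval!(opt, 5)
    function f(x, g)
        d = x[1] - data[i]
        length(g) > 0 && (g[1] = 2d)
        return d * d
    end
    min_objective!(opt, f)
    optf, optx, ret = NLopt.optimize(opt, [0.0])
    return optx[1]
end

pmap(optimizeone, 1:3)  # each entry close to data[i]
```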

@cecileane
Author

Oh yes, duh!! I am sorry, I read the error too fast. (It's my attempt to make a small example; in real life I'm running into this problem with things inside a module.) So when I start julia with `julia -p 1`, say, and put `@everywhere` in front of almost everything above, the error looks like this:

      From worker 2:	signal (11): Segmentation fault: 11
      From worker 2:	in expression starting at none:0
      From worker 2:	nlopt_get_errmsg at /Users/ane/.julia/artifacts/a98e645e32bc5ee2510d3fbf404e13941a7e2aa2/lib/libnlopt.0.10.0.dylib (unknown line)
      From worker 2:	errmsg at /Users/ane/.julia/packages/NLopt/KTzQh/src/NLopt.jl:198 [inlined]
      From worker 2:	_errmsg at /Users/ane/.julia/packages/NLopt/KTzQh/src/NLopt.jl:203
      From worker 2:	chk at /Users/ane/.julia/packages/NLopt/KTzQh/src/NLopt.jl:211
      From worker 2:	min_objective! at /Users/ane/.julia/packages/NLopt/KTzQh/src/NLopt.jl:406 [inlined]
      From worker 2:	optimizeone! at ./REPL[5]:11
      From worker 2:	#4 at ./REPL[9]:2
      From worker 2:	unknown function (ip: 0x109cc37a7)
      ...

To replicate this:

@everywhere using NLopt

@everywhere struct Foo
  data::Vector{Float64}
  guess::Vector{Float64}
  opt::NLopt.Opt
end
@everywhere function Foo(data, frel,fabs,maxeval)
  guess = similar(data)
  opt = NLopt.Opt(:LD_SLSQP, 1)
  NLopt.ftol_rel!(opt,frel)
  NLopt.ftol_abs!(opt,fabs)
  NLopt.maxeval!(opt, maxeval)
  return Foo(data,guess,opt)
end
@everywhere function optimizeone!(obj::Foo,i)
    function f(x, g)
        d = (x[1]-obj.data[i])
        objective = d*d
        if length(g) > 0
            g[1] = 2d
        end
        return objective
    end
    opt = obj.opt
    min_objective!(opt, f)
    optf, optx, ret = NLopt.optimize(opt, [obj.guess[i]])
    obj.guess[i] = optx[1]
    return obj.guess[i]
end

foo = Foo([4.0,1.0,-5.0], .01,.01,5) # or @everywhere, same error
bar = Distributed.pmap(1:3) do i
  optimizeone!(foo,i)
end
(@v1.4) pkg> status
Status `~/.julia/environments/v1.4/Project.toml`
  [76087f3c] NLopt v0.6.0

@odow
Member

odow commented May 21, 2020

You need to follow the second half of my previous comment.

@cecileane
Author

Oh yes, that's what I already did in my package. Documentation about the problem would have been helpful (and would have saved me the time it took to diagnose the issue).

@odow
Member

odow commented May 21, 2020

You can make a PR to improve the documentation by clicking on the pencil icon at the top of the README on the homepage:
https://github.com/JuliaOpt/NLopt.jl/edit/master/README.md

@cecileane
Author

yes! will do when I get the time.

@odow
Member

odow commented Jan 25, 2023

Closing as a duplicate of #186.
