
MethodError when using ComponentArrays #924

Closed
amrods opened this issue Jun 1, 2021 · 20 comments

@amrods commented Jun 1, 2021

see here: jonniedie/ComponentArrays.jl#91

using Optim
using ComponentArrays
using UnPack

function rosenbrock(ca)
    @unpack x, y = ca
    (1.0 - x)^2 + 100.0 * (y - x^2)^2
end

ca = ComponentArray{Float64}(x=2.0, y=2.0)
optimize(rosenbrock, ca, Newton())

errors with

ERROR: LoadError: MethodError: no method matching (::NLSolversBase.var"#fg!#45"{typeof(rosenbrock)})(::ComponentVector{Float64}, ::ComponentVector{Float64})
Stacktrace:
 [1] value_gradient!!(obj::TwiceDifferentiable{Float64, ComponentVector{Float64}, ComponentMatrix{Float64}, ComponentVector{Float64}}, x::ComponentVector{Float64})
   @ NLSolversBase ~/.julia/packages/NLSolversBase/geyh3/src/interface.jl:82
 [2] initial_state(method::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, options::Optim.Options{Float64, Nothing}, d::TwiceDifferentiable{Float64, ComponentVector{Float64}, ComponentMatrix{Float64}, ComponentVector{Float64}}, initial_x::ComponentVector{Float64})
   @ Optim ~/.julia/packages/Optim/uwNqi/src/multivariate/solvers/second_order/newton.jl:45
 [3] optimize
   @ ~/.julia/packages/Optim/uwNqi/src/multivariate/optimize/optimize.jl:35 [inlined]
 [4] #optimize#87
   @ ~/.julia/packages/Optim/uwNqi/src/multivariate/optimize/interface.jl:142 [inlined]
 [5] optimize(f::Function, initial_x::ComponentVector{Float64}, method::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, options::Optim.Options{Float64, Nothing}) (repeats 2 times)
   @ Optim ~/.julia/packages/Optim/uwNqi/src/multivariate/optimize/interface.jl:141
 [6] top-level scope
   @ ~/amrods/test.jl:52
in expression starting at /Users/amrods/test.jl:52

This happens only with finite differencing; with autodiff=:forward the error is avoided:

julia> optimize(rosenbrock, ca, Newton(); autodiff=:forward)
 * Status: success

 * Candidate solution
    Final objective value:     0.000000e+00

 * Found with
    Algorithm:     Newton's Method

 * Convergence measures
    |x - x'|               = 3.77e-09 ≰ 0.0e+00
    |x - x'|/|x'|          = 3.77e-09 ≰ 0.0e+00
    |f(x) - f(x')|         = 3.65e-18 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = Inf ≰ 0.0e+00
    |g(x)|                 = 0.00e+00 ≤ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    15
    f(x) calls:    45
    ∇f(x) calls:   45
    ∇²f(x) calls:  15
@amrods (Author) commented Jun 1, 2021

It only happens with methods requiring a Hessian: Newton(), NewtonTrustRegion(), IPNewton().
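
For contrast, a quick sketch using the same MWE as above (per this report; BFGS is used only as a representative gradient-only method and was not part of the original example):

optimize(rosenbrock, ca, BFGS())      # gradient-only: runs with finite differences
optimize(rosenbrock, ca, Newton())    # Hessian-based: hits the MethodError above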

@longemen3000 (Contributor) commented Jun 1, 2021

The error seems to stem from here: https://github.com/JuliaNLSolvers/NLSolversBase.jl/blob/62d2199f70cce78479658da08d346bbc060fc2f7/src/objective_types/twicedifferentiable.jl#L116. A fix would be to relax that type annotation in NLSolversBase.
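
For illustration, a minimal hypothetical sketch of this class of failure (the function names below are made up; this is not the actual NLSolversBase code):

using ComponentArrays

ca = ComponentArray(x = 2.0, y = 2.0)

too_strict(g::Vector{Float64}, x::Vector{Float64}) = sum(abs2, x)
# too_strict(similar(ca), ca)    # MethodError: a ComponentVector is not a Vector{Float64}

relaxed(g::AbstractVector{<:Number}, x::AbstractVector{<:Number}) = sum(abs2, x)
relaxed(similar(ca), ca)         # works: ComponentVector{Float64} <: AbstractVector{<:Number}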

@amrods (Author) commented Jun 1, 2021

Should those types be changed to AbstractVector and AbstractMatrix?
(How did you track down that line? I'm bad at tracing these errors deep inside packages.)
Edit: no, since ComponentArray <: Union{AbstractVector, AbstractMatrix} is false.

@longemen3000 (Contributor) commented:
I did a PR to NLSolversBase.jl once, so I had a rough idea 😅. I checked the FiniteDiff.jl package and they accept AbstractVector{<:Number} and AbstractMatrix{<:Number} (but the best course of action would be to eliminate that type parameter). I can make a PR tomorrow.

@amrods (Author) commented Jun 1, 2021

but ComponentArray <: Union{AbstractVector, AbstractMatrix} is false.

@longemen3000 (Contributor) commented Jun 1, 2021

That's a little subtlety of the type system: ComponentArray is the unparameterized type covering ComponentArrays of every element type and every dimension, whereas AbstractVector covers all AbstractArrays of dimension 1 (a vector). Same with AbstractMatrix. ComponentArrays has an alias for 1-dimensional ComponentArrays, ComponentVector, and I think that ComponentVector <: AbstractVector.
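
A quick REPL check of the subtyping claims above (with ComponentArrays loaded):

julia> ComponentArray <: Union{AbstractVector, AbstractMatrix}
false

julia> ComponentVector <: AbstractVector
true

julia> ComponentVector{Float64} <: AbstractVector{<:Number}
true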

@amrods (Author) commented Jun 7, 2021

A similar thing happens with ParticleSwarm; somewhere the ComponentArray gets converted to an Array:

using Optim
using ComponentArrays
using UnPack

function rosenbrock(ca)
    @unpack x, y = ca
    (1.0 - x)^2 + 100.0 * (y - x^2)^2
end

ca = ComponentArray{Float64}(x=2.0, y=2.0)
optimize(rosenbrock, ca, ParticleSwarm())

ERROR: LoadError: type Array has no field x
Stacktrace:
  [1] getproperty(x::Vector{Float64}, f::Symbol)
    @ Base ./Base.jl:33
  [2] unpack
    @ ~/.julia/packages/UnPack/EkESO/src/UnPack.jl:34 [inlined]
  [3] macro expansion
    @ ~/.julia/packages/UnPack/EkESO/src/UnPack.jl:101 [inlined]
  [4] rosenbrock(ca::Vector{Float64})
    @ Main ~/test.jl:6
  [5] value(obj::NonDifferentiable{Float64, ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(x = 1, y = 2)}}}}, x::Vector{Float64})
    @ NLSolversBase ~/.julia/packages/NLSolversBase/geyh3/src/interface.jl:19
  [6] initial_state(method::ParticleSwarm{Any}, options::Optim.Options{Float64, Nothing}, d::NonDifferentiable{Float64, ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(x = 1, y = 2)}}}}, initial_x::ComponentVector{Float64})
    @ Optim ~/.julia/packages/Optim/uwNqi/src/multivariate/solvers/zeroth_order/particle_swarm.jl:155
  [7] optimize
    @ ~/.julia/packages/Optim/uwNqi/src/multivariate/optimize/optimize.jl:35 [inlined]
  [8] optimize(f::Function, initial_x::ComponentVector{Float64}, method::ParticleSwarm{Any}, options::Optim.Options{Float64, Nothing}; inplace::Bool, autodiff::Symbol)
    @ Optim ~/.julia/packages/Optim/uwNqi/src/multivariate/optimize/interface.jl:142
  [9] optimize(f::Function, initial_x::ComponentVector{Float64}, method::ParticleSwarm{Any}, options::Optim.Options{Float64, Nothing}) (repeats 2 times)
    @ Optim ~/.julia/packages/Optim/uwNqi/src/multivariate/optimize/interface.jl:141
 [10] top-level scope
    @ ~/test.jl:11
in expression starting at /Users/amrods/test.jl:11

@longemen3000 (Contributor) commented:
That last problem is in Optim.jl, not in NLSolversBase.jl. The ParticleSwarm struct is restricted to Vector.

@amrods (Author) commented Jun 7, 2021

here?
https://github.com/JuliaNLSolvers/Optim.jl/blob/master/src/multivariate/solvers/zeroth_order/particle_swarm.jl#L1

@amrods (Author) commented Jun 7, 2021

There is also an issue on line 36, where the keyword arguments default to empty vectors []. How can I make an empty AbstractVector?

@amrods (Author) commented Jun 7, 2021

As far as I remember, one cannot make an instance of an abstract type, no?

@longemen3000 (Contributor) commented:
Nope, you can't. The main problem is: how can ParticleSwarm() know about the vector type used for x0? I opened a new issue for this specific case.
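
For illustration (base Julia only; lower here is just a hypothetical variable, not Optim's actual field):

x0 = [2.0, 2.0]

# An abstract type such as AbstractVector has no constructor, so it cannot be
# instantiated directly. An empty concrete vector matching x0 can be built from
# x0 itself, but only once x0 is available:
lower = similar(x0, 0)    # 0-element Vector{Float64}

The catch, as noted above, is that ParticleSwarm() is constructed before it ever sees x0, so the defaults in the method struct cannot be derived from it.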

@amrods (Author) commented Jun 7, 2021

Pardon my ignorance, I'm trying to learn.

The main problem is: how can ParticleSwarm() know about the vector type used for x0?

That is, so it can initialize lower and upper on line 36?
https://github.com/JuliaNLSolvers/Optim.jl/blob/master/src/multivariate/solvers/zeroth_order/particle_swarm.jl#L36

@longemen3000 (Contributor) commented:
As far as I understand, yes, but maybe it's not necessary. I was reading the code, and the ParticleSwarm struct initializes a state; maybe if the lower and upper variables were moved to that state, the problem could be avoided. But it's a conjecture until I have some time to test it on my machine.

@pkofod (Member) commented Jun 8, 2021

As far as I understand, yes, but maybe it's not necessary. I was reading the code, and the ParticleSwarm struct initializes a state; maybe if the lower and upper variables were moved to that state, the problem could be avoided. But it's a conjecture until I have some time to test it on my machine.

Unfortunately, ParticleSwarm uses a different lower/upper interface than, for example, Fminbox. It should ideally be optimize(f, x0, lower, upper, ParticleSwarm()).
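
For reference, a sketch of the two existing call styles being contrasted (f, x0, lower, upper are placeholders; the ParticleSwarm keywords follow its documented constructor, and the Fminbox inner optimizer is arbitrary):

# ParticleSwarm: bounds live in the method struct itself
optimize(f, x0, ParticleSwarm(lower = lower, upper = upper, n_particles = 10))

# Fminbox-style: bounds are passed to optimize directly
optimize(f, lower, upper, x0, Fminbox(GradientDescent()))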

@longemen3000 (Contributor) commented:
This should be fixed once a new version of NLSolversBase.jl gets registered.

@pkofod (Member) commented Jul 29, 2021

Yeah, I don't know. JuliaRegistrator doesn't seem to be catching my registration for some reason. It worked for Optim the other day.

@pkofod (Member) commented Aug 13, 2021

Should be up to date now: JuliaRegistries/General#41797

@JTaets commented Aug 27, 2022

I'm getting similar problems on the latest version; is there any update on this?
MWE:

using Optimization, OptimizationOptimJL, ForwardDiff, ComponentArrays
rosenbrock(x, p) = (p.p1 - x.x1)^2 + p.p2 * (x.x1 - x.x2^2)^2
cons(res, x, p) = (res .= [x.x1^2+x.x2^2, x.x1*x.x2])

x0 = ComponentArray{Float64}(x1=0;x2=0)
p = ComponentArray{Float64}(p1=1;p2=2)

optprob = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(), cons = cons)
prob = OptimizationProblem(optprob, x0, p, lcons = [-Inf, -1.0], ucons = [0.8, 2.0])
sol = solve(prob, IPNewton())

@pkofod (Member) commented Sep 4, 2022

I'm getting similar problems on the latest version; is there any update on this?

This is a different problem. I will have to see if Matrix is needed or not here.
