Switch Nonlinear API to JuMP.GenericNonlinearExpr #308

Merged (44 commits) · Sep 15, 2023

Commits (44)
b322901
Update to use NonlinearExpr in src
pulsipher Apr 28, 2023
380fde5
update optimizer_index for constraints
pulsipher Apr 28, 2023
0a48881
Begin updates for docs and tests
pulsipher Apr 29, 2023
5b6c08c
Update datatypes and tests
pulsipher May 4, 2023
23f661f
Add registration tests
pulsipher May 4, 2023
12faa3a
Added more tests
pulsipher May 4, 2023
c3da849
Merge branch 'master' into jump_nlp
pulsipher May 4, 2023
26b01d3
Reimplement all tests
pulsipher May 4, 2023
dbf1063
Update nlp types
pulsipher May 19, 2023
d4615e6
added comment
pulsipher May 22, 2023
21885f0
Update package dependencies
pulsipher Jun 2, 2023
b41bf51
Merge branch 'master' into jump_nlp
pulsipher Jun 2, 2023
0c0518f
incorporate `JuMP.flatten`
pulsipher Jun 21, 2023
cb6dd6a
fix @rewrite
pulsipher Jun 21, 2023
6d3af16
Update w/ JuMP
pulsipher Aug 16, 2023
566fdb1
remove use of `MA.@rewrite`
pulsipher Aug 18, 2023
a8834f7
Update user defined functions
pulsipher Aug 18, 2023
2a5cf06
Update registration tests
pulsipher Aug 18, 2023
1cfb843
`flatten` -> `flatten!`
pulsipher Aug 21, 2023
95f28bb
Update registration tests
pulsipher Aug 21, 2023
c1c73c4
Prepare tests for CI
pulsipher Aug 21, 2023
20828be
Merge branch 'master' into jump_nlp
pulsipher Aug 21, 2023
f37234c
fix syntax bug
pulsipher Aug 21, 2023
32f290d
test fix
pulsipher Aug 21, 2023
1f2245e
minor fixes
pulsipher Aug 21, 2023
1e961da
Minor doc fix
pulsipher Aug 21, 2023
8693a66
minor fix
pulsipher Aug 21, 2023
601f273
Merge branch 'master' into jump_nlp
pulsipher Aug 28, 2023
ded4b40
Update to JuMP changes
pulsipher Aug 28, 2023
976f4ce
fix `build_measure`
pulsipher Aug 28, 2023
bb17830
Minor fixes
pulsipher Aug 28, 2023
c09698d
fix tests
pulsipher Aug 28, 2023
76be615
Update to JuMP's latest release
pulsipher Aug 31, 2023
7e8c4d8
Fix doctests
pulsipher Aug 31, 2023
87c4324
Another doctest fix
pulsipher Aug 31, 2023
ac40507
Fix header link
pulsipher Aug 31, 2023
2711001
Update for NonlinearOperator change (#321)
odow Sep 8, 2023
5ffb6ac
test fix
pulsipher Sep 8, 2023
5d18d6f
Add AST mapper
pulsipher Sep 14, 2023
77779f9
Update dep test
pulsipher Sep 14, 2023
2fe7739
fix tests
pulsipher Sep 14, 2023
7dbc57c
minor fixes
pulsipher Sep 14, 2023
66a784d
minor test fix
pulsipher Sep 14, 2023
38405b7
update to JuMP 1.15
pulsipher Sep 15, 2023
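
For context, here is a brief sketch of the JuMP API change this PR migrates to: the legacy `@NL` macros plus `register` versus the `GenericNonlinearExpr`-based interface introduced in JuMP 1.15. The variable and function names below are illustrative and not taken from this PR.

```julia
using JuMP

# Legacy nonlinear interface: user functions must be registered and nonlinear
# expressions only exist inside the dedicated @NL* macros.
legacy_model = Model()
@variable(legacy_model, x)
my_f(a) = a^2 + 1
register(legacy_model, :my_f, 1, my_f; autodiff = true)
@NLobjective(legacy_model, Min, my_f(x) + sin(x))

# JuMP >= 1.15: nonlinear expressions are first-class `GenericNonlinearExpr`
# objects, so the standard macros and user-defined operators work directly.
new_model = Model()
@variable(new_model, y)
@operator(new_model, op_my_f, 1, my_f)
@objective(new_model, Min, op_my_f(y) + sin(y))
nl = sin(y) + y^2  # a GenericNonlinearExpr (alias `NonlinearExpr`)
```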

Files changed
Project.toml: 11 changes (3 additions, 8 deletions)
@@ -4,33 +4,28 @@ authors = ["Joshua Pulsipher and Weiqi Zhang"]
version = "0.5.8"

[deps]
AbstractTrees = "1520ce14-60c1-5f80-bbc7-55ef81b5835c"
DataStructures = "864edb3b-99cc-5e75-8d2d-829cb0a9cfe8"
Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
FastGaussQuadrature = "442a2c76-b920-505d-bb47-c5924d526838"
JuMP = "4076af6c-e467-56ae-b986-b466b2749572"
LeftChildRightSiblingTrees = "1d6d02ad-be62-4b6b-8a6d-2f90e265016e"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
MutableArithmetics = "d8a4904e-b15c-11e9-3269-09a3773c0cb0"
Reexport = "189a3867-3050-52da-a836-e630ba90ab69"
SpecialFunctions = "276daf66-3868-5448-9aa4-cd146d93841b"

[compat]
AbstractTrees = "0.4"
DataStructures = "0.14.2 - 0.18"
Distributions = "0.19 - 0.25"
FastGaussQuadrature = "0.3.2 - 0.4, 0.5"
JuMP = "1.2"
LeftChildRightSiblingTrees = "0.2"
JuMP = "1.5"
MutableArithmetics = "1"
Reexport = "0.2, 1"
SpecialFunctions = "0.8 - 0.10, 1, 2"
julia = "^1.6"

[extras]
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
Suppressor = "fd094767-a336-5f1f-9728-57cf17d0bbfb"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"

[targets]
test = ["Test", "Random", "Suppressor"]
test = ["Test", "Random", "Suppressor", "Pkg"]
docs/Project.toml: 9 changes (5 additions, 4 deletions)
@@ -10,15 +10,16 @@ Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
SpecialFunctions = "276daf66-3868-5448-9aa4-cd146d93841b"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"

[compat]
HiGHS = "1"
Distributions = "0.25"
Documenter = "0.27"
InfiniteOpt = "0.5"
JuMP = "^1.11.1"
Ipopt = "1"
Ipopt = "1.4"
HiGHS = "1"
julia = "1.6"
JuMP = "1.15"
Literate = "2.14"
Plots = "1"
julia = "1.6"
SpecialFunctions = "2"
docs/src/develop/extensions.md: 51 changes (15 additions, 36 deletions)
@@ -871,23 +871,19 @@ function _make_expression(
return _make_expression(opt_model, measure_function(expr))
end
# AffExpr/QuadExpr
function _make_expression(opt_model::Model, expr::Union{GenericAffExpr, GenericQuadExpr})
function _make_expression(opt_model::Model, expr::Union{GenericAffExpr, GenericQuadExpr, GenericNonlinearExpr})
return map_expression(v -> _make_expression(opt_model, v), expr)
end
# NLPExpr
function _make_expression(opt_model::Model, expr::NLPExpr)
return add_nonlinear_expression(opt_model, map_nlp_to_ast(v -> _make_expression(opt_model, v), expr))
end

# output
_make_expression (generic function with 8 methods)
_make_expression (generic function with 7 methods)

```
For simplicity of example, above we assume that only `DistributionDomain`s are
used, that there are no `PointVariableRef`s, and that all `MeasureRef`s correspond
to expectations. Naturally, a full extension should include checks to enforce that
such assumptions hold. Notice that [`map_expression`](@ref) and
[`map_nlp_to_ast`](@ref) are useful for converting expressions.
such assumptions hold. Notice that [`map_expression`](@ref) is useful for
converting expressions.
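
As a rough illustration of the variable-mapping idea behind `map_expression`,
here is a simplified sketch (not InfiniteOpt's actual implementation) that
rebuilds a `GenericNonlinearExpr` by walking its `head`/`args` tree and
substituting each variable:

```julia
using JuMP

# Simplified sketch: a full implementation (like `map_expression`) would also
# handle `GenericAffExpr` and `GenericQuadExpr` sub-expressions.
map_vars(f, v::VariableRef) = f(v)
map_vars(f, c::Number) = c
function map_vars(f, ex::GenericNonlinearExpr)
    return GenericNonlinearExpr{VariableRef}(ex.head, Any[map_vars(f, a) for a in ex.args])
end

src = Model(); @variable(src, x)
dst = Model(); @variable(dst, y)

ex = sin(x) * x + exp(x)        # a GenericNonlinearExpr in JuMP >= 1.15
mapped = map_vars(v -> y, ex)   # sin(y) * y + exp(y), written in terms of `y`
```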

Now let's extend [`build_optimizer_model!`](@ref) for `DeterministicModel`s.
Such extensions should build an optimizer model in place and in general should
@@ -908,13 +904,13 @@ function InfiniteOpt.build_optimizer_model!(
# clear the model for a build/rebuild
determ_model = InfiniteOpt.clear_optimizer_model_build!(model)

# add the registered functions if there are any
add_registered_to_jump(determ_model, model)
# add user-defined nonlinear operators if there are any
add_operators_to_jump(determ_model, model)

# add variables
for vref in all_variables(model)
if index(vref) isa InfiniteVariableIndex
start = NaN # easy hack
start = NaN # simple hack for sake of example
else
start = start_value(vref)
start = isnothing(start) ? NaN : start
@@ -932,29 +928,14 @@

# add the objective
obj_func = _make_expression(determ_model, objective_function(model))
if obj_func isa NonlinearExpression
set_nonlinear_objective(determ_model, objective_sense(model), obj_func)
else
set_objective(determ_model, objective_sense(model), obj_func)
end
set_objective(determ_model, objective_sense(model), obj_func)

# add the constraints
for cref in all_constraints(model, Union{GenericAffExpr, GenericQuadExpr, NLPExpr})
for cref in all_constraints(model, Union{GenericAffExpr, GenericQuadExpr, GenericNonlinearExpr})
constr = constraint_object(cref)
new_func = _make_expression(determ_model, constr.func)
if new_func isa NonlinearExpression
if constr.set isa MOI.LessThan
ex = :($new_func <= $(constr.set.upper))
elseif constr.set isa MOI.GreaterThan
ex = :($new_func >= $(constr.set.lower))
else # assume it is MOI.EqualTo
ex = :($new_func == $(constr.set.value))
end
new_cref = add_nonlinear_constraint(determ_model, ex)
else
new_constr = build_constraint(error, new_func, constr.set)
new_cref = add_constraint(determ_model, new_constr, name(cref))
end
new_constr = build_constraint(error, new_func, constr.set)
new_cref = add_constraint(determ_model, new_constr, name(cref))
deterministic_data(determ_model).infconstr_to_detconstr[cref] = new_cref
end

@@ -975,13 +956,11 @@ print(optimizer_model(model))
# output
Min z + y[1] + y[2]
Subject to
2 y[1] - z ≤ 42.0
sin(z) - -1.0 ≥ 0
2 y[1] - z ≤ 42
y[2]² = 1.5
y[1] ≥ 0.0
y[2] ≥ 0.0
subexpression[1] - 0.0 ≥ 0
With NL expressions
subexpression[1]: sin(z) - -1.0
y[1] ≥ 0
y[2] ≥ 0
```
Note that better variable naming could be used with the reformulated infinite
variables. Moreover, in general extensions of [`build_optimizer_model!`](@ref)
…
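
Since the rewritten extension above calls `add_operators_to_jump` instead of
registering functions, user-defined functions now enter the optimizer model as
JuMP nonlinear operators. A minimal JuMP-only sketch of what that looks like
under the new API (the operator and variable names are illustrative, and it
assumes Ipopt >= 1.4 is available):

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, x >= 0)

# In JuMP >= 1.15, user-defined functions become nonlinear operators added
# with @operator rather than with `register`.
my_sq(a) = (a - 1)^2
@operator(model, op_my_sq, 1, my_sq)

# Operators compose with native nonlinear functions in the standard macros.
@objective(model, Min, op_my_sq(x) + sin(x))
@constraint(model, exp(x) <= 10)

optimize!(model)
```

This is also why the constraint loop above no longer needs a separate
`add_nonlinear_constraint` branch: `build_constraint` and `add_constraint`
handle `GenericNonlinearExpr` functions just like affine and quadratic ones.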
docs/src/guide/derivative.md: 4 changes (2 additions, 2 deletions)
@@ -53,7 +53,7 @@ julia> d3 = @∂(q, t^2)
∂/∂t[∂/∂t[q(t)]]

julia> d_expr = @deriv(y * q - 2t, t)
∂/∂t[y(t, ξ)]*q(t) + ∂/∂t[q(t)]*y(t, ξ) - 2
∂/∂t[y(t, ξ)]*q(t) + y(t, ξ)*∂/∂t[q(t)] - 2
```
Thus, we can define derivatives in a variety of forms according to the problem at
hand. The last example even shows how the product rule is correctly applied.
@@ -216,7 +216,7 @@ defined up above with its alias name `dydt2`. This macro can also tackle complex
expressions using the appropriate calculus such as:
```jldoctest deriv_basic
julia> @deriv(∫(y, ξ) * q, t)
∂/∂t[∫{ξ ∈ [-1, 1]}[y(t, ξ)]]*q(t) + ∂/∂t[q(t)]*∫{ξ ∈ [-1, 1]}[y(t, ξ)]
∂/∂t[∫{ξ ∈ [-1, 1]}[y(t, ξ)]]*q(t) + ∫{ξ ∈ [-1, 1]}[y(t, ξ)]*∂/∂t[q(t)]
```
Thus, demonstrating the convenience of using `@deriv`.
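
For reference, a rough sketch of the kind of model these doctest snippets
assume; the variable and parameter names follow the guide's output
(`y(t, ξ)`, `q(t)`), but the interval chosen for `t` here is only a guess:

```julia
using InfiniteOpt

model = InfiniteModel()
@infinite_parameter(model, t in [0, 10])   # time horizon is illustrative
@infinite_parameter(model, ξ in [-1, 1])
@variable(model, y, Infinite(t, ξ))
@variable(model, q, Infinite(t))

# Derivatives and integrals then compose as shown above.
d_expr = @deriv(y * q - 2t, t)
int_deriv = @deriv(∫(y, ξ) * q, t)
```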

…