[breaking] drop invalid nonlinear support
odow committed Mar 1, 2024
1 parent 4ec565a commit ec7ed41
Showing 3 changed files with 12 additions and 146 deletions.
64 changes: 2 additions & 62 deletions docs/src/Examples/example.md
@@ -353,7 +353,7 @@ MOI.set(
)
```

To multiply a parameter in a quadratic term, the user will
need to use the `POI.QuadraticObjectiveCoef` model attribute.

```@example moi2
@@ -415,7 +415,7 @@ We use the same MOI function to add the parameter multiplied to the quadratic term
MOI.set(backend(model), POI.QuadraticObjectiveCoef(), (index(x),index(y)), 2index(p)+3)
```

If the user prints the `model`, the term `(2p+3)*xy` won't be shown.
It's possible to retrieve the parametric function multiplying the term `xy` with `MOI.get`.

```@example jump4
@@ -440,63 +440,3 @@ isapprox(objective_value(model), 128/9, atol=1e-4)
isapprox(value(x), 4/3, atol=1e-4)
isapprox(value(y), 4/3, atol=1e-4)
```
## JuMP Example - Non Linear Programming (NLP)

POI currently works with NLPs when users wish to add parameters to the non-NL constraints or objective. This means that POI works with models like this one:

```julia
@variable(model, x)
@variable(model, y)
@variable(model, z in MOI.Parameter(10))
@constraint(model, x + y >= z)
@NLobjective(model, Min, x^2 + y^2)
```

but does not work with models that have parameters in the NL expressions, like this one:

```julia
@variable(model, x)
@variable(model, y)
@variable(model, z in MOI.Parameter(10))
@constraint(model, x + y >= z)
@NLobjective(model, Min, x^2 + y^2 + z) # There is a parameter here
```

If users wish to add parameters in NL expressions, we strongly recommend that they read [this section of the JuMP documentation](https://jump.dev/JuMP.jl/stable/manual/nlp/#Create-a-nonlinear-parameter).
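
For reference, here is a rough, minimal sketch of the workflow described in that JuMP manual section, using JuMP's own `@NLparameter` machinery rather than POI (the model and values are illustrative):

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, x)
# A JuMP nonlinear parameter, usable inside @NL expressions
@NLparameter(model, p == 10)
@NLobjective(model, Min, (x - p)^2)
optimize!(model)
# Update the parameter and re-solve without rebuilding the model
set_value(p, 2)
optimize!(model)
```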

Although POI works with NLPs, there are some important points for users to keep in mind. They all stem from the fact that POI relies on the MOI interface for problem modifications, which is uncommon among NLP solvers: most solvers only allow users to modify variable bounds through their official APIs. This means that if users wish to modify a constraint that is not a variable bound, we are not allowed to call `MOI.modify` because the function is not supported in the MOI solver interface. The work-around is to define a [`POI.Optimizer`](@ref) on top of a caching optimizer:

```julia
ipopt = Ipopt.Optimizer()
MOI.set(ipopt, MOI.RawOptimizerAttribute("print_level"), 0)
cached =
() -> MOI.Bridges.full_bridge_optimizer(
MOIU.CachingOptimizer(
MOIU.UniversalFallback(MOIU.Model{Float64}()),
ipopt,
),
Float64,
)
POI_cached_optimizer() = POI.Optimizer(cached())
model = Model(() -> POI_cached_optimizer())
@variable(model, x)
@variable(model, y)
@variable(model, z in MOI.Parameter(10))
@constraint(model, x + y >= z)
@NLobjective(model, Min, x^2 + y^2)
```

This works, but keep in mind that the model has an additional layer between the solver and the [`POI.Optimizer`](@ref). This will make most operations slower than the version without the caching optimizer. Also keep in mind that, since the official APIs of most solvers don't allow modifications of linear constraints, there should be no big difference between making a modification through POI and re-building the model from scratch.
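
With the caching setup above, parameter values can still be updated between solves through POI. A minimal continuation of the previous example (it assumes the `model` and `z` defined there):

```julia
optimize!(model)
# Update the parameter and re-solve; POI forwards the change to the cached model
MOI.set(model, POI.ParameterValue(), z, 2.0)
optimize!(model)
```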

If users wish to modify variable bounds, the POI interface will help them save time between solves. In this case you should use the [`ParametricOptInterface.ConstraintsInterpretation`](@ref) attribute, as we do in this example:

```julia
model = Model(() -> POI.Optimizer(Ipopt.Optimizer()))
@variable(model, x)
@variable(model, z in MOI.Parameter(10))
MOI.set(model, POI.ConstraintsInterpretation(), POI.ONLY_BOUNDS)
@constraint(model, x >= z)
@NLobjective(model, Min, x^2)
```

This use case should help users reduce the time spent making model modifications and re-solving the model. To increase performance further, users who are familiar with [JuMP direct mode](https://jump.dev/JuMP.jl/stable/manual/models/#Direct-mode) can also use it, as in the sketch below.
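
The following is a rough sketch of the same bounds example in direct mode; it assumes that the `POI.Optimizer` wrapping Ipopt supports the incremental interface that direct mode requires:

```julia
using JuMP, Ipopt
import ParametricOptInterface as POI

# Direct mode: no CachingOptimizer layer between JuMP and the POI-wrapped solver
model = direct_model(POI.Optimizer(Ipopt.Optimizer()))
@variable(model, x)
@variable(model, z in MOI.Parameter(10.0))
MOI.set(model, POI.ConstraintsInterpretation(), POI.ONLY_BOUNDS)
@constraint(model, x >= z)
@NLobjective(model, Min, x^2)
optimize!(model)
```
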
51 changes: 8 additions & 43 deletions src/MOI_wrapper.jl
@@ -948,18 +948,6 @@ function MOI.set(
return
end

#
# NLP
#

function MOI.supports(model::Optimizer, ::MOI.NLPBlock)
return MOI.supports(model.optimizer, MOI.NLPBlock())
end

function MOI.set(model::Optimizer, ::MOI.NLPBlock, nlp_data::MOI.NLPBlockData)
return MOI.set(model.optimizer, MOI.NLPBlock(), nlp_data)
end

#
# Other
#
@@ -1394,37 +1382,14 @@ function _poi_default_copy_to(dest::T, src::MOI.ModelLike) where {T}
MOI.empty!(dest)
vis_src = MOI.get(src, MOI.ListOfVariableIndices())
index_map = MOI.IndexMap()
# The `NLPBlock` assumes that the order of variables does not change (#849)
# Therefore, all VariableIndex and VectorOfVariable constraints are added
# separately, and no variables constrained-on-creation are added.

# This is not valid for NLPs with Parameters, they should enter
has_nlp = MOI.NLPBlock() in MOI.get(src, MOI.ListOfModelAttributesSet())
constraints_not_added = if has_nlp
vcat(
Any[
MOI.get(src, MOI.ListOfConstraintIndices{F,S}()) for
(F, S) in MOI.get(src, MOI.ListOfConstraintTypesPresent()) if
MOI.Utilities._is_variable_function(F) &&
S != MOI.Parameter{Float64}
],
Any[MOI.Utilities._try_constrain_variables_on_creation(
dest,
src,
index_map,
MOI.Parameter{Float64},
)],
)
else
Any[
MOI.Utilities._try_constrain_variables_on_creation(
dest,
src,
index_map,
S,
) for S in MOI.Utilities.sorted_variable_sets_by_cost(dest, src)
]
end
constraints_not_added = Any[
MOI.Utilities._try_constrain_variables_on_creation(
dest,
src,
index_map,
S,
) for S in MOI.Utilities.sorted_variable_sets_by_cost(dest, src)
]
MOI.Utilities._copy_free_variables(dest, index_map, vis_src)
# Copy variable attributes
MOI.Utilities.pass_attributes(dest, src, index_map, vis_src)
43 changes: 2 additions & 41 deletions test/jump_tests.jl
@@ -754,51 +754,12 @@ function test_jump_dual_delete_constraint()
end

function test_jump_nlp()
ipopt = Ipopt.Optimizer()
MOI.set(ipopt, MOI.RawOptimizerAttribute("print_level"), 0)
cached =
() -> MOI.Bridges.full_bridge_optimizer(
MOI.Utilities.CachingOptimizer(
MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}()),
ipopt,
),
Float64,
)
POI_cached_optimizer() = ParametricOptInterface.Optimizer(cached())
model = Model(() -> POI_cached_optimizer())
model = Model(() -> ParametricOptInterface.Optimizer(Ipopt.Optimizer()))
@variable(model, x)
@variable(model, y)
@variable(model, z in MOI.Parameter(10.0))
@constraint(model, x + y >= z)
@NLobjective(model, Min, x^2 + y^2)
optimize!(model)
objective_value(model)
@test value(x) ≈ 5
MOI.get(model, ParametricOptInterface.ParameterDual(), z)
MOI.set(model, ParametricOptInterface.ParameterValue(), z, 2.0)
optimize!(model)
@test objective_value(model) ≈ 2 atol = 1e-3
@test value(x) ≈ 1
ipopt = Ipopt.Optimizer()
MOI.set(ipopt, MOI.RawOptimizerAttribute("print_level"), 0)
model = Model(() -> ParametricOptInterface.Optimizer(ipopt))
@variable(model, x)
@variable(model, z in MOI.Parameter(10.0))
MOI.set(
model,
ParametricOptInterface.ConstraintsInterpretation(),
ParametricOptInterface.ONLY_BOUNDS,
)
@constraint(model, x >= z)
@NLobjective(model, Min, x^2)
optimize!(model)
objective_value(model)
@test value(x) ≈ 10
MOI.get(model, ParametricOptInterface.ParameterDual(), z)
MOI.set(model, ParametricOptInterface.ParameterValue(), z, 2.0)
optimize!(model)
@test objective_value(model) ≈ 4 atol = 1e-3
@test value(x) ≈ 2
@test_throws ErrorException optimize!(model)
return
end

