[breaking] drop invalid nonlinear support #147

Merged 2 commits on Jun 25, 2024
64 changes: 2 additions & 62 deletions docs/src/Examples/example.md
@@ -353,7 +353,7 @@ MOI.set(
)
```

To multiply a parameter in a quadratic term, the user will
need to use the `POI.QuadraticObjectiveCoef` model attribute.

```@example moi2
@@ -415,7 +415,7 @@ We use the same MOI function to add the parameter multiplying the quadratic term
MOI.set(backend(model), POI.QuadraticObjectiveCoef(), (index(x),index(y)), 2index(p)+3)
```

If the user prints the `model`, the term `(2p+3)*xy` won't show.
It's possible to retrieve the parametric function multiplying the term `xy` with `MOI.get`.

```@example jump4
@@ -440,63 +440,3 @@ isapprox(objective_value(model), 128/9, atol=1e-4)
isapprox(value(x), 4/3, atol=1e-4)
isapprox(value(y), 4/3, atol=1e-4)
```
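Retrieving the parametric function mentioned above might look like the following sketch. The exact getter signature is a hypothetical illustration, mirrored from the `MOI.set` call shown earlier in this example:

```julia
# Retrieve the parametric coefficient attached to the `x * y` term.
# Assumes `model`, `x`, `y` from the example above; the argument shape
# mirrors the `MOI.set` call and is an assumption, not a confirmed API.
f = MOI.get(backend(model), POI.QuadraticObjectiveCoef(), (index(x), index(y)))
# `f` should be the parametric function `2p + 3` set previously.
```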
## JuMP Example - Non Linear Programming (NLP)

POI currently works with NLPs when users add parameters only to the non-NL constraints or objective. This means that POI works with models like this one:

```julia
@variable(model, x)
@variable(model, y)
@variable(model, z in MOI.Parameter(10))
@constraint(model, x + y >= z)
@NLobjective(model, Min, x^2 + y^2)
```

but does not work with models that have parameters on the NL expressions like this one:

```julia
@variable(model, x)
@variable(model, y)
@variable(model, z in MOI.Parameter(10))
@constraint(model, x + y >= z)
@NLobjective(model, Min, x^2 + y^2 + z) # There is a parameter here
```

If users wish to add parameters in NL expressions, we strongly recommend reading [this section of the JuMP documentation](https://jump.dev/JuMP.jl/stable/manual/nlp/#Create-a-nonlinear-parameter).

Although POI works with NLPs, there is some important information for users to keep in mind. It all stems from the fact that POI relies on the MOI interface for problem modifications, and these are not common in NLP solvers; most solvers only allow users to modify variable bounds through their official APIs. This means that if users wish to modify a constraint that is not a variable bound, we cannot call `MOI.modify`, because that function is not supported by the MOI solver interface. The workaround is to define a [`POI.Optimizer`](@ref) on top of a caching optimizer:

```julia
# Imports added so the snippet is self-contained.
using JuMP, Ipopt
import MathOptInterface as MOI
import ParametricOptInterface as POI
const MOIU = MOI.Utilities

ipopt = Ipopt.Optimizer()
MOI.set(ipopt, MOI.RawOptimizerAttribute("print_level"), 0)
cached =
() -> MOI.Bridges.full_bridge_optimizer(
MOIU.CachingOptimizer(
MOIU.UniversalFallback(MOIU.Model{Float64}()),
ipopt,
),
Float64,
)
POI_cached_optimizer() = POI.Optimizer(cached())
model = Model(() -> POI_cached_optimizer())
@variable(model, x)
@variable(model, y)
@variable(model, z in MOI.Parameter(10))
@constraint(model, x + y >= z)
@NLobjective(model, Min, x^2 + y^2)
```
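With this setup, updating a parameter between solves might look like the following sketch (assuming the `model`, `z`, and `x` from the block above; the attribute calls match those used in this package's test suite):

```julia
# Solve once with z = 10, then update the parameter and re-solve.
optimize!(model)
MOI.set(model, POI.ParameterValue(), z, 2.0)
optimize!(model)
value(x)  # solution under the updated parameter value
```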

This works, but keep in mind that the model has an additional layer between the solver and the [`POI.Optimizer`](@ref). This will make most operations slower than in the version without the caching optimizer. Also keep in mind that, since the official APIs of most solvers don't allow modifications to linear constraints, there should be no big difference between making a modification through POI and rebuilding the model from scratch.

If users wish to modify only variable bounds, the POI interface will help save time between solves. In this case, use the [`ParametricOptInterface.ConstraintsInterpretation`](@ref) attribute as in this example:

```julia
model = Model(() -> POI.Optimizer(Ipopt.Optimizer()))
@variable(model, x)
@variable(model, z in MOI.Parameter(10))
MOI.set(model, POI.ConstraintsInterpretation(), POI.ONLY_BOUNDS)
@constraint(model, x >= z)
@NLobjective(model, Min, x^2)
```

This use case should help users diminish the time spent modifying and re-solving the model. To increase performance further, users familiar with [JuMP direct mode](https://jump.dev/JuMP.jl/stable/manual/models/#Direct-mode) can also use it.
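A direct-mode variant of the bounds example might look like the sketch below. The source only mentions direct mode in passing, so combining it with `POI.Optimizer` this way is an assumption, not a documented recipe:

```julia
using JuMP, Ipopt
import MathOptInterface as MOI
import ParametricOptInterface as POI

# direct_model skips JuMP's own caching layer, so modifications go
# straight to the POI optimizer.
model = direct_model(POI.Optimizer(Ipopt.Optimizer()))
@variable(model, x)
@variable(model, z in MOI.Parameter(10.0))
MOI.set(model, POI.ConstraintsInterpretation(), POI.ONLY_BOUNDS)
@constraint(model, x >= z)
@NLobjective(model, Min, x^2)
optimize!(model)
```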
79 changes: 0 additions & 79 deletions src/MOI_wrapper.jl
@@ -948,18 +948,6 @@ function MOI.set(
return
end

#
# NLP
#

function MOI.supports(model::Optimizer, ::MOI.NLPBlock)
return MOI.supports(model.optimizer, MOI.NLPBlock())
end

function MOI.set(model::Optimizer, ::MOI.NLPBlock, nlp_data::MOI.NLPBlockData)
return MOI.set(model.optimizer, MOI.NLPBlock(), nlp_data)
end

#
# Other
#
@@ -1369,73 +1357,6 @@ function MOI.get(
end
end

#
# Copy
#

function MOI.Utilities.default_copy_to(
dest::MOI.Bridges.LazyBridgeOptimizer{Optimizer{T,OT}},
src::MOI.ModelLike,
) where {T,OT}
return _poi_default_copy_to(dest, src)
end

function MOI.Utilities.default_copy_to(
dest::Optimizer{T,OT},
src::MOI.ModelLike,
) where {T,OT}
return _poi_default_copy_to(dest, src)
end

function _poi_default_copy_to(dest::T, src::MOI.ModelLike) where {T}
if !MOI.supports_incremental_interface(dest)
error("Model $(typeof(dest)) does not support copy_to.")
end
MOI.empty!(dest)
vis_src = MOI.get(src, MOI.ListOfVariableIndices())
index_map = MOI.IndexMap()
# The `NLPBlock` assumes that the order of variables does not change (#849)
# Therefore, all VariableIndex and VectorOfVariable constraints are added
# separately, and no variables constrained-on-creation are added.

# This is not valid for NLPs with Parameters, they should enter
has_nlp = MOI.NLPBlock() in MOI.get(src, MOI.ListOfModelAttributesSet())
constraints_not_added = if has_nlp
vcat(
Any[
MOI.get(src, MOI.ListOfConstraintIndices{F,S}()) for
(F, S) in MOI.get(src, MOI.ListOfConstraintTypesPresent()) if
MOI.Utilities._is_variable_function(F) &&
S != MOI.Parameter{Float64}
],
Any[MOI.Utilities._try_constrain_variables_on_creation(
dest,
src,
index_map,
MOI.Parameter{Float64},
)],
)
else
Any[
MOI.Utilities._try_constrain_variables_on_creation(
dest,
src,
index_map,
S,
) for S in MOI.Utilities.sorted_variable_sets_by_cost(dest, src)
]
end
MOI.Utilities._copy_free_variables(dest, index_map, vis_src)
# Copy variable attributes
MOI.Utilities.pass_attributes(dest, src, index_map, vis_src)
# Copy model attributes
MOI.Utilities.pass_attributes(dest, src, index_map)
# Copy constraints
MOI.Utilities._pass_constraints(dest, src, index_map, constraints_not_added)
MOI.Utilities.final_touch(dest, index_map)
return index_map
end

#
# Optimize
#
43 changes: 2 additions & 41 deletions test/jump_tests.jl
@@ -754,51 +754,12 @@ function test_jump_dual_delete_constraint()
end

function test_jump_nlp()
ipopt = Ipopt.Optimizer()
MOI.set(ipopt, MOI.RawOptimizerAttribute("print_level"), 0)
cached =
() -> MOI.Bridges.full_bridge_optimizer(
MOI.Utilities.CachingOptimizer(
MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}()),
ipopt,
),
Float64,
)
POI_cached_optimizer() = ParametricOptInterface.Optimizer(cached())
model = Model(() -> POI_cached_optimizer())
model = Model(() -> ParametricOptInterface.Optimizer(Ipopt.Optimizer()))
@variable(model, x)
@variable(model, y)
@variable(model, z in MOI.Parameter(10.0))
@constraint(model, x + y >= z)
@NLobjective(model, Min, x^2 + y^2)
optimize!(model)
objective_value(model)
@test value(x) ≈ 5
MOI.get(model, ParametricOptInterface.ParameterDual(), z)
MOI.set(model, ParametricOptInterface.ParameterValue(), z, 2.0)
optimize!(model)
@test objective_value(model) ≈ 2 atol = 1e-3
@test value(x) ≈ 1
ipopt = Ipopt.Optimizer()
MOI.set(ipopt, MOI.RawOptimizerAttribute("print_level"), 0)
model = Model(() -> ParametricOptInterface.Optimizer(ipopt))
@variable(model, x)
@variable(model, z in MOI.Parameter(10.0))
MOI.set(
model,
ParametricOptInterface.ConstraintsInterpretation(),
ParametricOptInterface.ONLY_BOUNDS,
)
@constraint(model, x >= z)
@NLobjective(model, Min, x^2)
optimize!(model)
objective_value(model)
@test value(x) ≈ 10
MOI.get(model, ParametricOptInterface.ParameterDual(), z)
MOI.set(model, ParametricOptInterface.ParameterValue(), z, 2.0)
optimize!(model)
@test objective_value(model) ≈ 4 atol = 1e-3
@test value(x) ≈ 2
@test_throws ErrorException optimize!(model)
return
end
