docs/src/Examples/example.md (76 additions, 63 deletions)
@@ -4,20 +4,18 @@
Let's write a step-by-step example of `POI` usage at the MOI level.

First, we declare a [`ParametricOptInterface.Optimizer`](@ref) on top of a `MOI`
optimizer. In the example, we consider `HiGHS` as the underlying solver:
```@example moi1
import HiGHS
import MathOptInterface as MOI
import ParametricOptInterface as POI
optimizer = POI.Optimizer(HiGHS.Optimizer())
```
We declare the variable `x` as in a typical `MOI` model, and we add a
non-negativity constraint:

```@example moi1
x = MOI.add_variables(optimizer, 2)
@@ -26,15 +24,19 @@ for x_i in x
end
```
Now, let's consider 3 `MOI.Parameter`. Two of them, `y`, `z`, will be placed in
the constraints and one, `w`, in the objective function. We'll start all three
of them with a value equal to `0`:
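The diff truncates the corresponding snippet here. At the MOI level, parameters
are created as constrained variables, along the lines of the following sketch
(the names `y`, `z`, `w` and the constraint indices `cy`, `cz`, `cw` are chosen
to match the dual queries later in this example):

```julia
# Each parameter is a variable constrained to MOI.Parameter(0.0);
# the returned constraint indices (cy, cz, cw) are used later for duals.
y, cy = MOI.add_constrained_variable(optimizer, MOI.Parameter(0.0))
z, cz = MOI.add_constrained_variable(optimizer, MOI.Parameter(0.0))
w, cw = MOI.add_constrained_variable(optimizer, MOI.Parameter(0.0))
```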
Let's add the constraints. Notice that we treat parameters and variables in the
same way when building the functions that will be placed in some set to create a
constraint (`Function-in-Set`):
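As an illustration, a constraint such as `2x[1] + x[2] + 3y <= 4` (the
coefficients mirror the constraint `c1` of the JuMP section of this page) could
be built as follows; the parameter `y` enters the `ScalarAffineFunction` exactly
like the decision variables:

```julia
# Parameters appear as ordinary terms in the ScalarAffineFunction.
cons1 = MOI.add_constraint(
    optimizer,
    MOI.ScalarAffineFunction(
        MOI.ScalarAffineTerm.([2.0, 1.0, 3.0], [x[1], x[2], y]),
        0.0,
    ),
    MOI.LessThan(4.0),
)
```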
We can also retrieve the dual values associated to each parameter,
**as they are all additive**:
```@example moi1
MOI.get(optimizer, MOI.ConstraintDual(), cy)
MOI.get(optimizer, MOI.ConstraintDual(), cz)
MOI.get(optimizer, MOI.ConstraintDual(), cw)
```
Notice the direct relationship in this case between the parameters' duals and
the associated constraints' duals.

The `y` parameter, for example, only appears in `cons1`. If we compare
their duals, we can check that the dual of `y` is equal to its coefficient in
`cons1` multiplied by the constraint's dual itself, as expected:
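A sketch of that check, assuming `y` appears in `cons1` with coefficient `3.0`
(a value chosen purely for illustration):

```julia
# The parameter's dual equals its coefficient times the constraint's dual.
dual_y = MOI.get(optimizer, MOI.ConstraintDual(), cy)
dual_cons1 = MOI.get(optimizer, MOI.ConstraintDual(), cons1)
isapprox(dual_y, 3.0 * dual_cons1)
```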
The same is valid for the remaining parameters. In case a parameter appears in
more than one constraint, or in both some constraints and the objective
function, its dual will be equal to the linear combination of the functions'
duals multiplied by the respective coefficients.

So far, we only added some parameters that had no influence at first in solving
the model. Let's change the values associated to each parameter to assess its
implications.
First, we set the value of parameters `y` and `z` to `1.0`. Notice that we are
changing the feasible set of the decision variables:
```@example moi1
MOI.set(optimizer, POI.ParameterValue(), y, 1.0)
MOI.set(optimizer, POI.ParameterValue(), z, 1.0)
```
However, if we check the optimized model now, there will be no changes in the
objective function value or in the optimized decision variables:
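The snippet that follows is truncated in the diff; it presumably re-queries the
solution, for example:

```julia
# Objective value and primal solution are unchanged until re-optimization.
MOI.get(optimizer, MOI.ObjectiveValue())
MOI.get(optimizer, MOI.VariablePrimal(), x)
```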
Although we changed the parameter values, we didn't optimize the model yet.
Thus, **to apply the parameters' changes, the model must be optimized again**:
```@example moi1
MOI.optimize!(optimizer)
```
The `MOI.optimize!()` function handles the necessary updates, properly
forwarding the new outer model (`POI` model) additions to the inner model
(`MOI` model) which will be handled by the solver. Now we can assess the
updated optimized information:
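For instance, one can re-query the objective and the primal solution, which now
reflect the updated parameter values:

```julia
# These now reflect y = z = 1.0.
MOI.get(optimizer, MOI.ObjectiveValue())
MOI.get(optimizer, MOI.VariablePrimal(), x)
```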
If we update the parameter `w`, associated to the objective function, we are
simply adding a constant to it. Notice how the new objective function is
precisely equal to the previous one plus the new value of `w`. In addition, as
we didn't update the feasible set, the optimized decision variables remain the
same.
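A sketch of such an update (the value `2.0` is arbitrary, for illustration):

```julia
MOI.set(optimizer, POI.ParameterValue(), w, 2.0)
MOI.optimize!(optimizer)  # re-optimize to apply the change
MOI.get(optimizer, MOI.ObjectiveValue())  # previous objective plus 2.0
```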
Let's write a step-by-step example of `POI` usage at the JuMP level.
First, we declare a `Model` on top of an `Optimizer` of an underlying solver. In
the example, we consider `HiGHS` as the underlying solver:
```@example jump1
using HiGHS
@@ -146,15 +175,18 @@ We declare the variable `x` as in a typical `JuMP` model:
@variable(model, x[i = 1:2] >= 0)
```
Now, let's consider 3 `MOI.Parameter`. Two of them, `y`, `z`, will be placed in
the constraints and one, `w`, in the objective function. We'll start all three
of them with a value equal to `0`:
```@example jump1
@variable(model, y in MOI.Parameter(0.0))
@variable(model, z in MOI.Parameter(0.0))
@variable(model, w in MOI.Parameter(0.0))
```
Let's add the constraints. Notice that we treat parameters the same way we treat
variables when writing the model:
```@example jump1
@constraint(model, c1, 2x[1] + x[2] + 3y <= 4)
@@ -175,7 +207,8 @@ termination_status(model)
primal_status(model)
```
-
Given the optimized solution, we check that its value is, as expected, equal to `28/3`, and the solution vector `x` is `[4/3, 4/3]`:
210
+
Given the optimized solution, we check that its value is, as expected, equal to
211
+
`28/3`, and the solution vector `x` is `[4/3, 4/3]`:
```@example jump1
isapprox(objective_value(model), 28/3)
```
@@ -320,56 +353,36 @@ Users that just want everything to work can use the default value `POI.ONLY_CONS