@@ -206,6 +209,8 @@ This controller allocates memory at each time step for the optimization.
 function, see [`DifferentiationInterface` doc](@extref DifferentiationInterface List).
 - `jacobian=default_jacobian(transcription)` : an `AbstractADType` backend for the Jacobian
   of the nonlinear constraints, see `gradient` above for the options (default in Extended Help).
+- `hessian=nothing` : an `AbstractADType` backend for the Hessian of the objective function,
+  see `gradient` above for the options, use `nothing` for the LBFGS approximation of `optim`.
 - additional keyword arguments are passed to [`UnscentedKalmanFilter`](@ref) constructor
   (or [`SteadyKalmanFilter`](@ref), for [`LinModel`](@ref)).
 
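The new `hessian` keyword added in this hunk accepts any `AbstractADType` backend. As a hedged sketch only (package and constructor names assumed from `ADTypes.jl`, `SparseConnectivityTracer.jl`, and `SparseMatrixColorings.jl`, matching the sparse-backend example in the next hunk; the controller call is hypothetical and requires the package's dependencies):

```julia
# Sketch: build a sparse AD backend suitable for the `hessian` keyword.
# Assumes ADTypes.jl's `AutoSparse` wrapper and the detector/coloring packages
# shown in the docstring example; not runnable without these deps installed.
using ADTypes: AutoSparse, AutoForwardDiff
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm

sparse_backend = AutoSparse(
    AutoForwardDiff();                             # dense backend to sparsify
    sparsity_detector  = TracerSparsityDetector(), # detect the Hessian sparsity pattern
    coloring_algorithm = GreedyColoringAlgorithm() # compress columns by coloring
)

# Hypothetical usage, passing the backend to the constructor:
# mpc = NonLinMPC(estim; Hp=20, hessian=sparse_backend)
```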
@@ -265,8 +270,11 @@ NonLinMPC controller with a sample time Ts = 10.0 s, Ipopt optimizer, UnscentedK
     coloring_algorithm = GreedyColoringAlgorithm()
 )
 ```
-Optimizers generally benefit from exact derivatives like AD. However, the [`NonLinModel`](@ref)
-state-space functions must be compatible with this feature. See [`JuMP` documentation](@extref JuMP Common-mistakes-when-writing-a-user-defined-operator)
+Also, the `hessian` argument defaults to `nothing`, meaning the built-in second-order
+approximation of `optim`. Otherwise, a sparse backend like above is recommended to test
+different `hessian` methods. Optimizers generally benefit from exact derivatives like AD.
+However, the [`NonLinModel`](@ref) state-space functions must be compatible with this
+feature. See [`JuMP` documentation](@extref JuMP Common-mistakes-when-writing-a-user-defined-operator)
 for common mistakes when writing these functions.
 
 Note that if `Cwt≠Inf`, the attribute `nlp_scaling_max_gradient` of `Ipopt` is set to