src/controller/nonlinmpc.jl (1 addition & 1 deletion)
@@ -550,7 +550,7 @@ This method is really intricate and I'm not proud of it. That's because of 3 ele
   and as efficient as possible. All the function outputs and derivatives are cached and
   updated in-place if required to use the efficient [`value_and_jacobian!`](@extref DifferentiationInterface DifferentiationInterface.value_and_jacobian!).
 - The `JuMP` NLP syntax forces splatting for the decision variable, which implies use
-  of `Vararg{T,N}` (see the [performance tip][@extref Julia Be-aware-of-when-Julia-avoids-specializing]
+  of `Vararg{T,N}` (see the [performance tip](@extref Julia Be-aware-of-when-Julia-avoids-specializing)
   ) and memoization to avoid redundant computations. This is already complex, but it's even
   worse knowing that most automatic differentiation tools do not support splatting.
 - The signature of gradient and hessian functions is not the same for univariate (`nZ̃ == 1`)
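The caching mentioned in the first point maps onto DifferentiationInterface's preparation API. Below is a minimal sketch of that in-place pattern, assuming a ForwardDiff backend; the function `h!` and the buffer names `y`/`J` are illustrative stand-ins, not identifiers from this package:

```julia
using DifferentiationInterface
import ForwardDiff

# Hypothetical in-place function: writes its outputs into a preallocated `y`.
function h!(y, x)
    y[1] = sum(abs2, x)
    y[2] = prod(x)
    return nothing
end

x       = [1.0, 2.0, 3.0]
y       = zeros(2)      # cached output vector, reused across calls
J       = zeros(2, 3)   # cached Jacobian storage, reused across calls
backend = AutoForwardDiff()
prep    = prepare_jacobian(h!, y, backend, x)  # one-time preparation

# Updates both `y` and `J` in-place, avoiding per-call allocations:
value_and_jacobian!(h!, y, J, prep, backend, x)
```

Preparing once and then reusing `y` and `J` on every call is what keeps the per-iteration cost of the solver callbacks allocation-free.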
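The second point, splatting plus memoization, follows the pattern documented in JuMP's manual for multi-output user-defined operators. Here is a sketch under those assumptions; `foo`, `memoize`, and the `op_f*` operators are hypothetical examples, not this package's code:

```julia
using JuMP, Ipopt

# Hypothetical 2-output function whose outputs share expensive work.
foo(x...) = (sum(abs2, x), prod(x))

# Recompute `foo` only when the splatted argument tuple changes, and use
# `Vararg{T,N}` so Julia specializes on the number of arguments.
function memoize(foo::Function, n_outputs::Int)
    last_x, last_f = nothing, nothing
    function foo_i(i, x::Vararg{T,N}) where {T<:Real,N}
        if x !== last_x
            last_x, last_f = x, foo(x...)
        end
        return last_f[i]::T
    end
    return [(x...) -> foo_i(i, x...) for i in 1:n_outputs]
end

memoized = memoize(foo, 2)
model = Model(Ipopt.Optimizer)
@variable(model, 0.1 <= x[1:2] <= 2)
@operator(model, op_f1, 2, memoized[1])
@operator(model, op_f2, 2, memoized[2])
@objective(model, Min, op_f1(x...))   # the JuMP NLP syntax forces splatting
@constraint(model, op_f2(x...) <= 1)
optimize!(model)
```

The `Vararg{T,N}` annotation lets Julia specialize on the argument count, and the shared `last_x`/`last_f` closure state means the expensive `foo` runs once per decision-variable value even though each scalar output is registered as a separate operator.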