ForwardDiff implements methods to take **derivatives**, **gradients**, **Jacobians**, **Hessians**, and higher-order derivatives of native Julia functions (or any callable object, really) using **forward mode automatic differentiation (AD)**.

While performance can vary depending on the functions you evaluate, the algorithms implemented by ForwardDiff generally outperform non-AD algorithms (such as finite-differencing) in both speed and accuracy.
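To make the accuracy side of that claim concrete, the sketch below compares ForwardDiff's derivative of `sin` against a hand-rolled central finite difference (the helper `fd` and the step size `h = 1e-6` are illustrative choices for this comparison, not part of ForwardDiff's API):

```julia
using ForwardDiff

# Illustrative helper: central finite-difference estimate of f'(x).
# The step size h trades truncation error against floating-point cancellation.
fd(f, x; h = 1e-6) = (f(x + h) - f(x - h)) / 2h

ad_err = abs(ForwardDiff.derivative(sin, 1.0) - cos(1.0))  # forward-mode AD
fd_err = abs(fd(sin, 1.0) - cos(1.0))                      # finite differencing

ad_err ≤ fd_err  # the AD result matches cos(1.0); fd carries a small error
```

Here the AD result agrees with `cos(1.0)` to machine precision, while the finite-difference estimate is limited by the choice of `h`.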

Here's a simple example showing the package in action:

```julia
julia> using ForwardDiff

julia> f(x::Vector) = sin(x[1]) + prod(x[2:end]); # returns a scalar

julia> x = vcat(pi/4, 2:4)
4-element Vector{Float64}:
 0.7853981633974483
 2.0
 3.0
 4.0

julia> ForwardDiff.gradient(f, x)
4-element Vector{Float64}:
  0.7071067811865476
 12.0
  8.0
  6.0

julia> ForwardDiff.hessian(f, x)
4×4 Matrix{Float64}:
 -0.707107  0.0  0.0  0.0
  0.0       0.0  4.0  3.0
  0.0       4.0  0.0  2.0
  0.0       3.0  2.0  0.0
```

Functions like `f` which map a vector to a scalar are the best case for reverse-mode automatic differentiation,
but ForwardDiff may still be a good choice if `x` is not too large, as it is much simpler.
The best case for forward-mode differentiation is a function which maps a scalar to a vector, like this `g`:

```julia
julia> g(y::Real) = [sin(y), cos(y), tan(y)]; # returns a vector

julia> ForwardDiff.derivative(g, pi/4)
3-element Vector{Float64}:
  0.7071067811865476
 -0.7071067811865475
  1.9999999999999998

julia> ForwardDiff.jacobian(x) do x  # anonymous function, returns a length-2 vector
           [sin(x[1]), prod(x[2:end])]
       end
2×4 Matrix{Float64}:
 0.707107   0.0  0.0  0.0
 0.0       12.0  8.0  6.0
```

See [ForwardDiff's documentation](https://juliadiff.org/ForwardDiff.jl/stable) for full details on how to use this package.
ForwardDiff relies on [DiffRules](https://github.com/JuliaDiff/DiffRules.jl) for the derivatives of many simple functions such as `sin`.
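As a quick illustration of how that works, DiffRules exposes its registered rules through the `DiffRules.diffrule` query function, which returns an ordinary Julia expression for the derivative of a primitive:

```julia
using DiffRules

# Look up the registered derivative of Base.sin with respect to x.
# Packages like ForwardDiff splice such expressions into the code
# they generate for dual numbers.
DiffRules.diffrule(:Base, :sin, :x)  # the expression :(cos(x))
```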
See the [JuliaDiff web page](https://juliadiff.org) for other automatic differentiation packages.

## Publications
