Second-order operators can also be used with a combination of backends inside the [`SecondOrder`](@ref) struct.
There are many possible combinations, and a lot of them will fail.
Due to compilation overhead, we do not currently test them all to display the working ones in the documentation, but we might if users deem it relevant.
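For illustration, here is a hedged sketch of building such combinations. The backend choices are illustrative, and the hint about which combination works is an assumption rather than a tested guarantee:

```julia
using DifferentiationInterface
using ADTypes: AutoForwardDiff, AutoZygote  # illustrative backend choices
import ForwardDiff, Zygote                  # the corresponding backend packages must be loaded

# forward over reverse: a combination that is usually expected to work
forward_over_reverse = SecondOrder(AutoForwardDiff(), AutoZygote())

# reverse over reverse: an example of a combination that may well fail
reverse_over_reverse = SecondOrder(AutoZygote(), AutoZygote())
```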
docs/src/overview.md
```julia
# mistakenly keep working with grad_in: NOT OK
```
Note that we don't guarantee `grad_out` will have the same type as `grad_in`.
Its type can even depend on the choice of backend.
## Second order
Second-order differentiation is also supported.
You can either pick a single backend to do all the work, or combine an "outer" backend with an "inner" backend using the [`SecondOrder`](@ref) struct, like so: `SecondOrder(outer, inner)`.
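As a hedged sketch of both options (`hessian` is one of the second-order operators; its `(f, backend, x)` argument order and the backend choices below are assumptions made for illustration):

```julia
using DifferentiationInterface
using ADTypes: AutoForwardDiff, AutoZygote  # illustrative backend choices
import ForwardDiff, Zygote

f(x) = sum(abs2, x)
x = rand(5)

# a single backend doing all the work
H1 = hessian(f, AutoForwardDiff(), x)

# an outer backend differentiating through the inner backend's gradient
H2 = hessian(f, SecondOrder(AutoForwardDiff(), AutoZygote()), x)
```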
The available operators are similar to first-order ones:
| operator | input `x`| output `y`| result type | result shape |

By default, all the preparation functions return `nothing`.
We do not make any guarantees on their implementation for each backend, or on the performance gains that can be expected.
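As a hedged sketch of the intended workflow (the `prepare_gradient` name and the trailing `extras` argument reflect one plausible form of the preparation API, not a guarantee):

```julia
using DifferentiationInterface
using ADTypes: AutoForwardDiff  # illustrative backend choice
import ForwardDiff

f(x) = sum(abs2, x)
x = rand(10)
backend = AutoForwardDiff()

extras = prepare_gradient(f, backend, x)  # `nothing` by default, backend-specific otherwise
grad = gradient(f, backend, x, extras)    # reuse the same extras across repeated calls
```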
!!! warning
    We haven't yet figured out how to deal with extras for second-order operators, because closures make our life rather complicated.
    For now, consider that preparation doesn't work there in general, although some individual backends may be okay already.
## FAQ
### Multiple inputs/outputs
Restricting the API to one input and one output has many coding advantages, but it is not very flexible.
If you need more than that, use [ComponentArrays.jl](https://github.com/jonniedie/ComponentArrays.jl) to wrap several objects inside a single `ComponentVector`.
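A hedged sketch of that pattern (the backend choice is illustrative):

```julia
using DifferentiationInterface
using ADTypes: AutoForwardDiff  # illustrative backend choice
using ComponentArrays: ComponentVector
import ForwardDiff

# bundle two conceptual inputs into a single ComponentVector
x = ComponentVector(a = [1.0, 2.0], b = 3.0)

# the function takes one argument but can address each component by name
f(x) = sum(abs2, x.a) * x.b

grad = gradient(f, AutoForwardDiff(), x)  # differentiate with respect to the whole bundle
```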
### Sparsity
If you need to work with sparse Jacobians, you can pick one of the [sparse backends](@ref Sparse) from [ADTypes.jl](https://github.com/SciML/ADTypes.jl).
The sparsity pattern is computed automatically with [Symbolics.jl](https://github.com/JuliaSymbolics/Symbolics.jl) during the preparation step.
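A hedged sketch (the `AutoSparseForwardDiff` backend name and the trailing `extras` argument are assumptions tied to the ADTypes and DifferentiationInterface versions in use):

```julia
using DifferentiationInterface
using ADTypes: AutoSparseForwardDiff  # hypothetical sparse backend choice
import ForwardDiff, Symbolics         # Symbolics is loaded for sparsity detection

f(x) = [x[1]^2, x[2] + x[3], x[1] * x[3]]
x = rand(3)
backend = AutoSparseForwardDiff()

extras = prepare_jacobian(f, backend, x)  # sparsity pattern detected during preparation
J = jacobian(f, backend, x, extras)       # expected to come back as a sparse matrix
```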
If you need to work with sparse Hessians, you can use a sparse backend as the _outer_ backend of a `SecondOrder`.
This means the Hessian is obtained as the sparse Jacobian of the gradient.
Since preparation does not yet work for second order, the sparsity pattern is currently recomputed every time, so you may not save much time for now.
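A hedged sketch of this setup (backend names are illustrative and the `hessian` argument order is assumed):

```julia
using DifferentiationInterface
using ADTypes: AutoSparseForwardDiff, AutoZygote  # illustrative backend choices
import ForwardDiff, Symbolics, Zygote

f(x) = sum(abs2, diff(x))  # scalar function whose Hessian is banded
x = rand(10)

# sparse outer backend over a dense inner backend:
# the Hessian is computed as the sparse Jacobian of the inner gradient
backend = SecondOrder(AutoSparseForwardDiff(), AutoZygote())
H = hessian(f, backend, x)
```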
!!! danger
    Sparsity support is still experimental; use at your own risk.