**DifferentiationInterface/README.md** (2 additions & 1 deletion)
```diff
@@ -22,9 +22,10 @@ This package provides a backend-agnostic syntax to differentiate functions of th
 
 ## Features
 
-- First- and second-order operators (gradients, Jacobians, Hessians and [more](https://gdalle.github.io/DifferentiationInterface.jl/DifferentiationInterface/stable/overview/))
+- First- and second-order operators (gradients, Jacobians, Hessians and more)
 - In-place and out-of-place differentiation
 - Preparation mechanism (e.g. to create a config or tape)
+- Built-in sparsity handling
 - Thorough validation on standard inputs and outputs (numbers, vectors, matrices)
 - Testing and benchmarking utilities accessible to users with [DifferentiationInterfaceTest](https://github.com/gdalle/DifferentiationInterface.jl/tree/main/DifferentiationInterfaceTest)
```
**DifferentiationInterface/docs/src/overview.md** (3 additions & 4 deletions)
```diff
@@ -111,10 +111,9 @@ We offer two ways to perform second-order differentiation (for [`second_derivati
 
 ### Sparsity
 
-[ADTypes.jl](https://github.com/SciML/ADTypes.jl) provides `AutoSparse` to accelerate the computation of sparse Jacobians and Hessians:
-
-- for sparse Jacobians, wrap `AutoSparse` around a first-order backend.
-- for sparse Hessians, wrap `AutoSparse` around a [`SecondOrder`](@ref) backend.
+[ADTypes.jl](https://github.com/SciML/ADTypes.jl) provides [`AutoSparse`](@ref) to accelerate the computation of sparse Jacobians and Hessians.
+Just wrap it around any backend, with an appropriate choice of sparsity detector and coloring algorithm, and call `jacobian` or `hessian`: the result will be sparse.
+See the [tutorial section on sparsity](@ref sparsity-tutorial) for details.
```
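The wrapping step described in the new overview text can be sketched as follows. Everything beyond `AutoSparse` itself is an assumption for illustration: the dense backend choice (`AutoForwardDiff`), the test function, and the detector/coloring keyword names reflect the ADTypes and DifferentiationInterface APIs as best understood here, so check the current docs before copying.

```julia
# Sketch only: wrap AutoSparse around a dense backend, then call `jacobian`.
using ADTypes: AutoSparse, AutoForwardDiff
using DifferentiationInterface
import ForwardDiff, Symbolics  # example backend + sparsity-detection dependency

sparse_backend = AutoSparse(
    AutoForwardDiff();  # any first-order backend could go here
    sparsity_detector=DifferentiationInterface.SymbolicsSparsityDetector(),
    coloring_algorithm=DifferentiationInterface.GreedyColoringAlgorithm(),
)

f(x) = abs2.(diff(x))               # bidiagonal Jacobian, hence sparse
x = rand(10)
J = jacobian(f, sparse_backend, x)  # returned in a sparse storage format
```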
**DifferentiationInterface/docs/src/tutorial.md** (46 additions & 0 deletions)
```diff
@@ -138,3 +138,49 @@ And you can run the same benchmarks to see what you gained (although such a smal
 In short, DifferentiationInterface.jl allows for easy testing and comparison of AD backends.
 If you want to go further, check out the [DifferentiationInterfaceTest.jl tutorial](https://gdalle.github.io/DifferentiationInterface.jl/DifferentiationInterfaceTest/dev/tutorial/).
 It provides benchmarking utilities to compare backends and help you select the one that is best suited for your problem.
+
+## [Handling sparsity](@id sparsity-tutorial)
+
+To compute sparse Jacobians or Hessians, you need three ingredients (read [this survey](https://epubs.siam.org/doi/10.1137/S0036144504444711) to understand why):
+
+1. A sparsity pattern detector
+2. A coloring algorithm
+3. An underlying AD backend
+
+ADTypes.jl v1.0 defines the [`AutoSparse`](@ref) wrapper, which brings together these three ingredients.
+At the moment, this new wrapper is not well-supported in the ecosystem, which is why DifferentiationInterface.jl provides the necessary objects to get you started:
+
+1. [`SymbolicsSparsityDetector`](@ref) (requires [Symbolics.jl](https://github.com/JuliaSymbolics/Symbolics.jl) to be loaded)
+2. [`GreedyColoringAlgorithm`](@ref)
+
+!!! warning
+    These objects are not part of the public API, so they can change unexpectedly between versions.
+
+!!! info
+    The symbolic backends have built-in sparsity handling, so `AutoSparse(AutoSymbolics())` and `AutoSparse(AutoFastDifferentiation())` do not need additional configuration.
```
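The Hessian comparison that follows relies on definitions elided from this diff (`f`, `x`, `sparse_backend`, `dense_backend`). A minimal setup consistent with those names might look like the sketch below; the concrete function and backend choices are assumptions, not content from the PR.

```julia
# Hypothetical setup for the sparse-vs-dense Hessian comparison below.
using ADTypes: AutoSparse, AutoForwardDiff
using DifferentiationInterface
import ForwardDiff, Symbolics

dense_backend = AutoForwardDiff()
sparse_backend = AutoSparse(
    dense_backend;
    sparsity_detector=DifferentiationInterface.SymbolicsSparsityDetector(),
    coloring_algorithm=DifferentiationInterface.GreedyColoringAlgorithm(),
)

f(x) = sum(abs2, diff(x))  # its Hessian is tridiagonal, hence sparse
x = collect(1.0:6.0)
```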
````diff
+See how the computed Hessian is sparse, whereas the underlying backend alone would give us a dense matrix:
+
+```@example tuto
+hessian(f, sparse_backend, x)
+```
+
+```@example tuto
+hessian(f, dense_backend, x)
+```
+
+The sparsity detector and coloring algorithm are called during the preparation step, which can be fairly expensive.
+If you plan to compute several Jacobians or Hessians with the same pattern but different input vectors, you should reuse the `extras` object created by `prepare_jacobian` or `prepare_hessian`.
+After preparation, the sparse computation itself will be much faster than the dense one, and require fewer calls to the function.
````
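The preparation-reuse advice in these added lines can be sketched as follows. The exact signatures of `prepare_hessian` and the `extras`-accepting `hessian` method have changed across DifferentiationInterface.jl versions, so treat the argument order here as an assumption and consult the API reference.

```julia
# Sketch of amortizing the preparation cost over many inputs (assumed API).
using ADTypes: AutoSparse, AutoForwardDiff
using DifferentiationInterface
import ForwardDiff, Symbolics

sparse_backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector=DifferentiationInterface.SymbolicsSparsityDetector(),
    coloring_algorithm=DifferentiationInterface.GreedyColoringAlgorithm(),
)

f(x) = sum(abs2, diff(x))

# Sparsity detection and coloring happen once, during preparation...
extras = prepare_hessian(f, sparse_backend, rand(10))

# ...and are reused for every input with the same sparsity pattern.
for _ in 1:100
    H = hessian(f, sparse_backend, rand(10), extras)
end
```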