Commit d51fc0a

Improve docs and document contexts (#469)
* Improve docs and document contexts
* One less section
1 parent 267023a commit d51fc0a

16 files changed

Lines changed: 498 additions & 663 deletions

DifferentiationInterface/README.md

Lines changed: 4 additions & 6 deletions
````diff
@@ -17,20 +17,18 @@ An interface to various automatic differentiation (AD) backends in Julia.
 
 ## Goal
 
-This package provides a unified syntax to differentiate functions.
-
-## Features
+This package provides a unified syntax to differentiate functions, including:
 
 - First- and second-order operators (gradients, Jacobians, Hessians and more)
 - In-place and out-of-place differentiation
-- Preparation mechanism (e.g. to create a config or tape)
+- Preparation mechanism (e.g. to pre-allocate a cache or record a tape)
 - Built-in sparsity handling
 - Thorough validation on standard inputs and outputs (numbers, vectors, matrices)
 - Testing and benchmarking utilities accessible to users with [DifferentiationInterfaceTest](https://github.com/gdalle/DifferentiationInterface.jl/tree/main/DifferentiationInterfaceTest)
 
 ## Compatibility
 
-We support all of the backends defined by [ADTypes.jl](https://github.com/SciML/ADTypes.jl):
+We support the following backends defined by [ADTypes.jl](https://github.com/SciML/ADTypes.jl):
 
 - [ChainRulesCore.jl](https://github.com/JuliaDiff/ChainRulesCore.jl)
 - [Diffractor.jl](https://github.com/JuliaDiff/Diffractor.jl)
@@ -87,7 +85,7 @@ value_and_gradient(f, AutoEnzyme(), x) # returns (5.0, [2.0, 4.0]) with Enz
 value_and_gradient(f, AutoZygote(), x) # returns (5.0, [2.0, 4.0]) with Zygote.jl
 ```
 
-To improve your performance by up to several orders of magnitude compared to this example, take a look at the [DifferentiationInterface tutorial](https://gdalle.github.io/DifferentiationInterface.jl/DifferentiationInterface/stable/tutorial1/) and its section on operator preparation.
+To improve your performance by up to several orders of magnitude compared to this example, take a look at the tutorial and its section on operator preparation.
 
 ## Citation
````
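As a worked illustration of the operator preparation mentioned in the README diff above, here is a hedged sketch (not part of the commit). It assumes the v0.6-style API, where the preparation result is passed right after the function, and that the `AutoForwardDiff` backend type is reexported by DifferentiationInterface (otherwise import it from ADTypes):

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)
x = [1.0, 2.0]
backend = AutoForwardDiff()

# Prepare once: this is where caches are pre-allocated
# (or tapes recorded, depending on the backend)
prep = prepare_gradient(f, backend, x)

# Reuse the preparation result for many inputs of the same type and shape
grad = gradient(f, prep, backend, x)  # [2.0, 4.0], since ∇(Σxᵢ²) = 2x
```

The one-time preparation cost is amortized over repeated calls, which is where the "several orders of magnitude" speedup mentioned in the README can come from.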

DifferentiationInterface/docs/Project.toml

Lines changed: 0 additions & 10 deletions
````diff
@@ -1,26 +1,16 @@
 [deps]
 ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
 BenchmarkTools = "6e4b80f9-dd63-53aa-95a3-0cdb28fa8baf"
-ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"
 DifferentiationInterface = "a0c0ee7d-e4b9-4e03-894e-1c5f64a51d63"
-Diffractor = "9f5e2b26-1114-432f-b630-d3fe2085c51c"
 Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
 DocumenterInterLinks = "d12716ef-a0f6-4df4-a9f1-a5a34e75c656"
 DocumenterMermaid = "a078cd44-4d9c-4618-b545-3ab9d77f9177"
 Enzyme = "7da242da-08ed-463a-9acd-ee780be4f1d9"
-FastDifferentiation = "eb9bf01b-bf85-4b60-bf87-ee5de06c00be"
-FiniteDiff = "6a86dc24-6348-571c-b903-95158fe2bd41"
-FiniteDifferences = "26cc04aa-876d-5657-8c51-4c34ba976000"
 ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
 Markdown = "d6f4376e-aef5-505a-96c1-9c027394607a"
-PolyesterForwardDiff = "98d1487c-24ca-40b6-b7ab-df2af84e126b"
 PrettyTables = "08abe8d2-0d0c-5749-adfa-8a2ac140af0d"
-ReverseDiff = "37e2e3b7-166d-5795-8a7a-e32c996b4267"
 SparseConnectivityTracer = "9f842d2f-2579-4b1d-911e-f412cf18a3f5"
 SparseMatrixColorings = "0a514795-09f3-496d-8182-132a7b665d35"
-Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7"
-Tapir = "07d77754-e150-4737-8c94-cd238a1fb45b"
-Tracker = "9f7883ad-71c0-57eb-9f7f-b5c9e6d3789c"
 Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"
 
 [compat]
````

DifferentiationInterface/docs/make.jl

Lines changed: 8 additions & 12 deletions
````diff
@@ -6,17 +6,8 @@ using DocumenterMermaid
 using DocumenterInterLinks
 
 using ADTypes: ADTypes
-using Diffractor: Diffractor
 using Enzyme: Enzyme
-using FastDifferentiation: FastDifferentiation
-using FiniteDiff: FiniteDiff
-using FiniteDifferences: FiniteDifferences
 using ForwardDiff: ForwardDiff
-using PolyesterForwardDiff: PolyesterForwardDiff
-using ReverseDiff: ReverseDiff
-using Symbolics: Symbolics
-using Tapir: Tapir
-using Tracker: Tracker
 using Zygote: Zygote
 
 links = InterLinks(
@@ -35,9 +26,14 @@ makedocs(;
     format=Documenter.HTML(; assets=["assets/favicon.ico"]),
     pages=[
         "Home" => "index.md",
-        "Tutorials" => ["tutorial1.md", "tutorial2.md"],
-        "Reference" => ["operators.md", "backends.md", "api.md"],
-        "Advanced" => ["dev_guide.md", "implementations.md"],
+        "Tutorials" => ["tutorials/basic.md", "tutorials/advanced.md"],
+        "Explanation" => [
+            "explanation/operators.md",
+            "explanation/backends.md",
+            "explanation/advanced.md",
+        ],
+        "api.md",
+        "dev_guide.md",
     ],
     plugins=[links],
 )
````

DifferentiationInterface/docs/src/api.md

Lines changed: 5 additions & 1 deletion
````diff
@@ -8,12 +8,16 @@ CollapsedDocStrings = true
 DifferentiationInterface
 ```
 
-## First order
+## Argument wrappers
 
 ```@docs
+Context
+Constant
 Tangents
 ```
 
+## First order
+
 ### Pushforward
 
 ```@docs
````

DifferentiationInterface/docs/src/backends.md

Lines changed: 0 additions & 107 deletions
This file was deleted.

DifferentiationInterface/docs/src/dev_guide.md

Lines changed: 1 addition & 4 deletions
````diff
@@ -1,10 +1,7 @@
 # Dev guide
 
 This page is important reading if you want to contribute to DifferentiationInterface.jl.
-It is not part of the public API.
-
-!!! warning
-    The content below may become outdated, in which case you should refer to the source code as the ground truth.
+It is not part of the public API and the content below may become outdated, in which case you should refer to the source code as the ground truth.
 
 ## General principles
````

Lines changed: 61 additions & 0 deletions

````diff
@@ -0,0 +1,61 @@
+# Advanced features
+
+## Contexts
+
+### Additional arguments
+
+For all operators provided by DifferentiationInterface, there can be only one differentiated (or "active") argument, which we call `x`.
+However, release v0.6 introduced the possibility of additional "context" arguments, which are not differentiated but still passed to the function after `x`.
+
+Contexts can be useful if you have a function `y = f(x, a, b, c, ...)` or `f!(y, x, a, b, c, ...)` and you want derivatives of `y` with respect to `x` only.
+Another option would be creating a closure, but that is sometimes undesirable.
+
+!!! warning
+    This feature is still experimental and will likely not be supported by all backends.
+    At the moment, it only works with ForwardDiff.
+
+### Types of contexts
+
+Every context argument must be wrapped in a subtype of [`Context`](@ref) and come after the differentiated input `x`.
+Right now, there is only one kind of context, namely [`Constant`](@ref), but we might add more.
+Semantically, calling
+
+```julia
+gradient(f, backend, x, Constant(c))
+```
+
+computes the partial gradient of `f(x, c)` with respect to `x`, while keeping `c` constant.
+Importantly, one can prepare an operator with an arbitrary value `c'` of the constant (subject to the usual restrictions on preparation).
+
````
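To make the contexts mechanism concrete, here is a hedged sketch (not part of the commit) of differentiating with respect to `x` while holding a scale factor constant. It uses ForwardDiff, since the text above says that is the only backend supported at the moment, and assumes `AutoForwardDiff` is reexported by DifferentiationInterface:

```julia
using DifferentiationInterface
import ForwardDiff

f(x, c) = c * sum(abs2, x)  # c is a non-differentiated "context" argument
x = [1.0, 2.0]

# Partial gradient of f(x, c) with respect to x, holding c = 3.0 constant
g = gradient(f, AutoForwardDiff(), x, Constant(3.0))  # [6.0, 12.0]
```

Compared to the closure `x -> f(x, 3.0)`, passing `Constant(3.0)` keeps `f` itself intact, which lets the same prepared operator be reused with different constant values.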
````diff
+## Sparsity
+
+When faced with sparse Jacobian or Hessian matrices, one can take advantage of their sparsity pattern to speed up the computation.
+DifferentiationInterface does this automatically if you pass a backend of type [`AutoSparse`](@extref ADTypes.AutoSparse).
+
+!!! tip
+    To know more about sparse AD, read the survey [_What Color Is Your Jacobian? Graph Coloring for Computing Derivatives_](https://epubs.siam.org/doi/10.1137/S0036144504444711) (Gebremedhin et al., 2005).
+
+### `AutoSparse` object
+
+An `AutoSparse` backend must be constructed from three ingredients:
+
+1. An underlying (dense) backend
+2. A sparsity pattern detector like:
+   - [`TracerSparsityDetector`](@extref SparseConnectivityTracer.TracerSparsityDetector) from [SparseConnectivityTracer.jl](https://github.com/adrhill/SparseConnectivityTracer.jl)
+   - [`SymbolicsSparsityDetector`](@extref Symbolics.SymbolicsSparsityDetector) from [Symbolics.jl](https://github.com/JuliaSymbolics/Symbolics.jl)
+   - [`DenseSparsityDetector`](@ref) from DifferentiationInterface.jl (beware that this detector only gives a locally valid pattern)
+3. A coloring algorithm: [`GreedyColoringAlgorithm`](@extref SparseMatrixColorings.GreedyColoringAlgorithm) from [SparseMatrixColorings.jl](https://github.com/gdalle/SparseMatrixColorings.jl) is the only one we support. As a result, sparse AD is now located in a package extension which depends on SparseMatrixColorings.jl.
+
+`AutoSparse` backends only support [`jacobian`](@ref) and [`hessian`](@ref) (as well as their variants), because other operators do not output matrices.
+To obtain sparse Hessians, you need to put the `SecondOrder` backend inside `AutoSparse`, and not the other way around.
+
````
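The three ingredients listed above can be assembled as follows. This is a hedged sketch, not part of the commit; the keyword names follow the `ADTypes.AutoSparse` constructor, and the banded example function is an assumption chosen for illustration:

```julia
using ADTypes: AutoSparse
using DifferentiationInterface
import ForwardDiff
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm

sparse_backend = AutoSparse(
    AutoForwardDiff();                            # 1. underlying dense backend
    sparsity_detector=TracerSparsityDetector(),   # 2. sparsity pattern detector
    coloring_algorithm=GreedyColoringAlgorithm(), # 3. coloring algorithm
)

f(x) = diff(x) .^ 2  # each output depends on two inputs: banded Jacobian
x = rand(10)
J = jacobian(f, sparse_backend, x)  # computed column-group by column-group
```

With the pattern detected and colored during preparation, the Jacobian needs only as many dense sweeps as there are colors, instead of one per input dimension.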
````diff
+!!! note
+    Symbolic backends have built-in sparsity handling, so `AutoSparse(AutoSymbolics())` and `AutoSparse(AutoFastDifferentiation())` do not need additional configuration for pattern detection or coloring.
+
+### Cost of sparse preparation
+
+The preparation step of `jacobian` or `hessian` with an `AutoSparse` backend can be long, because it needs to detect the sparsity pattern and perform a matrix coloring.
+But after preparation, the more zeros are present in the matrix, the greater the speedup will be compared to dense differentiation.
+
+!!! danger
+    The result of preparation for an `AutoSparse` backend cannot be reused if the sparsity pattern changes.
````

0 commit comments
