We support all dense backend choices from [ADTypes.jl](https://github.com/SciML/ADTypes.jl), as well as their sparse wrapper `AutoSparse`.

For sparse backends, only the Jacobian and Hessian operators are implemented differently; the other operators behave the same as for the corresponding dense backend.
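
For instance, here is a minimal sketch of a sparse Jacobian computation. It assumes ForwardDiff.jl is installed as the underlying dense backend; depending on package versions, `AutoSparse` may also accept keyword arguments to customize sparsity detection and coloring.

```julia
using DifferentiationInterface  # re-exports the backend types from ADTypes.jl
import ForwardDiff

f(x) = diff(x) .^ 2  # banded Jacobian, hence sparse

dense_backend = AutoForwardDiff()
sparse_backend = AutoSparse(dense_backend)

x = collect(1.0:5.0)
J = jacobian(f, sparse_backend, x)  # should come back in a sparse storage format
```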

## Availability

You can use [`check_available`](@ref) to verify whether a given backend is loaded, as shown below.
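
The following is a sketch; the result depends on which backend packages are loaded in your session:

```julia
using DifferentiationInterface
import ForwardDiff

check_available(AutoForwardDiff())  # true, since ForwardDiff.jl is loaded
check_available(AutoZygote())       # false unless Zygote.jl is also loaded
```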

*(from `DifferentiationInterface/docs/src/overview.md`)*

### Sparsity

[ADTypes.jl](https://github.com/SciML/ADTypes.jl) provides `AutoSparse` to accelerate the computation of sparse Jacobians and Hessians:

- for sparse Jacobians, wrap `AutoSparse` around a first-order backend.
- for sparse Hessians, wrap `AutoSparse` around a [`SecondOrder`](@ref) backend, in which case the Hessian is obtained as the sparse Jacobian of the gradient (see the sketch below).
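
Here is a hedged sketch of the second case, assuming ForwardDiff.jl and Zygote.jl are installed and combined in forward-over-reverse mode:

```julia
using DifferentiationInterface
import ForwardDiff, Zygote

f(x) = sum(abs2, diff(x))  # tridiagonal Hessian, hence sparse

# the outer backend differentiates the gradient computed by the inner backend
backend = AutoSparse(SecondOrder(AutoForwardDiff(), AutoZygote()))

x = randn(10)
H = hessian(f, backend, x)
```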
### Split reverse mode
## Going further

### Non-standard types

Restricting the API to one input and one output has many coding advantages, but it is not very flexible.
If you need more than that, use [ComponentArrays.jl](https://github.com/jonniedie/ComponentArrays.jl) to wrap several objects inside a single `ComponentVector`.
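
As a sketch (the field names `weights` and `bias` are invented for illustration, and ForwardDiff.jl is assumed to be installed):

```julia
using ComponentArrays: ComponentVector
using DifferentiationInterface
import ForwardDiff

# bundle two parameter blocks into a single vector-like input
params = ComponentVector(weights=ones(3), bias=0.5)

loss(p) = sum(abs2, p.weights) + abs2(p.bias)

g = gradient(loss, AutoForwardDiff(), params)
g.weights  # the gradient block corresponding to `weights`
```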
### Batched evaluation
This is not supported at the moment, but we plan to allow several pushforward / pullback seeds at once (similar to the chunking in ForwardDiff.jl or the batches in Enzyme.jl).