value_and_gradient(f, AutoForwardDiff(), x) # returns (5.0, [2.0, 4.0]) with ForwardDiff.jl
value_and_gradient(f, AutoEnzyme(), x) # returns (5.0, [2.0, 4.0]) with Enzyme.jl
value_and_gradient(f, AutoZygote(), x) # returns (5.0, [2.0, 4.0]) with Zygote.jl
```
For more performance, take a look at the [DifferentiationInterface tutorial](https://gdalle.github.io/DifferentiationInterface.jl/DifferentiationInterface/stable/tutorial/).
We support all dense backend choices from [ADTypes.jl](https://github.com/SciML/ADTypes.jl), as well as their sparse wrapper [`AutoSparse`](@ref).
For sparse backends, only the Jacobian and Hessian operators are implemented differently; the other operators behave the same as for the corresponding dense backend.
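
As a hedged sketch, wrapping a dense backend in `AutoSparse` might look as follows. The bare constructor call is an assumption: depending on the version, `AutoSparse` may require an explicit sparsity detector and coloring algorithm.

```julia
using DifferentiationInterface
import ForwardDiff  # provides the AutoForwardDiff() backend

# Wrap a dense backend to exploit sparsity (constructor defaults assumed;
# some versions need a sparsity detector and a coloring algorithm).
sparse_backend = AutoSparse(AutoForwardDiff())

f(x) = x .^ 2  # element-wise function, so the Jacobian is diagonal
J = jacobian(f, sparse_backend, rand(4))
```
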
```@example backends
backend_table #hide
```

## Checks

### Availability

You can use [`check_available`](@ref) to verify whether a given backend is loaded.
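
For instance, a minimal sketch (assuming the backend package has been loaded alongside DifferentiationInterface.jl):

```julia
using DifferentiationInterface
import ForwardDiff  # without this import, the backend may be unavailable

check_available(AutoForwardDiff())  # true once ForwardDiff.jl is loaded
```
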

### Support for two-argument functions

All backends are compatible with one-argument functions `f(x) = y`.
Only some are compatible with two-argument functions `f!(y, x) = nothing`.
You can check this compatibility using [`check_twoarg`](@ref).
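
A hedged sketch of such a check; the specific return values are assumptions about the respective backends (Zygote.jl famously forbids mutation):

```julia
using DifferentiationInterface
import ForwardDiff, Zygote

check_twoarg(AutoForwardDiff())  # ForwardDiff.jl can handle f!(y, x)
check_twoarg(AutoZygote())       # Zygote.jl cannot, since it forbids mutation
```
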

### Support for Hessians

Only some backends are able to compute Hessians.
You can use [`check_hessian`](@ref) to check this feature (beware that it will try to compute a small Hessian, so it is not instantaneous).
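
A minimal sketch, assuming ForwardDiff.jl is loaded:

```julia
using DifferentiationInterface
import ForwardDiff

check_hessian(AutoForwardDiff())  # computes a small test Hessian, hence not free
```
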
## API reference
!!! warning
    The following documentation has been borrowed from ADTypes.jl.
    Refer to the [ADTypes documentation](https://sciml.github.io/ADTypes.jl/stable/) for more information.
In order to ensure symmetry between one-argument functions `f(x) = y` and two-argument functions `f!(y, x) = nothing`, we define the same operators for both cases.

| | out-of-place operator | in-place operator |
|:---|:---|:---|
|`f(x)`|`operator(f, backend, x, [v], [extras])`|`operator!(f, result, backend, x, [v], [extras])`|
|`f!(y, x)`|`operator(f!, y, backend, x, [v], [extras])`|`operator!(f!, y, result, backend, x, [v], [extras])`|

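
To make the syntax concrete, here is a hedged sketch using the `gradient` operator on a one-argument function, with the argument order taken from the table above:

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)
x = [1.0, 2.0]
backend = AutoForwardDiff()

grad = gradient(f, backend, x)    # out-of-place: allocates the result, [2.0, 4.0]
result = similar(x)
gradient!(f, result, backend, x)  # in-place: overwrites `result` instead
```
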
!!! warning
    Our mutation convention is that all positional arguments between `f`/`f!` and `backend` are mutated (the `extras` as well, see below).
    This convention holds regardless of the bang `!` in the operator name, because we assume that a user passing a two-argument function `f!(y, x)` anticipates mutation anyway.
    Still, be careful with two-argument functions, because every variant of the operator will mutate `y`... even if it does not have a `!` in its name (see the bottom left cell in the table).

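
As an illustration of this convention, here is a hedged sketch with a two-argument function, using the operator signature from the table above: even the bang-less `jacobian` overwrites `y`.

```julia
using DifferentiationInterface
import ForwardDiff

f!(y, x) = (y .= 2 .* x; nothing)
x = [1.0, 2.0, 3.0]
y = zeros(3)

J = jacobian(f!, y, AutoForwardDiff(), x)  # no `!`, yet y now holds f(x) = 2x
```
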
## Preparation

Unsurprisingly, preparation syntax depends on the number of arguments:

| | preparation |
|:---|:---|
|`f(x)`|`prepare_operator(f, backend, x, [v])`|
|`f!(y, x)`|`prepare_operator(f!, y, backend, x, [v])`|

The preparation `prepare_operator(f, backend, x)` will create an object called `extras` containing the necessary information to speed up `operator` and its variants.
This information is specific to `backend` and `f`, as well as the _type and size_ of the input `x` and the _control flow_ within the function, but it should work with different _values_ of `x`.
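
A hedged sketch of this workflow, with the `extras` argument placed last as in the operator table:

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)
backend = AutoForwardDiff()
x = rand(10)

extras = prepare_gradient(f, backend, x)
for _ in 1:100
    x = rand(10)                     # same type and size, different values
    gradient(f, backend, x, extras)  # reuses the preparation work
end
```
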
We offer two ways to perform second-order differentiation (for [`second_derivative`](@ref) and the other second-order operators).
At the moment, trial and error is your best friend.
Usually, the most efficient approach for Hessians is forward-over-reverse, i.e. a forward-mode outer backend and a reverse-mode inner backend.
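
A hedged sketch of such a combination; the argument order `SecondOrder(outer, inner)` is an assumption:

```julia
using DifferentiationInterface
import ForwardDiff, Zygote

f(x) = sum(abs2, x)
# forward-over-reverse: forward-mode outer backend, reverse-mode inner backend
backend = SecondOrder(AutoForwardDiff(), AutoZygote())
H = hessian(f, backend, [1.0, 2.0])  # mathematically, the Hessian of f is 2I
```
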
!!! warning
    Preparation does not yet work for the inner differentiation step of a `SecondOrder`; only the outer differentiation is prepared.

## Experimental
!!! danger
We make this available for all backends with the following operators:

### Translation

The wrapper [`DifferentiateWith`](@ref) allows you to translate between AD backends.
It takes a function `f` and specifies that `f` should be differentiated with the backend of your choice, instead of whatever other backend the code is trying to use.
In other words, when someone tries to differentiate `dw = DifferentiateWith(f, backend1)` with `backend2`, then `backend1` steps in and `backend2` does nothing.
At the moment, `DifferentiateWith` only works when `backend2` supports [ChainRules.jl](https://github.com/JuliaDiff/ChainRules.jl).
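
A hedged sketch of this translation, using Zygote.jl as the calling backend (it consumes ChainRules.jl rules) and Enzyme.jl as the substitute:

```julia
using DifferentiationInterface
import Enzyme, Zygote

f(x) = sum(abs2, x)
dw = DifferentiateWith(f, AutoEnzyme())

# Zygote is the caller (`backend2`), but Enzyme does the actual work,
# through the ChainRules rule attached to DifferentiateWith.
Zygote.gradient(dw, [1.0, 2.0])
```
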
Not only is it blazingly fast, but you also achieved this speedup without looking at the docs of either ForwardDiff.jl or Enzyme.jl!
In short, DifferentiationInterface.jl allows for easy testing and comparison of AD backends.
If you want to go further, check out the [DifferentiationInterfaceTest.jl tutorial](https://gdalle.github.io/DifferentiationInterface.jl/DifferentiationInterfaceTest/dev/tutorial/).
It provides benchmarking utilities to compare backends and help you select the one that is best suited for your problem.