# DifferentiationInterface
An interface to various automatic differentiation (AD) backends in Julia.
This package provides a unified syntax to differentiate functions, including:
- First- and second-order operators (gradients, Jacobians, Hessians and more)
- In-place and out-of-place differentiation
- Preparation mechanism (e.g. to pre-allocate a cache or record a tape)
- Built-in sparsity handling (see the sketch below)
- Thorough validation on standard inputs and outputs (numbers, vectors, matrices)
- Testing and benchmarking utilities accessible to users with [DifferentiationInterfaceTest](https://github.com/JuliaDiff/DifferentiationInterface.jl/tree/main/DifferentiationInterfaceTest)
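The built-in sparsity handling works through the `AutoSparse` wrapper from ADTypes.jl. Here is a minimal sketch, assuming the sparsity detector and coloring algorithm come from SparseConnectivityTracer.jl and SparseMatrixColorings.jl respectively:

```julia
using DifferentiationInterface
using ForwardDiff: ForwardDiff
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm

# Wrap a dense backend with sparsity detection and coloring (keyword names
# reflect our reading of the current ADTypes.jl API).
backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector=TracerSparsityDetector(),
    coloring_algorithm=GreedyColoringAlgorithm(),
)

f(x) = diff(x .^ 2)            # Jacobian has a banded sparsity pattern
jacobian(f, backend, rand(5))  # returned as a sparse matrix
```

The dense backend inside `AutoSparse` still performs the actual differentiation; detection and coloring only determine how few differentiation sweeps are needed to fill the sparse matrix.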
## Compatibility
We support the following backends defined by [ADTypes.jl](https://github.com/SciML/ADTypes.jl):
> Note that in some cases, going through DifferentiationInterface.jl might be slower or cause more errors than a direct call to the backend's API. This is especially true for Enzyme.jl, whose handling of activities and multiple arguments is not fully supported here. We are working on this challenge, and welcome any suggestions or contributions. Meanwhile, if differentiation fails or takes too long, consider using Enzyme.jl through its [native API](https://enzymead.github.io/Enzyme.jl/stable/) instead.
## Example
```julia
using DifferentiationInterface
using ForwardDiff: ForwardDiff
using Enzyme: Enzyme
using Zygote: Zygote  # AD backends you want to use

f(x) = sum(abs2, x)

x = [1.0, 2.0]

value_and_gradient(f, AutoForwardDiff(), x)  # returns (5.0, [2.0, 4.0]) with ForwardDiff.jl
value_and_gradient(f, AutoEnzyme(), x)       # returns (5.0, [2.0, 4.0]) with Enzyme.jl
value_and_gradient(f, AutoZygote(), x)       # returns (5.0, [2.0, 4.0]) with Zygote.jl
```
To improve your performance by up to several orders of magnitude compared to this example, take a look at the tutorial and its section on operator preparation.
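As a taste of what preparation looks like, here is a minimal sketch; it assumes the preparation API of recent package versions, where the `prep` object is passed as the second argument:

```julia
using DifferentiationInterface
using ForwardDiff: ForwardDiff

f(x) = sum(abs2, x)
backend = AutoForwardDiff()

# Prepare once for a given input size (pre-allocates caches, records tapes)...
prep = prepare_gradient(f, backend, zeros(2))

# ...then reuse the preparation across many inputs of the same size.
for _ in 1:1000
    value_and_gradient(f, prep, backend, rand(2))
end
```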
## Citation
Whenever you refer to this package or the ideas it contains, please cite:
1. our preprint [*A Common Interface for Automatic Differentiation*](https://arxiv.org/abs/2505.05542);
You can use the provided [`CITATION.cff`](https://github.com/JuliaDiff/DifferentiationInterface.jl/blob/main/CITATION.cff) file or the following BibTeX entries:
## Dev guide

The following comes from `DifferentiationInterface/docs/src/dev_guide.md`; it is not part of the public API and the content below may become outdated.
The package is structured around 8 [operators](@ref Operators):
- [`derivative`](@ref)
- [`second_derivative`](@ref)
- [`gradient`](@ref)
- [`jacobian`](@ref)
- [`hessian`](@ref)
- [`pushforward`](@ref)
- [`pullback`](@ref)
- [`hvp`](@ref)
Most operators have 4 variants, which look like this at first order: `operator`, `operator!`, `value_and_operator`, `value_and_operator!`.
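For instance, here is what the four variants look like for `gradient` (a minimal sketch with ForwardDiff.jl as the backend; the in-place variants overwrite a preallocated result):

```julia
using DifferentiationInterface
using ForwardDiff: ForwardDiff

f(x) = sum(abs2, x)
backend = AutoForwardDiff()
x = [1.0, 2.0]
grad = similar(x)

gradient(f, backend, x)                   # out-of-place
gradient!(f, grad, backend, x)            # in-place, overwrites grad
value_and_gradient(f, backend, x)         # also returns f(x)
value_and_gradient!(f, grad, backend, x)  # in-place and also returns f(x)
```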
In the main package, you should define a new struct `SuperDiffBackend` which subtypes [`ADTypes.AbstractADType`](@extref).
You also have to define [`ADTypes.mode`](@extref) and [`DifferentiationInterface.inplace_support`](@ref) on `SuperDiffBackend`.
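A minimal sketch of these definitions, where `SuperDiffBackend` is hypothetical and the trait values (`ForwardMode`, `InPlaceSupported`) are illustrative choices rather than requirements:

```julia
# Illustrative sketch only: the backend is hypothetical, and the trait
# return values below are example choices for a forward-mode,
# in-place-capable backend.
import ADTypes
import DifferentiationInterface as DI

struct SuperDiffBackend <: ADTypes.AbstractADType end

ADTypes.mode(::SuperDiffBackend) = ADTypes.ForwardMode()
DI.inplace_support(::SuperDiffBackend) = DI.InPlaceSupported()
```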
!!! info

    In the end, this backend struct will need to be contributed to [ADTypes.jl](https://github.com/SciML/ADTypes.jl).
    However, putting it in the DifferentiationInterface.jl PR is a good first step for debugging.
In a [package extension](https://pkgdocs.julialang.org/v1/creating-packages/#Conditional-loading-of-code-in-packages-(Extensions)) named `DifferentiationInterfaceSuperDiffExt`, you need to implement at least [`pushforward`](@ref) or [`pullback`](@ref) (and their variants).
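For intuition, a pushforward computes a Jacobian-vector product `t -> J_f(x) * t` and a pullback a vector-Jacobian product `v -> J_f(x)' * v`. The naive finite-difference sketch below shows the underlying linear algebra only; it is not the method signature an extension must implement:

```julia
using LinearAlgebra

# Approximates the pushforward J_f(x) * t with a forward finite difference.
naive_pushforward(f, x, t; h=1e-6) = (f(x + h * t) - f(x)) / h

# Approximates the pullback J_f(x)' * v by probing every basis direction;
# a real reverse-mode backend computes this in a single sweep instead.
function naive_pullback(f, x, v; h=1e-6)
    n = length(x)
    basis(i) = [j == i ? 1.0 : 0.0 for j in 1:n]
    [dot(v, naive_pushforward(f, x, basis(i); h=h)) for i in 1:n]
end
```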
The exact requirements depend on the differentiation mode you chose: