DifferentiationInterface/docs/src/dev_guide.md (4 additions, 4 deletions)
@@ -23,10 +23,10 @@ Most operators have 4 variants, which look like this in the first order: `operat
 To implement a new operator for an existing backend, you need to write 5 methods: 1 for [preparation](@ref Preparation) and 4 corresponding to the variants of the operator (see above).
 For first-order operators, you may also want to support [in-place functions](@ref "Mutation and signatures"), which requires another 5 methods (defined on `f!` instead of `f`).
 
-The method `prepare_operator` must output an `extras` object of the correct type.
-For instance, `prepare_gradient(f, backend, x)` must return a [`DifferentiationInterface.GradientExtras`](@ref).
-Assuming you don't need any preparation for said operator, you can use the trivial extras that are already defined, like `DifferentiationInterface.NoGradientExtras`.
-Otherwise, define a custom struct like `MyGradientExtras <: DifferentiationInterface.GradientExtras` and put the necessary storage in there.
+The method `prepare_operator` must output a `prep` object of the correct type.
+For instance, `prepare_gradient(f, backend, x)` must return a [`DifferentiationInterface.GradientPrep`](@ref).
+Assuming you don't need any preparation for said operator, you can use the trivial prep that is already defined, like `DifferentiationInterface.NoGradientPrep`.
+Otherwise, define a custom struct like `MyGradientPrep <: DifferentiationInterface.GradientPrep` and put the necessary storage in there.
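For illustration, here is a minimal sketch of the workflow the updated guide describes, assuming a hypothetical `MyBackend` and showing only the preparation method plus one of the four operator variants (a naive finite-difference gradient for vector inputs). Only `prepare_gradient`, `gradient`, `GradientPrep`, and `AbstractADType` come from the actual packages; everything else is invented for the example.

```julia
using ADTypes: AbstractADType
import DifferentiationInterface as DI

# Hypothetical backend type, defined only for this sketch.
struct MyBackend <: AbstractADType end

# Custom prep struct holding whatever storage the operator needs.
struct MyGradientPrep{C} <: DI.GradientPrep
    cache::C
end

# Preparation must return a prep object of the correct type.
DI.prepare_gradient(f, ::MyBackend, x) = MyGradientPrep(similar(x))

# One of the 4 variants: a naive forward finite-difference gradient
# (assuming `x` is a floating-point vector) that reuses the cache
# allocated during preparation.
function DI.gradient(f, prep::MyGradientPrep, ::MyBackend, x)
    h = sqrt(eps(eltype(x)))
    fx = f(x)
    for i in eachindex(x)
        x_shifted = copy(x)
        x_shifted[i] += h
        prep.cache[i] = (f(x_shifted) - fx) / h
    end
    return copy(prep.cache)
end
```

The remaining variants (`gradient!`, `value_and_gradient`, `value_and_gradient!`) would be written along the same lines.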
DifferentiationInterface/docs/src/explanation/operators.md (6 additions, 6 deletions)
@@ -107,28 +107,28 @@ In addition, the preparation syntax depends on the number of arguments accepted
 | out-of-place function | `prepare_op(f, backend, x, [t])`     |
 | in-place function     | `prepare_op(f!, y, backend, x, [t])` |
 
-Preparation creates an object called `extras` which contains the necessary information to speed up an operator and its variants.
-The idea is that you prepare only once, which can be costly, but then call the operator several times while reusing the same `extras`.
+Preparation creates an object called `prep` which contains the necessary information to speed up an operator and its variants.
+The idea is that you prepare only once, which can be costly, but then call the operator several times while reusing the same `prep`.
 
 ```julia
 op(f, backend, x, [t])          # slow because it includes preparation
-op(f, extras, backend, x, [t])  # fast because it skips preparation
+op(f, prep, backend, x, [t])    # fast because it skips preparation
 ```
 
 !!! warning
-    The `extras` object is the last argument before `backend` and it is always mutated, regardless of the bang `!` in the operator name.
+    The `prep` object is the last argument before `backend` and it is always mutated, regardless of the bang `!` in the operator name.
 
 ### Reusing preparation
 
 Deciding whether it is safe to reuse the results of preparation is not easy.
 Here are the general rules that we strive to implement:
 
-For different-point preparation, the output `extras` of `prepare_op(f, b, x, [t])` can be reused in `op(f, extras, b, other_x, [other_t])`, provided that:
+For different-point preparation, the output `prep` of `prepare_op(f, b, x, [t])` can be reused in `op(f, prep, b, other_x, [other_t])`, provided that:
 
 - the inputs `x` and `other_x` have similar types and equal shapes
 - the tangents in `t` and `other_t` have similar types and equal shapes
 
-For same-point preparation, the output `extras` of `prepare_op_same_point(f, b, x, [t])` can be reused in `op(f, extras, b, x, other_t)`, provided that:
+For same-point preparation, the output `prep` of `prepare_op_same_point(f, b, x, [t])` can be reused in `op(f, prep, b, x, other_t)`, provided that:
 
 - the input `x` remains the same
 - the tangents in `t` and `other_t` have similar types and equal shapes
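For context, here is a small usage sketch of the preparation workflow and the different-point reuse rule above, assuming ForwardDiff.jl is installed (any supported backend works the same way):

```julia
using DifferentiationInterface
using ADTypes: AutoForwardDiff
import ForwardDiff  # backend package, assumed to be installed

backend = AutoForwardDiff()
f(x) = sum(abs2, x)
x = rand(10)

# Prepare once (potentially costly), then reuse `prep` many times.
prep = prepare_gradient(f, backend, x)
grad = gradient(f, prep, backend, x)

# Different-point reuse: `other_x` has the same type and shape as `x`,
# so the same `prep` stays valid.
other_x = rand(10)
other_grad = gradient(f, prep, backend, other_x)
```

Same-point preparation follows the same pattern through `prepare_op_same_point`, for operators that additionally take tangents.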
DifferentiationInterface/ext/DifferentiationInterfaceChainRulesCoreExt/DifferentiationInterfaceChainRulesCoreExt.jl
DifferentiationInterface/ext/DifferentiationInterfaceDiffractorExt/DifferentiationInterfaceDiffractorExt.jl