
perf: preallocate Cache during preparation for ForwardDiff #741

Merged
gdalle merged 5 commits into main from gd/prep_cache on Mar 14, 2025
Conversation

@gdalle
Member

@gdalle gdalle commented Mar 14, 2025

  • Add a contexts_dual field to most preparation objects in the ForwardDiff extension
  • Initialize this field with a tuple of the same size as contexts, except that
    • non-Cache contexts are replaced by nothing
    • Cache contexts are replaced by similar(c, Dual{...})
  • During prepared execution, take the Cache contexts from prep.contexts_dual
  • Create Configs with f=nothing because the tag captures the relevant type
  • Only take the no-preparation shortcuts when there are no Cache contexts
  • Temporarily deactivate the (Polyester)ForwardDiff HVP optimizations; they must be reactivated before the next release
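The preallocation step described above can be sketched as follows. This is an illustrative standalone example, not the actual DifferentiationInterface internals: the `Cache` struct and `prepare_contexts_dual` function are hypothetical names, and only the `Dual`/`Tag` types come from ForwardDiff. The idea is that each `Cache` context gets a `Dual`-typed twin at preparation time, so prepared calls can reuse it instead of allocating:

```julia
using ForwardDiff: Dual, Tag

# Hypothetical stand-in for DifferentiationInterface's Cache context
struct Cache{C}
    data::C
end

# Build the contexts_dual tuple: Cache contexts are replaced by a Cache
# holding similar(c, Dual{...}) storage, all other contexts by nothing.
function prepare_contexts_dual(::Type{T}, contexts...) where {T<:Dual}
    map(contexts) do c
        c isa Cache ? Cache(similar(c.data, T)) : nothing
    end
end

# Example: one Cache context and one plain (constant) context
T = Dual{Tag{Nothing,Float64},Float64,3}
contexts_dual = prepare_contexts_dual(T, Cache(zeros(3)), 42)
# contexts_dual[1] holds Dual-valued storage; contexts_dual[2] === nothing
```

During prepared execution, the `Cache` entries of this tuple would then be picked up in place of the user-provided caches, which is what removes the runtime allocations.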

@codecov

codecov Bot commented Mar 14, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 97.47%. Comparing base (ff95f72) to head (045b56c).
Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #741      +/-   ##
==========================================
- Coverage   97.94%   97.47%   -0.47%     
==========================================
  Files         124      124              
  Lines        6510     6546      +36     
==========================================
+ Hits         6376     6381       +5     
- Misses        134      165      +31     
Flag Coverage Δ
DI 98.26% <100.00%> (-0.69%) ⬇️
DIT 95.74% <ø> (ø)

Flags with carried forward coverage won't be shown.



@gdalle gdalle marked this pull request as ready for review March 14, 2025 13:09
@franckgaga
Contributor

Yes, the Cache context does not allocate at runtime with AutoForwardDiff:

using BenchmarkTools
using DifferentiationInterface
import ForwardDiff

x = float.(1:8)
y = Vector{eltype(x)}(undef, length(x) - 1)
cache = zeros(eltype(x), length(y))

# In-place function with an external buffer passed as a Cache context
function f!(y, x, cache)
    y2 = cache
    for i in eachindex(y)
        y[i] = x[i+1]^2 - x[i]^2
    end
    for i in eachindex(y)
        j = length(x) - i
        y[i] += x[j]^2 - x[j+1]^2
    end
    y2 .= y  # write to the cache so it is actually exercised
    return y
end

dense_backend = AutoForwardDiff()
dense_prep = prepare_jacobian(f!, y, dense_backend, x, Cache(cache))
∇f = Matrix{eltype(x)}(undef, length(y), length(x))
jacobian!(f!, y, ∇f, dense_prep, dense_backend, x, Cache(cache))
display(∇f)
@benchmark jacobian!($f!, $y, $∇f, $dense_prep, $dense_backend, $x, Cache($cache))

giving:

7×8 Matrix{Float64}:
 -2.0   4.0   0.0   0.0    0.0    0.0   14.0  -16.0
  0.0  -4.0   6.0   0.0    0.0   12.0  -14.0    0.0
  0.0   0.0  -6.0   8.0   10.0  -12.0    0.0    0.0
  0.0   0.0   0.0   0.0    0.0    0.0    0.0    0.0
  0.0   0.0   6.0  -8.0  -10.0   12.0    0.0    0.0
  0.0   4.0  -6.0   0.0    0.0  -12.0   14.0    0.0
  2.0  -4.0   0.0   0.0    0.0    0.0  -14.0   16.0
BenchmarkTools.Trial: 10000 samples with 907 evaluations per sample.
 Range (min … max):  108.548 ns … 567.631 ns  ┊ GC (min … max): 0.00% … 0.00%
 Time  (median):     125.157 ns               ┊ GC (median):    0.00%
 Time  (mean ± σ):   135.268 ns ±  33.886 ns  ┊ GC (mean ± σ):  0.00% ± 0.00%

  ▄▅▆▇▆█▆▆▆▄▅▅▄▃▃▂▁▁                                            ▂
  █████████████████████▆▇▅▇▆▆▅▆▅▅▅▅▄▆▆▆▇▇████▇██▆▆▆▅▅▄▅▅▆▅▃▆▅▄▃ █
  109 ns        Histogram: log(frequency) by time        289 ns <

 Memory estimate: 0 bytes, allocs estimate: 0.

Also allocation-free with AutoFiniteDiff.
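For reference, the AutoFiniteDiff check follows the same pattern as the benchmark above; only the backend changes. This is a sketch that reuses `f!`, `y`, `∇f`, `x`, and `cache` from the previous snippet and assumes FiniteDiff is installed alongside DifferentiationInterface:

```julia
import FiniteDiff  # backend package behind AutoFiniteDiff

fd_backend = AutoFiniteDiff()
fd_prep = prepare_jacobian(f!, y, fd_backend, x, Cache(cache))
jacobian!(f!, y, ∇f, fd_prep, fd_backend, x, Cache(cache))
@benchmark jacobian!($f!, $y, $∇f, $fd_prep, $fd_backend, $x, Cache($cache))
```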

Good work!

@gdalle gdalle merged commit 9d2d9ba into main Mar 14, 2025
49 of 50 checks passed
@gdalle gdalle deleted the gd/prep_cache branch March 14, 2025 16:03