From 3db659a6805e6d843ec506ce0f59a2facbadfee3 Mon Sep 17 00:00:00 2001
From: "Documenter.jl"
Date: Wed, 27 Mar 2024 13:47:52 +0000
Subject: [PATCH] build based on 6155b2a

---
 dev/.documenter-siteinfo.json |  2 +-
 dev/api/index.html            | 18 ++++++------
 dev/backends/index.html       |  4 +--
 dev/developer/index.html      |  2 +-
 dev/index.html                |  2 +-
 dev/objects.inv               | Bin 1590 -> 1564 bytes
 dev/overview/index.html       |  4 +--
 dev/search_index.js           |  2 +-
 dev/tutorial/index.html       | 53 ++++++++++++++++++++++------------
 9 files changed, 52 insertions(+), 35 deletions(-)

diff --git a/dev/.documenter-siteinfo.json b/dev/.documenter-siteinfo.json
index c9852295a..ff5c5fb42 100644
--- a/dev/.documenter-siteinfo.json
+++ b/dev/.documenter-siteinfo.json
@@ -1 +1 @@
-{"documenter":{"julia_version":"1.10.2","generation_timestamp":"2024-03-26T20:58:06","documenter_version":"1.3.0"}}
+{"documenter":{"julia_version":"1.10.2","generation_timestamp":"2024-03-27T13:47:49","documenter_version":"1.3.0"}}

diff --git a/dev/api/index.html b/dev/api/index.html
index 1a087dbcb..6f51ac344 100644
--- a/dev/api/index.html
+++ b/dev/api/index.html

API reference · DifferentiationInterface.jl

API reference

DifferentiationInterface (Module)

An interface to various automatic differentiation backends in Julia.
source

Derivative

derivative(f, backend, x, [extras]) -> der
derivative!!(f, der, backend, x, [extras]) -> der
value_and_derivative(f, backend, x, [extras]) -> (y, der)
value_and_derivative!!(f, der, backend, x, [extras]) -> (y, der)
value_and_derivative!!(f!, y, der, backend, x, [extras]) -> (y, der)

Gradient

gradient(f, backend, x, [extras]) -> grad
gradient!!(f, grad, backend, x, [extras]) -> grad
value_and_gradient(f, backend, x, [extras]) -> (y, grad)
value_and_gradient!!(f, grad, backend, x, [extras]) -> (y, grad)

Jacobian

jacobian(f, backend, x, [extras]) -> jac
jacobian!!(f, jac, backend, x, [extras]) -> jac
value_and_jacobian(f, backend, x, [extras]) -> (y, jac)
value_and_jacobian!!(f, jac, backend, x, [extras]) -> (y, jac)
value_and_jacobian!!(f!, y, jac, backend, x, [extras]) -> (y, jac)

Second order

DifferentiationInterface.SecondOrder (Type)
SecondOrder

Combination of two backends for second-order differentiation.

Fields

  • outer::ADTypes.AbstractADType: backend for the outer differentiation

  • inner::ADTypes.AbstractADType: backend for the inner differentiation

source

second_derivative(f, backend, x, [extras]) -> der2
hvp(f, backend, x, v, [extras]) -> p
hessian(f, backend, x, [extras]) -> hess
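
For example, a reverse-mode inner backend can be nested inside a forward-mode outer backend. A minimal sketch, assuming ForwardDiff.jl and Zygote.jl are installed and that the constructor takes the fields positionally (outer first, as listed above):

using ADTypes, DifferentiationInterface
import ForwardDiff, Zygote

f(x) = sum(abs2, x)
backend = SecondOrder(AutoForwardDiff(), AutoZygote())  # outer = forward, inner = reverse
hess = hessian(f, backend, [1.0, 2.0])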

Primitives

pushforward(f, backend, x, dx, [extras]) -> (y, dy)
pushforward!!(f, dy, backend, x, dx, [extras]) -> (y, dy)
value_and_pushforward(f, backend, x, dx, [extras]) -> (y, dy)
value_and_pushforward!!(f, dy, backend, x, dx, [extras]) -> (y, dy)
value_and_pushforward!!(f!, y, dy, backend, x, dx, [extras]) -> (y, dy)

pullback(f, backend, x, dy, [extras]) -> dx
pullback!!(f, dx, backend, x, dy, [extras]) -> dx
value_and_pullback(f, backend, x, dy, [extras]) -> (y, dx)
value_and_pullback!!(f, dx, backend, x, dy, [extras]) -> (y, dx)
value_and_pullback!!(f!, y, dx, backend, x, dy, [extras]) -> (y, dx)

Backend queries

check_available(backend): check whether backend is available by trying a scalar-to-scalar derivative.
check_hessian(backend): check whether backend supports second-order differentiation by trying a hessian.
check_mutation(backend): check whether backend supports differentiation of mutating functions by trying a jacobian.

Preparation

prepare_derivative(f, backend, x) -> extras
prepare_gradient(f, backend, x) -> extras
prepare_jacobian(f, backend, x) -> extras
prepare_second_derivative(f, backend, x) -> extras
prepare_hessian(f, backend, x) -> extras
prepare_hvp(f, backend, x) -> extras
prepare_pushforward(f, backend, x) -> extras
prepare_pullback(f, backend, x) -> extras

Testing & benchmarking

DifferentiationInterfaceTest.BenchmarkData (Type)
BenchmarkData

Ad-hoc storage type for differentiation benchmarking results. You can turn it into a DataFrame as follows:

df = DataFrames.DataFrame(pairs(benchmark_data)...)

Fields

These are not part of the public API.

  • backend::Vector{String}

  • mode::Vector{Type}

  • operator::Vector{Function}

  • variant::Vector{Function}

  • func::Vector{String}

  • mutating::Vector{Bool}

  • input_type::Vector{Type}

  • output_type::Vector{Type}

  • input_size::Vector

  • output_size::Vector

  • samples::Vector{Int64}

  • time::Vector{Float64}

  • bytes::Vector{Float64}

  • allocs::Vector{Float64}

  • compile_fraction::Vector{Float64}

  • gc_fraction::Vector{Float64}

  • evals::Vector{Float64}

source
DifferentiationInterfaceTest.Scenario (Type)
Scenario{mutating}

Store a testing scenario composed of a function and its input + output + tangents.

Fields

  • f::Any: function

  • x::Any: input

  • y::Any: output

  • dx::Any: pushforward seed

  • dy::Any: pullback seed

  • ref::Any: reference to compare against. It can be either an ADTypes.AbstractADType object or a Reference containing the correct operators associated with f

source
DifferentiationInterfaceTest.test_differentiation (Function)
test_differentiation(backends, [operators, scenarios]; [kwargs...])

Test a list of backends for a list of operators on a list of scenarios.

Default arguments

  • operators::Vector{Function}: the list [pushforward, pullback, derivative, gradient, jacobian, second_derivative, hvp, hessian]
  • scenarios::Vector{Scenario}: the output of default_scenarios()

Keyword arguments

Testing:

  • correctness=true: whether to compare the differentiation results with the theoretical values specified in each scenario. If a backend object like correctness=AutoForwardDiff() is passed instead of a boolean, the results will be compared using that reference backend as the ground truth.
  • call_count=false: whether to check that the function is called the right number of times
  • type_stability=false: whether to check type stability with JET.jl (thanks to @test_opt)
  • detailed=false: whether to print a detailed or condensed test log

Filtering:

  • input_type=Any: restrict scenario inputs to subtypes of this
  • output_type=Any: restrict scenario outputs to subtypes of this
  • allocating=true: consider operators for allocating functions
  • mutating=true: consider operators for mutating functions
  • first_order=true: consider first order operators
  • second_order=true: consider second order operators
  • excluded=Symbol[]: list of excluded operators

Options:

  • logging=true: whether to log progress
  • isapprox=isapprox: function used to compare objects, only needs to be set for complicated cases beyond arrays / scalars
  • rtol=1e-3: precision for correctness testing (when comparing to the reference outputs)
source
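
For illustration, a hedged sketch of a typical call, assuming ForwardDiff.jl and Zygote.jl are installed and using only keyword arguments documented above:

using ADTypes, DifferentiationInterface, DifferentiationInterfaceTest
import ForwardDiff, Zygote

test_differentiation(
    [AutoForwardDiff(), AutoZygote()];  # backends to test against each other
    correctness=true,                   # compare results to the scenario references
    type_stability=false,               # skip JET.jl checks
    second_order=false,                 # restrict to first-order operators
)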

Internals

This is not part of the public API.

DifferentiationInterface.basisarray (Method)
basisarray(backend, a::AbstractArray, i::CartesianIndex)

Construct the i-th standard basis array in the vector space of a with element type eltype(a).

Note

If an AD backend benefits from a more specialized basis array implementation, this function can be extended on the backend type.

source
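
For intuition, the default behavior can be pictured with the following plain-Julia sketch (not the actual implementation):

a = zeros(2, 3)
i = CartesianIndex(1, 2)
e = zero(a)
e[i] = one(eltype(a))  # e now holds the i-th standard basis array for a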
DifferentiationInterface.mode (Method)
mode(backend)

Return the AD mode of a backend in a statically predictable way.

The return value is a Type object chosen among:

  • ADTypes.AbstractForwardMode
  • ADTypes.AbstractFiniteDifferencesMode
  • ADTypes.AbstractReverseMode
  • ADTypes.AbstractSymbolicDifferentiationMode

This function exists because there are backends (like Enzyme) that can support both forward and reverse mode, which means their ADTypes.jl object does not subtype either.

source
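
For instance, qualifying the internal function by its module (a hedged sketch, assuming ForwardDiff.jl and Zygote.jl are loaded):

using ADTypes
import DifferentiationInterface as DI
import ForwardDiff, Zygote

DI.mode(AutoForwardDiff())  # ADTypes.AbstractForwardMode
DI.mode(AutoZygote())       # ADTypes.AbstractReverseMode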
DifferentiationInterfaceTest.Reference (Type)
Reference

Store the ground truth operators for a Scenario.

Fields

  • pushforward::Any: function (x, dx) -> pf

  • pullback::Any: function (x, dy) -> pb

  • derivative::Any: function x -> der

  • gradient::Any: function x -> grad

  • jacobian::Any: function x -> jac

  • second_derivative::Any: function x -> der2

  • hvp::Any: function (x, v) -> p

  • hessian::Any: function x -> hess

source
diff --git a/dev/backends/index.html b/dev/backends/index.html
index c12f6d612..0a2783365 100644
--- a/dev/backends/index.html
+++ b/dev/backends/index.html

Backends · DifferentiationInterface.jl

Backends

Types

Most backend choices are defined by ADTypes.jl.

Warning

Only the backends listed here are supported by DifferentiationInterface.jl, even though ADTypes.jl defines more.

ADTypes.AutoEnzymeType
AutoEnzyme(Enzyme.Forward)
AutoEnzyme(Enzyme.Reverse)

Construct a forward or reverse mode AutoEnzyme backend.

source
AutoEnzyme{M}

Chooses Enzyme.jl.

Fields

  • mode::M = nothing
source
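
For instance, a minimal sketch assuming Enzyme.jl is installed:

using ADTypes, DifferentiationInterface
import Enzyme

f(x) = sum(abs2, x)
grad = gradient(f, AutoEnzyme(Enzyme.Reverse), [1.0, 2.0])   # reverse mode
der = derivative(x -> x^2, AutoEnzyme(Enzyme.Forward), 1.0)  # forward mode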

We also provide a few of our own:

Availability

You can use check_available to verify whether a given backend is loaded, like we did below:

Backend | available
Diffractor (forward)
Enzyme (forward)
Enzyme (reverse)
FastDifferentiation (symbolic)
FiniteDiff (finite)
FiniteDifferences (finite)
ForwardDiff (forward)
PolyesterForwardDiff (forward)
ReverseDiff (reverse)
Tracker (reverse)
Zygote (reverse)
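
In code, the same check is a single call per backend (a hedged sketch, assuming ForwardDiff.jl is installed):

using ADTypes, DifferentiationInterface
import ForwardDiff

check_available(AutoForwardDiff())  # true once the corresponding package is loaded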

Mutation support

All backends are compatible with allocating functions f(x) = y. Only some are compatible with mutating functions f!(y, x) = nothing. You can use check_mutation to check that feature, like we did below:

Backend | mutation
Diffractor (forward)
Enzyme (forward)
Enzyme (reverse)
FastDifferentiation (symbolic)
FiniteDiff (finite)
FiniteDifferences (finite)
ForwardDiff (forward)
PolyesterForwardDiff (forward)
ReverseDiff (reverse)
Tracker (reverse)
Zygote (reverse)
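
The corresponding check is analogous (a hedged sketch, assuming ForwardDiff.jl is installed):

using ADTypes, DifferentiationInterface
import ForwardDiff

check_mutation(AutoForwardDiff())  # true if mutating functions f!(y, x) can be differentiated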

Package extensions

Backend-specific extension content is not part of the public API.

diff --git a/dev/developer/index.html b/dev/developer/index.html
index 467a4c0c0..0931f40bc 100644

diff --git a/dev/index.html b/dev/index.html
index 2523c7442..09b8a107b 100644

diff --git a/dev/objects.inv b/dev/objects.inv
index 164c4d0377ddefca22d074441938347b8e5ffc6d..f989d363f432aac526337588ebdfb8ed6f000145 100644
GIT binary patch (Bin 1590 -> 1564 bytes)

diff --git a/dev/overview/index.html b/dev/overview/index.html
index d0d3dc71a..268dccbf9 100644

Overview · DifferentiationInterface.jl

Overview

Operators

Depending on the type of input and output, differentiation operators can have various names. Most backends have custom implementations, which we reuse if possible.

We choose the following terminology for the high-level operators we provide:

operator   | input x       | output y      | result type    | result shape
derivative | Number        | Any           | same as y      | size(y)
gradient   | Any           | Number        | same as x      | size(x)
jacobian   | AbstractArray | AbstractArray | AbstractMatrix | (length(y), length(x))
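
As a concrete illustration of this table (a minimal sketch, assuming ForwardDiff.jl is installed):

using ADTypes, DifferentiationInterface
import ForwardDiff

backend = AutoForwardDiff()
derivative(x -> x^2, backend, 3.0)                # Number in, Number out: 6.0
gradient(x -> sum(abs2, x), backend, [1.0, 2.0])  # array in, Number out: [2.0, 4.0]
jacobian(x -> x .^ 2, backend, [1.0, 2.0])        # array in, array out: 2×2 diagonal matrix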

They are all based on the following low-level operators:

  • pushforward (or JVP), to propagate input tangents
  • pullback (or VJP), to backpropagate output cotangents
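
In code, the two primitives look as follows (a hedged sketch, following the signatures from the API reference and assuming ForwardDiff.jl and Zygote.jl are installed):

using ADTypes, DifferentiationInterface
import ForwardDiff, Zygote

f(x) = x .^ 2
x = [1.0, 2.0]
y, dy = value_and_pushforward(f, AutoForwardDiff(), x, [1.0, 0.0])  # JVP along the tangent dx
y, dx = value_and_pullback(f, AutoZygote(), x, [0.0, 1.0])          # VJP along the cotangent dy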
Tip

See the book The Elements of Differentiable Programming for details on these concepts.

Variants

Several variants of each operator are defined:

out-of-place | in-place (or not) | out-of-place + primal | in-place (or not) + primal
derivative   | derivative!!      | value_and_derivative  | value_and_derivative!!
gradient     | gradient!!        | value_and_gradient    | value_and_gradient!!
jacobian     | jacobian!!        | value_and_jacobian    | value_and_jacobian!!
pushforward  | pushforward!!     | value_and_pushforward | value_and_pushforward!!
pullback     | pullback!!        | value_and_pullback    | value_and_pullback!!
Warning

The "bang-bang" syntactic convention !! signals that some of the arguments can be mutated, but they do not have to be. Such arguments will always be part of the return, so that one can simply reuse the operator's output and forget its input.

In other words, this is good:

grad = gradient!!(f, grad, backend, x)  # do this

On the other hand, this is bad, because if grad has not been mutated, you will get wrong results:

gradient!!(f, grad, backend, x)  # don't do this
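
The pattern extends naturally to repeated calls, where the buffer is allocated once and the result reassigned every time (a minimal sketch, assuming f, backend and x are defined):

grad = similar(x)
for _ in 1:100
    grad = gradient!!(f, grad, backend, x)  # always reassign: the buffer may or may not have been mutated
end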

Second order

Second-order differentiation is also supported, with the following operators:

operator          | input x       | output y | result type    | result shape
second_derivative | Number        | Any      | same as y      | size(y)
hvp               | Any           | Number   | same as x      | size(x)
hessian           | AbstractArray | Number   | AbstractMatrix | (length(x), length(x))
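
A hedged sketch combining these operators with a SecondOrder backend (assuming ForwardDiff.jl and Zygote.jl are installed):

using ADTypes, DifferentiationInterface
import ForwardDiff, Zygote

f(x) = sum(abs2, x)
backend = SecondOrder(AutoForwardDiff(), AutoZygote())
p = hvp(f, backend, [1.0, 2.0], [1.0, 0.0])  # Hessian-vector product
H = hessian(f, backend, [1.0, 2.0])          # full Hessian matrix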
Danger

This is an experimental functionality, use at your own risk.

Preparation

In many cases, automatic differentiation can be accelerated if the function has been run at least once (e.g. to record a tape) and if some cache objects are provided. This is a backend-specific procedure, but we expose a common syntax to achieve it.

operator          | preparation function
derivative        | prepare_derivative
gradient          | prepare_gradient
jacobian          | prepare_jacobian
second_derivative | prepare_second_derivative
hessian           | prepare_hessian
pushforward       | prepare_pushforward
pullback          | prepare_pullback
hvp               | prepare_hvp

If you run prepare_operator(f, backend, x), it will create an object called extras containing the necessary information to speed up operator and its variants. This information is specific to backend and f, as well as the type and size of the input x, but it should work with different values of x.

You can then call operator(f, backend, similar_x, extras), which should be faster than operator(f, backend, similar_x). This is especially worth it if you plan to call operator several times in similar settings: you can think of it as a warm-up.
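
Putting the two steps together (a hedged sketch, using the argument order from the API reference docstrings and assuming ForwardDiff.jl is installed):

using ADTypes, DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)
backend = AutoForwardDiff()
x = rand(10)
extras = prepare_gradient(f, backend, x)
grad = gradient(f, backend, x, extras)         # uses the precomputed extras
grad = gradient(f, backend, rand(10), extras)  # same input type and size, new values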

By default, all the preparation functions return nothing. We do not make any guarantees on their implementation for each backend, or on the performance gains that can be expected.

Warning

We haven't fully figured out what must happen when an extras object is prepared for a specific operator but then given to a lower-level one (i.e. prepare it for jacobian but then give it to pushforward inside jacobian).

Multiple inputs/outputs

Restricting the API to one input and one output has many coding advantages, but it is not very flexible. If you need more than that, use ComponentArrays.jl to wrap several objects inside a single ComponentVector.
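
A hedged sketch of that pattern, assuming ComponentArrays.jl and ForwardDiff.jl are installed:

using ADTypes, ComponentArrays, DifferentiationInterface
import ForwardDiff

f(x) = x.a^2 + sum(abs2, x.b)             # two logical inputs packed into one vector
x = ComponentVector(a=1.0, b=[2.0, 3.0])
grad = gradient(f, AutoForwardDiff(), x)  # gradient with the same component structure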

diff --git a/dev/search_index.js b/dev/search_index.js
index 303211ec2..594ad0bca 100644
'✓' : '✗') |\" # hide\nend # hide\nMarkdown.parse(join(vcat(header, subheader, rows...), \"\\n\")) # hide","category":"page"},{"location":"backends/#Package-extensions","page":"Backends","title":"Package extensions","text":"","category":"section"},{"location":"backends/","page":"Backends","title":"Backends","text":"CurrentModule = DifferentiationInterface","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"Backend-specific extension content is not part of the public API.","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"Modules = [\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceChainRulesCoreExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceDiffractorExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceEnzymeExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceFastDifferentiationExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceFiniteDiffExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceFiniteDifferencesExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceForwardDiffExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfacePolyesterForwardDiffExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceReverseDiffExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceTrackerExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceZygoteExt)\n]\nFilter = t -> !(t isa Type && t <: ADTypes.AbstractADType)","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"","category":"page"},{"location":"","page":"Home","title":"Home","text":"EditURL = \"https://github.com/gdalle/DifferentiationInterface.jl/blob/main/README.md\"","category":"page"},{"location":"#DifferentiationInterface","page":"Home","title":"DifferentiationInterface","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"(Image: Dev) (Image: Build Status) (Image: Coverage) (Image: Code Style: Blue)","category":"page"},{"location":"","page":"Home","title":"Home","text":"An interface to various automatic differentiation backends in Julia.","category":"page"},{"location":"#Goal","page":"Home","title":"Goal","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"This package provides a backend-agnostic syntax to differentiate functions of the following types:","category":"page"},{"location":"","page":"Home","title":"Home","text":"allocating: f(x) = y\nmutating: f!(y, x) = nothing","category":"page"},{"location":"#Features","page":"Home","title":"Features","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"First and second order operators\nIn-place and out-of-place differentiation\nPreparation mechanism (e.g. 
to create a config or tape)\nCross-backend testing and benchmarking utilities\nThorough validation on standard inputs and outputs (scalars, vectors, matrices)","category":"page"},{"location":"#Compatibility","page":"Home","title":"Compatibility","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"We support most of the backends defined by ADTypes.jl:","category":"page"},{"location":"","page":"Home","title":"Home","text":"backend object\nChainRulesCore.jl AutoChainRules(ruleconfig)\nDiffractor.jl AutoDiffractor()\nEnzyme.jl AutoEnzyme(Enzyme.Forward) or AutoEnzyme(Enzyme.Reverse)\nFiniteDiff.jl AutoFiniteDiff()\nFiniteDifferences.jl AutoFiniteDifferences(fdm)\nForwardDiff.jl AutoForwardDiff()\nPolyesterForwardDiff.jl AutoPolyesterForwardDiff(; chunksize)\nReverseDiff.jl AutoReverseDiff()\nTracker.jl AutoTracker()\nZygote.jl AutoZygote()","category":"page"},{"location":"","page":"Home","title":"Home","text":"We also provide one additional backend:","category":"page"},{"location":"","page":"Home","title":"Home","text":"backend object\nFastDifferentiation.jl AutoFastDifferentiation()","category":"page"},{"location":"#Example","page":"Home","title":"Example","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"julia> import ADTypes, ForwardDiff\n\njulia> using DifferentiationInterface\n\njulia> backend = ADTypes.AutoForwardDiff();\n\njulia> f(x) = sum(abs2, x);\n\njulia> value_and_gradient(f, backend, [1., 2., 3.])\n(14.0, [2.0, 4.0, 6.0])","category":"page"},{"location":"#Related-packages","page":"Home","title":"Related packages","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"AbstractDifferentiation.jl is the original inspiration for DifferentiationInterface.jl.\nAutoDiffOperators.jl is an attempt to bridge ADTypes.jl with AbstractDifferentiation.jl.","category":"page"},{"location":"","page":"Home","title":"Home","text":"","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"CurrentModule = Main","category":"page"},{"location":"tutorial/#Tutorial","page":"Tutorial","title":"Tutorial","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"We present a typical workflow with DifferentiationInterface.jl and showcase its potential performance benefits.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"using ADTypes, BenchmarkTools, DifferentiationInterface\nimport ForwardDiff, Enzyme, DataFrames","category":"page"},{"location":"tutorial/#Computing-a-gradient","page":"Tutorial","title":"Computing a gradient","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"A common use case of automatic differentiation (AD) is optimizing real-valued functions with first- or second-order methods. Let's define a simple objective","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"f(x::AbstractArray) = sum(abs2, x)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"and a sample input vector","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"x = [1.0, 2.0, 3.0]","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"To compute its gradient, we need to choose a \"backend\", i.e. an AD package that DifferentiationInterface.jl will call under the hood. 
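A quick aside before we pick a backend: the mutating case from the Goal section above goes through the same interface. Here is a minimal sketch, with a hypothetical function f! and assuming the chosen backend supports mutating functions (see the Backends page):

```julia
using ADTypes, DifferentiationInterface
import ForwardDiff

f!(y, x) = (y .= 2 .* x; nothing)   # hypothetical mutating function
x = [1.0, 2.0, 3.0]
y = zeros(3)
jac = zeros(3, 3)

# mutating variant: y and jac may be overwritten, and are returned either way
y, jac = value_and_jacobian!!(f!, y, jac, AutoForwardDiff(), x)
```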
ForwardDiff.jl is very efficient for low-dimensional inputs, so we'll go with that one. Backend types are defined and exported by ADTypes.jl:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"backend = AutoForwardDiff()","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Now we can use DifferentiationInterface.jl to get our gradient:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"gradient(f, backend, x)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Was that fast? We can use BenchmarkTools.jl to answer that question.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"@btime gradient($f, $backend, $x);","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"This is more or less what you would get if you used the API from ForwardDiff.jl directly:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"@btime ForwardDiff.gradient($f, $x);","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Not bad, but we can do better.","category":"page"},{"location":"tutorial/#Overwriting-a-gradient","page":"Tutorial","title":"Overwriting a gradient","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Since we know how much space our gradient will occupy, we can pre-allocate that memory and offer it to AD. Some backends can get a speed boost from this trick.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"grad = zero(x)\ngrad = gradient!!(f, grad, backend, x)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Note the double exclamation mark, which is a convention telling you that grad may or may not be overwritten, but will be returned either way (see this section for more details).","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"@btime gradient!!($f, _grad, $backend, $x) evals=1 setup=(_grad=similar($x));","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Surprisingly, the in-place version is slower than our first attempt here, but as you can see it performs one fewer allocation, corresponding to the gradient vector. Don't worry, we're not done yet.","category":"page"},{"location":"tutorial/#Preparing-for-multiple-gradients","page":"Tutorial","title":"Preparing for multiple gradients","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Internally, ForwardDiff.jl creates some data structures to keep track of intermediate computations. These objects can be reused between gradient computations, even on different input values. We abstract away the preparation step behind a backend-agnostic syntax:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"extras = prepare_gradient(f, backend, x)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"You don't need to know what that is; you just need to pass it to the gradient operator.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"grad = zero(x);\ngrad = gradient!!(f, grad, backend, x, extras)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Why, you ask? 
Because it is much faster and allocation-free.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"@btime gradient!!($f, _grad, $backend, $x, _extras) evals=1 setup=(\n _grad=similar($x);\n _extras=prepare_gradient($f, $backend, $x)\n);","category":"page"},{"location":"tutorial/#Switching-backends","page":"Tutorial","title":"Switching backends","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Now the whole point of DifferentiationInterface.jl is that you can easily experiment with different AD solutions. For gradients, reverse mode AD is typically a better fit, especially in high dimension. So let's try the state-of-the-art Enzyme.jl!","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"For this one, the backend definition is slightly more involved, because you need to feed the \"mode\" to the object from ADTypes.jl:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"backend2 = AutoEnzyme(Enzyme.Reverse)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"But once it is done, things run smoothly with exactly the same syntax:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"gradient(f, backend2, x)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"And we can run the same benchmarks:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"@btime gradient!!($f, _grad, $backend2, $x, _extras) evals=1 setup=(\n _grad=similar($x);\n _extras=prepare_gradient($f, $backend2, $x)\n);","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Have you seen this? It's blazingly fast. And you know what's even better? You didn't need to look at the docs of either ForwardDiff.jl or Enzyme.jl to achieve top performance with both, or to compare them.","category":"page"},{"location":"tutorial/#Testing-and-benchmarking","page":"Tutorial","title":"Testing and benchmarking","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"DifferentiationInterface.jl also provides some utilities for more involved comparison between backends. 
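Before moving on to those utilities, here is the workflow of this tutorial condensed into one self-contained sketch (assuming both ForwardDiff.jl and Enzyme.jl are installed):

```julia
using ADTypes, DifferentiationInterface
import Enzyme, ForwardDiff

f(x) = sum(abs2, x)
x = [1.0, 2.0, 3.0]

for backend in (AutoForwardDiff(), AutoEnzyme(Enzyme.Reverse))
    extras = prepare_gradient(f, backend, x)        # prepare once per backend
    grad = similar(x)
    grad = gradient!!(f, grad, backend, x, extras)  # reuse the output, as advised
    @show grad
end
```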
They are gathered in a submodule called DifferentiationInterfaceTest.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"using DifferentiationInterfaceTest","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"The main entry point is test_differentiation, which is used as follows:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"data = test_differentiation(\n [AutoForwardDiff(), AutoEnzyme(Enzyme.Reverse)], # backends to compare\n [gradient], # operators to try\n [Scenario(f; x=x)]; # test scenario\n correctness=AutoZygote(), # compare results to a \"ground truth\" from Zygote\n benchmark=true, # measure runtime and allocations too\n detailed=true, # print detailed test set\n);","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"The output of test_differentiation when benchmark=true can be converted to a DataFrame from DataFrames.jl:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"df = DataFrames.DataFrame(pairs(data)...)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Here's what the resulting DataFrame looks like with all its columns. Note that the results may be slightly different from the ones presented above (we use Chairmarks.jl internally instead of BenchmarkTools.jl, and measure slightly different operators).","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"import Markdown, PrettyTables # hide\nMarkdown.parse(PrettyTables.pretty_table(String, df; backend=Val(:markdown), header=names(df))) # hide","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"","category":"page"}] +[{"location":"api/","page":"API reference","title":"API reference","text":"CurrentModule = Main\nCollapsedDocStrings = true","category":"page"},{"location":"api/#API-reference","page":"API reference","title":"API reference","text":"","category":"section"},{"location":"api/","page":"API reference","title":"API reference","text":"DifferentiationInterface","category":"page"},{"location":"api/#DifferentiationInterface","page":"API reference","title":"DifferentiationInterface","text":"DifferentiationInterface\n\nAn interface to various automatic differentiation backends in Julia.\n\nExports\n\nAutoFastDifferentiation\nSecondOrder\ncheck_available\ncheck_hessian\ncheck_mutation\nderivative\nderivative!!\ngradient\ngradient!!\nhessian\nhvp\njacobian\njacobian!!\nprepare_derivative\nprepare_gradient\nprepare_hessian\nprepare_hvp\nprepare_jacobian\nprepare_pullback\nprepare_pushforward\nprepare_second_derivative\npullback\npullback!!\npushforward\npushforward!!\nsecond_derivative\nvalue_and_derivative\nvalue_and_derivative!!\nvalue_and_gradient\nvalue_and_gradient!!\nvalue_and_jacobian\nvalue_and_jacobian!!\nvalue_and_pullback\nvalue_and_pullback!!\nvalue_and_pushforward\nvalue_and_pushforward!!\n\n\n\n\n\n","category":"module"},{"location":"api/#Derivative","page":"API reference","title":"Derivative","text":"","category":"section"},{"location":"api/","page":"API reference","title":"API reference","text":"Modules = [DifferentiationInterface]\nPages = [\"src/derivative.jl\"]","category":"page"},{"location":"api/#DifferentiationInterface.derivative","page":"API reference","title":"DifferentiationInterface.derivative","text":"derivative(f, backend, x, [extras]) -> 
der\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.derivative!!","page":"API reference","title":"DifferentiationInterface.derivative!!","text":"derivative!!(f, der, backend, x, [extras]) -> der\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_derivative","page":"API reference","title":"DifferentiationInterface.value_and_derivative","text":"value_and_derivative(f, backend, x, [extras]) -> (y, der)\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_derivative!!","page":"API reference","title":"DifferentiationInterface.value_and_derivative!!","text":"value_and_derivative!!(f, der, backend, x, [extras]) -> (y, der)\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_derivative!!-2","page":"API reference","title":"DifferentiationInterface.value_and_derivative!!","text":"value_and_derivative!!(f!, y, der, backend, x, [extras]) -> (y, der)\n\n\n\n\n\n","category":"function"},{"location":"api/#Gradient","page":"API reference","title":"Gradient","text":"","category":"section"},{"location":"api/","page":"API reference","title":"API reference","text":"Modules = [DifferentiationInterface]\nPages = [\"gradient.jl\"]","category":"page"},{"location":"api/#DifferentiationInterface.gradient","page":"API reference","title":"DifferentiationInterface.gradient","text":"gradient(f, backend, x, [extras]) -> grad\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.gradient!!","page":"API reference","title":"DifferentiationInterface.gradient!!","text":"gradient!!(f, grad, backend, x, [extras]) -> grad\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_gradient","page":"API reference","title":"DifferentiationInterface.value_and_gradient","text":"value_and_gradient(f, backend, x, [extras]) -> (y, grad)\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_gradient!!","page":"API reference","title":"DifferentiationInterface.value_and_gradient!!","text":"value_and_gradient!!(f, grad, backend, x, [extras]) -> (y, grad)\n\n\n\n\n\n","category":"function"},{"location":"api/#Jacobian","page":"API reference","title":"Jacobian","text":"","category":"section"},{"location":"api/","page":"API reference","title":"API reference","text":"Modules = [DifferentiationInterface]\nPages = [\"jacobian.jl\"]","category":"page"},{"location":"api/#DifferentiationInterface.jacobian","page":"API reference","title":"DifferentiationInterface.jacobian","text":"jacobian(f, backend, x, [extras]) -> jac\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.jacobian!!","page":"API reference","title":"DifferentiationInterface.jacobian!!","text":"jacobian!!(f, jac, backend, x, [extras]) -> jac\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_jacobian","page":"API reference","title":"DifferentiationInterface.value_and_jacobian","text":"value_and_jacobian(f, backend, x, [extras]) -> (y, jac)\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_jacobian!!","page":"API reference","title":"DifferentiationInterface.value_and_jacobian!!","text":"value_and_jacobian!!(f, jac, backend, x, [extras]) -> (y, jac)\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_jacobian!!-2","page":"API 
reference","title":"DifferentiationInterface.value_and_jacobian!!","text":"value_and_jacobian!!(f!, y, jac, backend, x, [extras]) -> (y, jac)\n\n\n\n\n\n","category":"function"},{"location":"api/#Second-order","page":"API reference","title":"Second order","text":"","category":"section"},{"location":"api/","page":"API reference","title":"API reference","text":"Modules = [DifferentiationInterface]\nPages = [\"second_order.jl\", \"second_derivative.jl\", \"hessian.jl\", \"hvp.jl\"]","category":"page"},{"location":"api/#DifferentiationInterface.SecondOrder","page":"API reference","title":"DifferentiationInterface.SecondOrder","text":"SecondOrder\n\nCombination of two backends for second-order differentiation.\n\nFields\n\nouter::ADTypes.AbstractADType: backend for the outer differentiation\ninner::ADTypes.AbstractADType: backend for the inner differentiation\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterface.second_derivative","page":"API reference","title":"DifferentiationInterface.second_derivative","text":"second_derivative(f, backend, x, [extras]) -> der2\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.hessian","page":"API reference","title":"DifferentiationInterface.hessian","text":"hessian(f, backend, x, [extras]) -> hess\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.hvp","page":"API reference","title":"DifferentiationInterface.hvp","text":"hvp(f, backend, x, v, [extras]) -> p\n\n\n\n\n\n","category":"function"},{"location":"api/#Primitives","page":"API reference","title":"Primitives","text":"","category":"section"},{"location":"api/","page":"API reference","title":"API reference","text":"Modules = [DifferentiationInterface]\nPages = [\"pushforward.jl\", \"pullback.jl\"]","category":"page"},{"location":"api/#DifferentiationInterface.pushforward","page":"API reference","title":"DifferentiationInterface.pushforward","text":"pushforward(f, backend, x, dx, [extras]) -> (y, dy)\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.pushforward!!","page":"API reference","title":"DifferentiationInterface.pushforward!!","text":"pushforward!!(f, dy, backend, x, dx, [extras]) -> (y, dy)\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_pushforward","page":"API reference","title":"DifferentiationInterface.value_and_pushforward","text":"value_and_pushforward(f, backend, x, dx, [extras]) -> (y, dy)\n\ninfo: Info\nRequired primitive for forward mode backends to support allocating functions.\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_pushforward!!","page":"API reference","title":"DifferentiationInterface.value_and_pushforward!!","text":"value_and_pushforward!!(f!, y, dy, backend, x, dx, [extras]) -> (y, dy)\n\ninfo: Info\nRequired primitive for forward mode backends to support mutating functions.\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_pushforward!!-2","page":"API reference","title":"DifferentiationInterface.value_and_pushforward!!","text":"value_and_pushforward!!(f, dy, backend, x, dx, [extras]) -> (y, dy)\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.pullback","page":"API reference","title":"DifferentiationInterface.pullback","text":"pullback(f, backend, x, dy, [extras]) -> dx\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.pullback!!","page":"API 
reference","title":"DifferentiationInterface.pullback!!","text":"pullback!!(f, dx, backend, x, dy, [extras]) -> dx\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_pullback","page":"API reference","title":"DifferentiationInterface.value_and_pullback","text":"value_and_pullback(f, backend, x, dy, [extras]) -> (y, dx)\n\ninfo: Info\nRequired primitive for reverse mode backends to support allocating functions.\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_pullback!!","page":"API reference","title":"DifferentiationInterface.value_and_pullback!!","text":"value_and_pullback!!(f!, y, dx, backend, x, dy, [extras]) -> (y, dx)\n\ninfo: Info\nRequired primitive for reverse mode backends to support mutating functions.\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterface.value_and_pullback!!-2","page":"API reference","title":"DifferentiationInterface.value_and_pullback!!","text":"value_and_pullback!!(f, dx, backend, x, dy, [extras]) -> (y, dx)\n\n\n\n\n\n","category":"function"},{"location":"api/#Backend-queries","page":"API reference","title":"Backend queries","text":"","category":"section"},{"location":"api/","page":"API reference","title":"API reference","text":"Modules = [DifferentiationInterface]\nPages = [\"backends.jl\"]","category":"page"},{"location":"api/#DifferentiationInterface.check_available-Tuple{ADTypes.AbstractADType}","page":"API reference","title":"DifferentiationInterface.check_available","text":"check_available(backend)\n\nCheck whether backend is available by trying a scalar-to-scalar derivative.\n\nwarning: Warning\nMight take a while due to compilation time.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.check_hessian-Tuple{ADTypes.AbstractADType}","page":"API reference","title":"DifferentiationInterface.check_hessian","text":"check_hessian(backend)\n\nCheck whether backend supports second order differentiation by trying a hessian.\n\nwarning: Warning\nMight take a while due to compilation time.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.check_mutation-Tuple{ADTypes.AbstractADType}","page":"API reference","title":"DifferentiationInterface.check_mutation","text":"check_mutation(backend)\n\nCheck whether backend supports differentiation of mutating functions by trying a jacobian.\n\nwarning: Warning\nMight take a while due to compilation time.\n\n\n\n\n\n","category":"method"},{"location":"api/#Preparation","page":"API reference","title":"Preparation","text":"","category":"section"},{"location":"api/","page":"API reference","title":"API reference","text":"Modules = [DifferentiationInterface]\nPages = [\"prepare.jl\"]","category":"page"},{"location":"api/#DifferentiationInterface.prepare_derivative-Tuple{Any, ADTypes.AbstractADType, Any}","page":"API reference","title":"DifferentiationInterface.prepare_derivative","text":"prepare_derivative(f, backend, x) -> extras\nprepare_derivative(f!, backend, y, x) -> extras\n\nCreate an extras object that can be given to derivative operators.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.prepare_gradient-Tuple{Any, ADTypes.AbstractADType, Any}","page":"API reference","title":"DifferentiationInterface.prepare_gradient","text":"prepare_gradient(f, backend, x) -> extras\n\nCreate an extras object that can be given to gradient operators.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.prepare_hessian-Tuple{Any, 
ADTypes.AbstractADType, Any}","page":"API reference","title":"DifferentiationInterface.prepare_hessian","text":"prepare_hessian(f, backend, x) -> extras\n\nCreate an extras object that can be given to Hessian operators.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.prepare_hvp-Tuple{Any, ADTypes.AbstractADType, Any}","page":"API reference","title":"DifferentiationInterface.prepare_hvp","text":"prepare_hvp(f, backend, x) -> extras\n\nCreate an extras object that can be given to Hessian-vector product operators.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.prepare_jacobian-Tuple{Any, ADTypes.AbstractADType, Any}","page":"API reference","title":"DifferentiationInterface.prepare_jacobian","text":"prepare_jacobian(f, backend, x) -> extras\nprepare_jacobian(f!, backend, y, x) -> extras\n\nCreate an extras object that can be given to Jacobian operators.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.prepare_pullback-Tuple{Any, ADTypes.AbstractADType, Any}","page":"API reference","title":"DifferentiationInterface.prepare_pullback","text":"prepare_pullback(f, backend, x) -> extras\nprepare_pullback(f!, backend, y, x) -> extras\n\nCreate an extras object that can be given to pullback operators.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.prepare_pushforward-Tuple{Any, ADTypes.AbstractADType, Any}","page":"API reference","title":"DifferentiationInterface.prepare_pushforward","text":"prepare_pushforward(f, backend, x) -> extras\nprepare_pushforward(f!, backend, y, x) -> extras\n\nCreate an extras object that can be given to pushforward operators.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.prepare_second_derivative-Tuple{Any, ADTypes.AbstractADType, Any}","page":"API reference","title":"DifferentiationInterface.prepare_second_derivative","text":"prepare_second_derivative(f, backend, x) -> extras\nprepare_second_derivative(f!, backend, y, x) -> extras\n\nCreate an extras object that can be given to second derivative operators.\n\n\n\n\n\n","category":"method"},{"location":"api/#Testing-and-benchmarking","page":"API reference","title":"Testing & benchmarking","text":"","category":"section"},{"location":"api/","page":"API reference","title":"API reference","text":"Modules = [DifferentiationInterfaceTest]\nPrivate = false","category":"page"},{"location":"api/#DifferentiationInterfaceTest.DifferentiationInterfaceTest","page":"API reference","title":"DifferentiationInterfaceTest.DifferentiationInterfaceTest","text":"DifferentiationInterfaceTest\n\nTesting utilities for DifferentiationInterface.\n\n\n\n\n\n","category":"module"},{"location":"api/#DifferentiationInterfaceTest.BenchmarkData","page":"API reference","title":"DifferentiationInterfaceTest.BenchmarkData","text":"BenchmarkData\n\nAd-hoc storage type for differentiation benchmarking results. 
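To tie the preparation functions above to the second-order operators, here is a hedged sketch. The pairing of ForwardDiff.jl over Zygote.jl, the constructor argument order of SecondOrder, and the seed vector are illustrative assumptions, and the experimental caveat from the overview applies:

```julia
using ADTypes, DifferentiationInterface
import ForwardDiff, Zygote

f(x) = sum(abs2, x)
x = [1.0, 2.0, 3.0]
v = [1.0, 0.0, 0.0]

# assumption: the default constructor takes the fields in order (outer, inner)
backend = SecondOrder(AutoForwardDiff(), AutoZygote())

extras = prepare_hvp(f, backend, x)
p = hvp(f, backend, x, v, extras)   # Hessian of f at x, multiplied by v
```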
You can turn it into a DataFrame as follows:\n\ndf = DataFrames.DataFrame(pairs(benchmark_data)...)\n\nFields\n\nThese are not part of the public API.\n\nbackend::Vector{String}\nmode::Vector{Type}\noperator::Vector{Function}\nvariant::Vector{Function}\nfunc::Vector{String}\nmutating::Vector{Bool}\ninput_type::Vector{Type}\noutput_type::Vector{Type}\ninput_size::Vector\noutput_size::Vector\nsamples::Vector{Int64}\ntime::Vector{Float64}\nbytes::Vector{Float64}\nallocs::Vector{Float64}\ncompile_fraction::Vector{Float64}\ngc_fraction::Vector{Float64}\nevals::Vector{Float64}\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterfaceTest.Scenario","page":"API reference","title":"DifferentiationInterfaceTest.Scenario","text":"Scenario{mutating}\n\nStore a testing scenario composed of a function and its input + output + tangents.\n\nFields\n\nf::Any: function\nx::Any: input\ny::Any: output\ndx::Any: pushforward seed\ndy::Any: pullback seed\nref::Any: reference to compare against. It can be either an ADTypes.AbstractADType object or a Reference containing the correct operators associated with f\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterfaceTest.all_operators-Tuple{}","page":"API reference","title":"DifferentiationInterfaceTest.all_operators","text":"all_operators()\n\nList all operators that can be tested with test_differentiation.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterfaceTest.benchmark_differentiation","page":"API reference","title":"DifferentiationInterfaceTest.benchmark_differentiation","text":"benchmark_differentiation(backends, [operators, scenarios]; [kwargs...])\n\nBenchmark a list of backends for a list of operators on a list of scenarios.\n\nKeyword arguments\n\nfiltering: same as test_differentiation for the filtering part.\nlogging=true: whether to log progress\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterfaceTest.default_scenarios-Tuple{}","page":"API reference","title":"DifferentiationInterfaceTest.default_scenarios","text":"default_scenarios()\n\nCreate a vector of Scenarios for testing differentiation.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterfaceTest.test_differentiation","page":"API reference","title":"DifferentiationInterfaceTest.test_differentiation","text":"test_differentiation(backends, [operators, scenarios]; [kwargs...])\n\nTest a list of backends for a list of operators on a list of scenarios.\n\nDefault arguments\n\noperators::Vector{Function}: the list [pushforward, pullback, derivative, gradient, jacobian, second_derivative, hvp, hessian]\nscenarios::Vector{Scenario}: the output of default_scenarios()\n\nKeyword arguments\n\nTesting:\n\ncorrectness=true: whether to compare the differentiation results with the theoretical values specified in each scenario. If a backend object like correctness=AutoForwardDiff() is passed instead of a boolean, the results will be compared using that reference backend as the ground truth. 
\ncall_count=false: whether to check that the function is called the right number of times\ntype_stability=false: whether to check type stability with JET.jl (thanks to @test_opt)\ndetailed=false: whether to print a detailed or condensed test log\n\nFiltering:\n\ninput_type=Any: restrict scenario inputs to subtypes of this\noutput_type=Any: restrict scenario outputs to subtypes of this\nallocating=true: consider operators for allocating functions\nmutating=true: consider operators for mutating functions\nfirst_order=true: consider first order operators\nsecond_order=true: consider second order operators\nexcluded=Symbol[]: list of excluded operators\n\nOptions:\n\nlogging=true: whether to log progress\nisapprox=isapprox: function used to compare objects, only needs to be set for complicated cases beyond arrays / scalars\nrtol=1e-3: precision for correctness testing (when comparing to the reference outputs)\n\n\n\n\n\n","category":"function"},{"location":"api/#DifferentiationInterfaceTest.test_differentiation-Tuple{ADTypes.AbstractADType, Vararg{Any}}","page":"API reference","title":"DifferentiationInterfaceTest.test_differentiation","text":"test_differentiation(\n    backend::ADTypes.AbstractADType,\n    args...;\n    kwargs...\n)\n\n\nShortcut for a single backend.\n\n\n\n\n\n","category":"method"},{"location":"api/#Internals","page":"API reference","title":"Internals","text":"","category":"section"},{"location":"api/","page":"API reference","title":"API reference","text":"This is not part of the public API.","category":"page"},{"location":"api/","page":"API reference","title":"API reference","text":"Modules = [DifferentiationInterface]\nPublic = false\nOrder = [:function, :type]\nFilter = t -> !(t isa Type && t <: ADTypes.AbstractADType)","category":"page"},{"location":"api/#DifferentiationInterface.basis-Tuple{ADTypes.AbstractADType, AbstractArray, Any}","page":"API reference","title":"DifferentiationInterface.basis","text":"basis(backend, a::AbstractArray, i::CartesianIndex)\n\nConstruct the i-th standard basis array in the vector space of a with element type eltype(a).\n\nNote\n\nIf an AD backend benefits from a more specialized basis array implementation, this function can be extended on the backend type.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.mode-Tuple{ADTypes.AbstractForwardMode}","page":"API reference","title":"DifferentiationInterface.mode","text":"mode(backend)\n\nReturn the AD mode of a backend in a statically predictable way.\n\nThe return value is a Type object chosen among:\n\nADTypes.AbstractForwardMode\nADTypes.AbstractFiniteDifferencesMode\nADTypes.AbstractReverseMode\nADTypes.AbstractSymbolicDifferentiationMode\n\nThis function exists because there are backends (like Enzyme) that can support both forward and reverse mode, which means their ADTypes.jl object does not subtype either.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.pullback_performance-Tuple{ADTypes.AbstractADType}","page":"API reference","title":"DifferentiationInterface.pullback_performance","text":"pullback_performance(backend)\n\nReturn PullbackFast or PullbackSlow in a statically predictable way.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.pushforward_performance-Tuple{ADTypes.AbstractADType}","page":"API reference","title":"DifferentiationInterface.pushforward_performance","text":"pushforward_performance(backend)\n\nReturn PushforwardFast or PushforwardSlow in a statically predictable 
way.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.supports_mutation-Tuple{ADTypes.AbstractADType}","page":"API reference","title":"DifferentiationInterface.supports_mutation","text":"supports_mutation(backend)\n\nReturn MutationSupported or MutationNotSupported in a statically predictable way.\n\n\n\n\n\n","category":"method"},{"location":"api/#DifferentiationInterface.ForwardOverForward","page":"API reference","title":"DifferentiationInterface.ForwardOverForward","text":"ForwardOverForward\n\nTrait identifying second-order backends that compute HVPs in forward over forward mode (inefficient).\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterface.ForwardOverReverse","page":"API reference","title":"DifferentiationInterface.ForwardOverReverse","text":"ForwardOverReverse\n\nTrait identifying second-order backends that compute HVPs in forward over reverse mode.\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterface.MutationNotSupported","page":"API reference","title":"DifferentiationInterface.MutationNotSupported","text":"MutationNotSupported\n\nTrait identifying backends that do not support mutating functions f!(y, x).\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterface.MutationSupported","page":"API reference","title":"DifferentiationInterface.MutationSupported","text":"MutationSupported\n\nTrait identifying backends that support mutating functions f!(y, x).\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterface.PullbackFast","page":"API reference","title":"DifferentiationInterface.PullbackFast","text":"PullbackFast\n\nTrait identifying backends that support efficient pullbacks.\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterface.PullbackSlow","page":"API reference","title":"DifferentiationInterface.PullbackSlow","text":"PullbackSlow\n\nTrait identifying backends that do not support efficient pullbacks.\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterface.PushforwardFast","page":"API reference","title":"DifferentiationInterface.PushforwardFast","text":"PushforwardFast\n\nTrait identifying backends that support efficient pushforwards.\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterface.PushforwardSlow","page":"API reference","title":"DifferentiationInterface.PushforwardSlow","text":"PushforwardSlow\n\nTrait identifying backends that do not support efficient pushforwards.\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterface.ReverseOverForward","page":"API reference","title":"DifferentiationInterface.ReverseOverForward","text":"ReverseOverForward\n\nTrait identifying second-order backends that compute HVPs in reverse over forward mode.\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterface.ReverseOverReverse","page":"API reference","title":"DifferentiationInterface.ReverseOverReverse","text":"ReverseOverReverse\n\nTrait identifying second-order backends that compute HVPs in reverse over reverse mode.\n\n\n\n\n\n","category":"type"},{"location":"api/","page":"API reference","title":"API reference","text":"Modules = [DifferentiationInterfaceTest]\nPublic = false","category":"page"},{"location":"api/#DifferentiationInterfaceTest.AutoZeroForward","page":"API reference","title":"DifferentiationInterfaceTest.AutoZeroForward","text":"AutoZeroForward <: ADTypes.AbstractForwardMode\n\nTrivial backend that sets all derivatives to zero. 
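As a usage sketch for the benchmarking entry point documented above; that benchmark_differentiation returns a BenchmarkData is an assumption inferred from the companion docstring:

```julia
using ADTypes, DifferentiationInterfaceTest
import DataFrames, ForwardDiff

# benchmark one backend on the default scenarios, then tabulate the results
data = benchmark_differentiation([AutoForwardDiff()])
df = DataFrames.DataFrame(pairs(data)...)
```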
Used in testing and benchmarking.\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterfaceTest.AutoZeroReverse","page":"API reference","title":"DifferentiationInterfaceTest.AutoZeroReverse","text":"AutoZeroReverse <: ADTypes.AbstractReverseMode\n\nTrivial backend that sets all derivatives to zero. Used in testing and benchmarking.\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterfaceTest.Reference","page":"API reference","title":"DifferentiationInterfaceTest.Reference","text":"Reference\n\nStore the ground truth operators for a Scenario.\n\nFields\n\npushforward::Any: function (x, dx) -> pf\npullback::Any: function (x, dy) -> pb\nderivative::Any: function x -> der\ngradient::Any: function x -> grad\njacobian::Any: function x -> jac\nsecond_derivative::Any: function x -> der2\nhvp::Any: function (x, v) -> p\nhessian::Any: function x -> hess\n\n\n\n\n\n","category":"type"},{"location":"api/#DifferentiationInterfaceTest.backend_string-Tuple{ADTypes.AbstractADType}","page":"API reference","title":"DifferentiationInterfaceTest.backend_string","text":"backend_string(backend)\n\nReturn a shorter string than the full object printing from ADTypes.jl. Might be ambiguous.\n\n\n\n\n\n","category":"method"},{"location":"api/","page":"API reference","title":"API reference","text":"","category":"page"},{"location":"developer/#For-AD-developers","page":"For AD developers","title":"For AD developers","text":"","category":"section"},{"location":"developer/#Backend-requirements","page":"For AD developers","title":"Backend requirements","text":"","category":"section"},{"location":"developer/","page":"For AD developers","title":"For AD developers","text":"To be usable with DifferentiationInterface.jl, an AD backend needs an object subtyping ADTypes.AbstractADType. In addition, some operators must be defined:","category":"page"},{"location":"developer/","page":"For AD developers","title":"For AD developers","text":"backend subtype pushforward necessary pullback necessary\nADTypes.AbstractForwardMode yes no\nADTypes.AbstractFiniteDifferencesMode yes no\nADTypes.AbstractReverseMode no yes\nADTypes.AbstractSymbolicDifferentiationMode yes yes","category":"page"},{"location":"developer/","page":"For AD developers","title":"For AD developers","text":"Every backend we support corresponds to a package extension of DifferentiationInterface.jl (located in the ext subfolder). Advanced users are welcome to code more backends and submit pull requests!","category":"page"},{"location":"developer/#Fallback-call-structure","page":"For AD developers","title":"Fallback call structure","text":"","category":"section"},{"location":"developer/","page":"For AD developers","title":"For AD developers","text":"For simplicity, we remove value_ in the operator names below.","category":"page"},{"location":"developer/","page":"For AD developers","title":"For AD developers","text":"note: Edge labels\nFull edges in the following graphs require a single call to the destination. Dotted edges require multiple calls to the destination; the number of calls is indicated above the edge.","category":"page"},{"location":"developer/#Forward-mode,-allocating-functions","page":"For AD developers","title":"Forward mode, allocating functions","text":"","category":"section"},{"location":"developer/","page":"For AD developers","title":"For AD developers","text":"flowchart LR\n    pushforward!! --> pushforward\n    derivative --> pushforward\n    derivative!! --> pushforward!!\n    gradient .-> |n|pushforward\n    gradient!! 
.-> |n|pushforward!!\n jacobian .-> |n|pushforward\n jacobian!! .-> |n|pushforward!!","category":"page"},{"location":"developer/#Reverse-mode,-allocating-functions","page":"For AD developers","title":"Reverse mode, allocating functions","text":"","category":"section"},{"location":"developer/","page":"For AD developers","title":"For AD developers","text":"flowchart LR\n pullback!! --> pullback\n derivative .-> |m|pullback\n derivative!! .-> |m|pullback!!\n gradient --> pullback\n gradient!! --> pullback!!\n jacobian .-> |m|pullback\n jacobian!! .-> |m|pullback!!","category":"page"},{"location":"developer/","page":"For AD developers","title":"For AD developers","text":"","category":"page"},{"location":"overview/#Overview","page":"Overview","title":"Overview","text":"","category":"section"},{"location":"overview/#Operators","page":"Overview","title":"Operators","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"Depending on the type of input and output, differentiation operators can have various names. Most backends have custom implementations, which we reuse if possible.","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"We choose the following terminology for the high-level operators we provide:","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"operator input x output y result type result shape\nderivative Number Any same as y size(y)\ngradient Any Number same as x size(x)\njacobian AbstractArray AbstractArray AbstractMatrix (length(y), length(x))","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"They are all based on the following low-level operators:","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"pushforward (or JVP), to propagate input tangents\npullback (or VJP), to backpropagate output cotangents","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"tip: Tip\nSee the book The Elements of Differentiable Programming for details on these concepts.","category":"page"},{"location":"overview/#Variants","page":"Overview","title":"Variants","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"Several variants of each operator are defined:","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"out-of-place in-place (or not) out-of-place + primal in-place (or not) + primal\nderivative derivative!! value_and_derivative value_and_derivative!!\ngradient gradient!! value_and_gradient value_and_gradient!!\njacobian jacobian!! value_and_jacobian value_and_jacobian!!\npushforward pushforward!! value_and_pushforward value_and_pushforward!!\npullback pullback!! value_and_pullback value_and_pullback!!","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"warning: Warning\nThe \"bang-bang\" syntactic convention !! signals that some of the arguments can be mutated, but they do not have to be. 
Such arguments will always be part of the return, so that one can simply reuse the operator's output and forget its input. In other words, this is good: grad = gradient!!(f, grad, backend, x) # do this. On the other hand, this is bad, because if grad has not been mutated, you will get wrong results: gradient!!(f, grad, backend, x) # don't do this","category":"page"},{"location":"overview/#Second-order","page":"Overview","title":"Second order","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"Second-order differentiation is also supported, with the following operators:","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"operator input x output y result type result shape\nsecond_derivative Number Any same as y size(y)\nhvp Any Number same as x size(x)\nhessian AbstractArray Number AbstractMatrix (length(x), length(x))","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"danger: Danger\nThis is experimental functionality; use it at your own risk.","category":"page"},{"location":"overview/#Preparation","page":"Overview","title":"Preparation","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"In many cases, automatic differentiation can be accelerated if the function has been run at least once (e.g. to record a tape) and if some cache objects are provided. This is a backend-specific procedure, but we expose a common syntax to achieve it.","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"operator preparation function\nderivative prepare_derivative\ngradient prepare_gradient\njacobian prepare_jacobian\nsecond_derivative prepare_second_derivative\nhessian prepare_hessian\npushforward prepare_pushforward\npullback prepare_pullback\nhvp prepare_hvp","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"If you run prepare_operator(f, backend, x), it will create an object called extras containing the necessary information to speed up operator and its variants. This information is specific to backend and f, as well as the type and size of the input x, but it should work with different values of x.","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"You can then call operator(f, backend, similar_x, extras), which should be faster than operator(f, backend, similar_x). This is especially worth it if you plan to call operator several times in similar settings: you can think of it as a warm-up.","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"By default, all the preparation functions return nothing. We do not make any guarantees on their implementation for each backend, or on the performance gains that can be expected.","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"warning: Warning\nWe haven't fully figured out what must happen when an extras object is prepared for a specific operator but then given to a lower-level one (i.e. prepare it for jacobian but then give it to pushforward inside jacobian).","category":"page"},{"location":"overview/#Multiple-inputs/outputs","page":"Overview","title":"Multiple inputs/outputs","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"Restricting the API to one input and one output has many coding advantages, but it is not very flexible. 
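One escape hatch, spelled out just below, is to bundle several conceptual inputs into a single array-like object. Here is a hedged sketch with ComponentArrays.jl, where the field names and the backend choice are illustrative assumptions:

```julia
using ADTypes, ComponentArrays, DifferentiationInterface
import ForwardDiff

# hypothetical: two conceptual inputs a and b wrapped into one ComponentVector
f(v) = sum(abs2, v.a) + sum(v.b)
v = ComponentVector(a = [1.0, 2.0], b = [3.0])

grad = gradient(f, AutoForwardDiff(), v)   # same shape as v
```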
If you need more than that, use ComponentArrays.jl to wrap several objects inside a single ComponentVector.","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"CurrentModule = Main\nCollapsedDocStrings = true","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"using ADTypes, DifferentiationInterface\nusing DifferentiationInterfaceTest: backend_string\nimport Markdown\nimport Enzyme, FastDifferentiation, FiniteDiff, FiniteDifferences, ForwardDiff, PolyesterForwardDiff, ReverseDiff, Tracker, Zygote\n\nfunction all_backends()\n return [\n AutoDiffractor(),\n AutoEnzyme(Enzyme.Forward),\n AutoEnzyme(Enzyme.Reverse),\n AutoFastDifferentiation(),\n AutoFiniteDiff(),\n AutoFiniteDifferences(FiniteDifferences.central_fdm(5, 1)),\n AutoForwardDiff(),\n AutoPolyesterForwardDiff(; chunksize=2),\n AutoReverseDiff(),\n AutoTracker(),\n AutoZygote(),\n ]\nend\n\nfunction all_backends_without_enzyme()\n return filter(all_backends()) do b\n !isa(b, AutoEnzyme)\n end\nend","category":"page"},{"location":"backends/#Backends","page":"Backends","title":"Backends","text":"","category":"section"},{"location":"backends/#Types","page":"Backends","title":"Types","text":"","category":"section"},{"location":"backends/","page":"Backends","title":"Backends","text":"Most backend choices are defined by ADTypes.jl.","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"warning: Warning\nOnly the backends listed here are supported by DifferentiationInterface.jl, even though ADTypes.jl defines more.","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"AutoChainRules\nAutoDiffractor\nAutoEnzyme\nAutoForwardDiff\nAutoForwardDiff()\nAutoFiniteDiff\nAutoFiniteDifferences\nAutoPolyesterForwardDiff\nAutoPolyesterForwardDiff()\nAutoReverseDiff\nAutoTracker\nAutoZygote","category":"page"},{"location":"backends/#ADTypes.AutoChainRules","page":"Backends","title":"ADTypes.AutoChainRules","text":"AutoChainRules{RC}\n\nChooses any AD library based on ChainRulesCore.jl, given an appropriate RuleConfig object.\n\nFields\n\nruleconfig::RC\n\n\n\n\n\n","category":"type"},{"location":"backends/#ADTypes.AutoDiffractor","page":"Backends","title":"ADTypes.AutoDiffractor","text":"AutoDiffractor\n\nChooses Diffractor.jl.\n\n\n\n\n\n","category":"type"},{"location":"backends/#ADTypes.AutoEnzyme","page":"Backends","title":"ADTypes.AutoEnzyme","text":"AutoEnzyme(Enzyme.Forward)\nAutoEnzyme(Enzyme.Reverse)\n\nConstruct a forward or reverse mode AutoEnzyme backend.\n\n\n\n\n\nAutoEnzyme{M}\n\nChooses Enzyme.jl.\n\nFields\n\nmode::M = nothing\n\n\n\n\n\n","category":"type"},{"location":"backends/#ADTypes.AutoForwardDiff","page":"Backends","title":"ADTypes.AutoForwardDiff","text":"AutoForwardDiff{chunksize,T}\n\nChooses ForwardDiff.jl.\n\nFields\n\ntag::T\n\n\n\n\n\n","category":"type"},{"location":"backends/#ADTypes.AutoForwardDiff-Tuple{}","page":"Backends","title":"ADTypes.AutoForwardDiff","text":"AutoForwardDiff(; chunksize = nothing, tag = nothing)\n\nConstructor.\n\n\n\n\n\n","category":"method"},{"location":"backends/#ADTypes.AutoFiniteDiff","page":"Backends","title":"ADTypes.AutoFiniteDiff","text":"AutoFiniteDiff{T1,T2,T3}\n\nChooses FiniteDiff.jl.\n\nFields\n\nfdtype::T1 = Val(:forward)\nfdjtype::T2 = fdtype\nfdhtype::T3 = 
Val(:hcentral)\n\n\n\n\n\n","category":"type"},{"location":"backends/#ADTypes.AutoFiniteDifferences","page":"Backends","title":"ADTypes.AutoFiniteDifferences","text":"AutoFiniteDifferences{T}\n\nChooses FiniteDifferences.jl.\n\nFields\n\nfdm::T = nothing\n\n\n\n\n\n","category":"type"},{"location":"backends/#ADTypes.AutoPolyesterForwardDiff","page":"Backends","title":"ADTypes.AutoPolyesterForwardDiff","text":"AutoPolyesterForwardDiff{chunksize}\n\nChooses PolyesterForwardDiff.jl.\n\n\n\n\n\n","category":"type"},{"location":"backends/#ADTypes.AutoPolyesterForwardDiff-Tuple{}","page":"Backends","title":"ADTypes.AutoPolyesterForwardDiff","text":"AutoPolyesterForwardDiff(; chunksize = nothing)\n\nConstructor.\n\n\n\n\n\n","category":"method"},{"location":"backends/#ADTypes.AutoReverseDiff","page":"Backends","title":"ADTypes.AutoReverseDiff","text":"AutoReverseDiff\n\nChooses ReverseDiff.jl.\n\nFields\n\ncompile::Bool = false\n\n\n\n\n\n","category":"type"},{"location":"backends/#ADTypes.AutoTracker","page":"Backends","title":"ADTypes.AutoTracker","text":"AutoTracker\n\nChooses Tracker.jl.\n\n\n\n\n\n","category":"type"},{"location":"backends/#ADTypes.AutoZygote","page":"Backends","title":"ADTypes.AutoZygote","text":"AutoZygote\n\nChooses Zygote.jl.\n\n\n\n\n\n","category":"type"},{"location":"backends/","page":"Backends","title":"Backends","text":"We also provide a few of our own:","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"AutoFastDifferentiation","category":"page"},{"location":"backends/#DifferentiationInterface.AutoFastDifferentiation","page":"Backends","title":"DifferentiationInterface.AutoFastDifferentiation","text":"AutoFastDifferentiation\n\nChooses FastDifferentiation.jl.\n\n\n\n\n\n","category":"type"},{"location":"backends/#Availability","page":"Backends","title":"Availability","text":"","category":"section"},{"location":"backends/","page":"Backends","title":"Backends","text":"You can use check_available to verify whether a given backend is loaded, like we did below:","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"header = \"| Backend | available |\" # hide\nsubheader = \"|---|---|\" # hide\nrows = map(all_backends()) do backend # hide\n \"| `$(backend_string(backend))` | $(check_available(backend) ? '✓' : '✗') |\" # hide\nend # hide\nMarkdown.parse(join(vcat(header, subheader, rows...), \"\\n\")) # hide","category":"page"},{"location":"backends/#Mutation-support","page":"Backends","title":"Mutation support","text":"","category":"section"},{"location":"backends/","page":"Backends","title":"Backends","text":"All backends are compatible with allocating functions f(x) = y. Only some are compatible with mutating functions f!(y, x) = nothing. You can use check_mutation to check that feature, like we did below:","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"header = \"| Backend | mutation |\" # hide\nsubheader = \"|---|---|\" # hide\nrows = map(all_backends()) do backend # hide\n \"| `$(backend_string(backend))` | $(check_mutation(backend) ? 
'✓' : '✗') |\" # hide\nend # hide\nMarkdown.parse(join(vcat(header, subheader, rows...), \"\\n\")) # hide","category":"page"},{"location":"backends/#Package-extensions","page":"Backends","title":"Package extensions","text":"","category":"section"},{"location":"backends/","page":"Backends","title":"Backends","text":"CurrentModule = DifferentiationInterface","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"Backend-specific extension content is not part of the public API.","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"Modules = [\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceChainRulesCoreExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceDiffractorExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceEnzymeExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceFastDifferentiationExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceFiniteDiffExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceFiniteDifferencesExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceForwardDiffExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfacePolyesterForwardDiffExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceReverseDiffExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceTrackerExt),\n Base.get_extension(DifferentiationInterface, :DifferentiationInterfaceZygoteExt)\n]\nFilter = t -> !(t isa Type && t <: ADTypes.AbstractADType)","category":"page"},{"location":"backends/","page":"Backends","title":"Backends","text":"","category":"page"},{"location":"","page":"Home","title":"Home","text":"EditURL = \"https://github.com/gdalle/DifferentiationInterface.jl/blob/main/README.md\"","category":"page"},{"location":"#DifferentiationInterface","page":"Home","title":"DifferentiationInterface","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"(Image: Dev) (Image: Build Status) (Image: Coverage) (Image: Code Style: Blue)","category":"page"},{"location":"","page":"Home","title":"Home","text":"An interface to various automatic differentiation backends in Julia.","category":"page"},{"location":"#Goal","page":"Home","title":"Goal","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"This package provides a backend-agnostic syntax to differentiate functions of the following types:","category":"page"},{"location":"","page":"Home","title":"Home","text":"allocating: f(x) = y\nmutating: f!(y, x) = nothing","category":"page"},{"location":"#Features","page":"Home","title":"Features","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"First and second order operators\nIn-place and out-of-place differentiation\nPreparation mechanism (e.g. 
to create a config or tape)\nCross-backend testing and benchmarking utilities\nThorough validation on standard inputs and outputs (scalars, vectors, matrices)","category":"page"},{"location":"#Compatibility","page":"Home","title":"Compatibility","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"We support most of the backends defined by ADTypes.jl:","category":"page"},{"location":"","page":"Home","title":"Home","text":"backend object\nChainRulesCore.jl AutoChainRules(ruleconfig)\nDiffractor.jl AutoDiffractor()\nEnzyme.jl AutoEnzyme(Enzyme.Forward) or AutoEnzyme(Enzyme.Reverse)\nFiniteDiff.jl AutoFiniteDiff()\nFiniteDifferences.jl AutoFiniteDifferences(fdm)\nForwardDiff.jl AutoForwardDiff()\nPolyesterForwardDiff.jl AutoPolyesterForwardDiff(; chunksize)\nReverseDiff.jl AutoReverseDiff()\nTracker.jl AutoTracker()\nZygote.jl AutoZygote()","category":"page"},{"location":"","page":"Home","title":"Home","text":"We also provide one additional backend:","category":"page"},{"location":"","page":"Home","title":"Home","text":"backend object\nFastDifferentiation.jl AutoFastDifferentiation()","category":"page"},{"location":"#Example","page":"Home","title":"Example","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"julia> import ADTypes, ForwardDiff\n\njulia> using DifferentiationInterface\n\njulia> backend = ADTypes.AutoForwardDiff();\n\njulia> f(x) = sum(abs2, x);\n\njulia> value_and_gradient(f, backend, [1., 2., 3.])\n(14.0, [2.0, 4.0, 6.0])","category":"page"},{"location":"#Related-packages","page":"Home","title":"Related packages","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"AbstractDifferentiation.jl is the original inspiration for DifferentiationInterface.jl.\nAutoDiffOperators.jl is an attempt to bridge ADTypes.jl with AbstractDifferentiation.jl.","category":"page"},{"location":"","page":"Home","title":"Home","text":"","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"CurrentModule = Main","category":"page"},{"location":"tutorial/#Tutorial","page":"Tutorial","title":"Tutorial","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"We present a typical workflow with DifferentiationInterface.jl and showcase its potential performance benefits.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"using ADTypes, BenchmarkTools, DifferentiationInterface\nimport ForwardDiff, Enzyme, DataFrames","category":"page"},{"location":"tutorial/#Computing-a-gradient","page":"Tutorial","title":"Computing a gradient","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"A common use case of Automatic Differentiation (AD) is optimizing real-valued functions with first- or second-order methods. Let's define a simple objective","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"f(x::AbstractArray) = sum(abs2, x)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"and a random input vector","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"x = [1.0, 2.0, 3.0]","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"To compute its gradient, we need to choose a \"backend\", i.e. an AD package that DifferentiationInterface.jl will call under the hood. 
ForwardDiff.jl is very efficient for low-dimensional inputs, so we'll go with that one. Backend types are defined and exported by ADTypes.jl:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"backend = AutoForwardDiff()","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Now we can use DifferentiationInterface.jl to get our gradient:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"gradient(f, backend, x)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Was that fast? We can use BenchmarkTools.jl to answer that question.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"@btime gradient($f, $backend, $x);","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"More or less what you would get if you just used the API from ForwardDiff.jl:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"@btime ForwardDiff.gradient($f, $x);","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Not bad, but we can do better.","category":"page"},{"location":"tutorial/#Overwriting-a-gradient","page":"Tutorial","title":"Overwriting a gradient","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Since we know how much space our gradient will occupy, we can pre-allocate that memory and offer it to AD. Some backends can get a speed boost from this trick.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"grad = zero(x)\ngrad = gradient!!(f, grad, backend, x)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Note the double exclamation mark, which is a convention telling you that grad may or may not be overwritten, but will be returned either way (see this section for more details).","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"@btime gradient!!($f, _grad, $backend, $x) evals=1 setup=(_grad=similar($x));","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"For some reason the in-place version is not much better than our first attempt. However, as you can see, it has one less allocation: it corresponds to the gradient vector we provided. Don't worry, we're not done yet.","category":"page"},{"location":"tutorial/#Preparing-for-multiple-gradients","page":"Tutorial","title":"Preparing for multiple gradients","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Internally, ForwardDiff.jl creates some data structures to keep track of things. These objects can be reused between gradient computations, even on different input values. We abstract away the preparation step behind a backend-agnostic syntax:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"extras = prepare_gradient(f, backend, x)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"You don't need to know what that is, you just need to pass it to the gradient operator.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"grad = zero(x);\ngrad = gradient!!(f, grad, backend, x, extras)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Why, you ask? 
Because it is much faster, and allocation-free.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"@btime gradient!!($f, _grad, $backend, $x, _extras) evals=1 setup=(\n _grad=similar($x);\n _extras=prepare_gradient($f, $backend, $x)\n);","category":"page"},{"location":"tutorial/#Switching-backends","page":"Tutorial","title":"Switching backends","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Now the whole point of DifferentiationInterface.jl is that you can easily experiment with different AD solutions. Typically, for gradients, reverse mode AD might be a better fit. So let's try the state-of-the-art Enzyme.jl!","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"For this one, the backend definition is slightly more involved, because you need to feed the \"mode\" to the object from ADTypes.jl:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"backend2 = AutoEnzyme(Enzyme.Reverse)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"But once it is done, things run smoothly with exactly the same syntax:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"gradient(f, backend2, x)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"And we can run the same benchmarks:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"@btime gradient!!($f, _grad, $backend2, $x, _extras) evals=1 setup=(\n _grad=similar($x);\n _extras=prepare_gradient($f, $backend2, $x)\n);","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Have you seen this? It's blazingly fast. And you know what's even better? You didn't need to look at the docs of either ForwardDiff.jl or Enzyme.jl to achieve top performance with both, or to compare them.","category":"page"},{"location":"tutorial/#Testing","page":"Tutorial","title":"Testing","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"DifferentiationInterface.jl also provides some utilities for more involved comparison between backends. They are gathered in a submodule called DifferentiationInterfaceTest, located here in the repo.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"using DifferentiationInterfaceTest","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"For testing, you can use test_differentiation as follows:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"test_differentiation(\n [AutoForwardDiff(), AutoEnzyme(Enzyme.Reverse)], # backends to compare\n [gradient, pullback], # operators to try\n [Scenario(f; x=rand(3)), Scenario(f; x=rand(3,3))]; # test scenarios\n correctness=AutoZygote(), # compare results to a \"ground truth\" from Zygote\n detailed=true, # print detailed test set\n);","category":"page"},{"location":"tutorial/#Benchmarking","page":"Tutorial","title":"Benchmarking","text":"","category":"section"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Once you have ascertained correctness, performance will be your next concern. 
The interface of benchmark_differentiation is very similar to the one we've just seen, but this time it returns a data object.","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"data = benchmark_differentiation(\n [AutoForwardDiff(), AutoEnzyme(Enzyme.Reverse)],\n [gradient, pullback],\n [Scenario(f; x=rand(3)), Scenario(f; x=rand(3,3))];\n);","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"The BenchmarkData object is just a struct of vectors, and you can easily convert to a DataFrame from DataFrames.jl:","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"df = DataFrames.DataFrame(pairs(data)...)","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"Here's what the resulting DataFrame looks like with all its columns. Note that the results may vary from the ones presented above (we use Chairmarks.jl internally instead of BenchmarkTools.jl, and measure slightly different operators).","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"import Markdown, PrettyTables # hide\nMarkdown.parse(PrettyTables.pretty_table(String, df; backend=Val(:markdown), header=names(df))) # hide","category":"page"},{"location":"tutorial/","page":"Tutorial","title":"Tutorial","text":"","category":"page"}] } diff --git a/dev/tutorial/index.html b/dev/tutorial/index.html index a0923b451..109750269 100644 --- a/dev/tutorial/index.html +++ b/dev/tutorial/index.html @@ -1,17 +1,17 @@ -Tutorial · DifferentiationInterface.jl

Tutorial · DifferentiationInterface.jl

Tutorial

We present a typical workflow with DifferentiationInterface.jl and showcase its potential performance benefits.

julia> using ADTypes, BenchmarkTools, DifferentiationInterface
julia> import ForwardDiff, Enzyme, DataFrames

Computing a gradient

A common use case of Automatic Differentiation (AD) is optimizing real-valued functions with first- or second-order methods. Let's define a simple objective

julia> f(x::AbstractArray) = sum(abs2, x)
f (generic function with 1 method)

and a random input vector

julia> x = [1.0, 2.0, 3.0]
3-element Vector{Float64}:
  1.0
  2.0
  3.0

To compute its gradient, we need to choose a "backend", i.e. an AD package that DifferentiationInterface.jl will call under the hood. ForwardDiff.jl is very efficient for low-dimensional inputs, so we'll go with that one. Backend types are defined and exported by ADTypes.jl:

julia> backend = AutoForwardDiff()
AutoForwardDiff{nothing, Nothing}(nothing)
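The constructor also takes optional keyword arguments, documented on the backends page; for instance (a sketch, with an arbitrary value rather than a recommendation):

backend_tuned = AutoForwardDiff(; chunksize=2)  # fix the chunk size instead of letting ForwardDiff pick one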

Now we can use DifferentiationInterface.jl to get our gradient:

julia> gradient(f, backend, x)
3-element Vector{Float64}:
  2.0
  4.0
 6.0

Was that fast? We can use BenchmarkTools.jl to answer that question.

julia> @btime gradient($f, $backend, $x);
  468.786 ns (4 allocations: 432 bytes)

More or less what you would get if you just used the API from ForwardDiff.jl:

julia> @btime ForwardDiff.gradient($f, $x);
  1.045 μs (4 allocations: 432 bytes)

Not bad, but we can do better.
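As an aside: if you need the value of f together with its gradient, the interface also provides value_and_gradient, mirroring the example on the home page:

value_and_gradient(f, backend, x)  # returns (14.0, [2.0, 4.0, 6.0]) here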

Overwriting a gradient

Since we know how much space our gradient will occupy, we can pre-allocate that memory and offer it to AD. Some backends can get a speed boost from this trick.

julia> grad = zero(x)
3-element Vector{Float64}:
  0.0
  0.0
  0.0
julia> grad = gradient!!(f, grad, backend, x)
3-element Vector{Float64}:
 2.0
 4.0
 6.0

Note the double exclamation mark: it is a convention telling you that grad may or may not be overwritten, but will be returned either way (see the overview page for more details).
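The safe pattern, spelled out on the overview page, is to always reassign the returned value (a sketch reusing the names above):

grad = gradient!!(f, grad, backend, x)  # do this: keep the returned value
gradient!!(f, grad, backend, x)         # don't do this: grad may not have been mutated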

julia> @btime gradient!!($f, _grad, $backend, $x) evals=1 setup=(_grad=similar($x));
  481.000 ns (3 allocations: 352 bytes)

For some reason the in-place version is not much faster than our first attempt. However, as you can see, it performs one less allocation: the one corresponding to the gradient vector, which we now provide ourselves. Don't worry, we're not done yet.

Preparing for multiple gradients

Internally, ForwardDiff.jl creates some data structures to keep track of things. These objects can be reused between gradient computations, even on different input values. We abstract away the preparation step behind a backend-agnostic syntax:

julia> extras = prepare_gradient(f, backend, x)
ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(Main.f), Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(Main.f), Float64}, Float64, 3}}}((Partials(1.0, 0.0, 0.0), Partials(0.0, 1.0, 0.0), Partials(0.0, 0.0, 1.0)), ForwardDiff.Dual{ForwardDiff.Tag{typeof(Main.f), Float64}, Float64, 3}[Dual{ForwardDiff.Tag{typeof(Main.f), Float64}}(1.0,1.0,0.0,0.0), Dual{ForwardDiff.Tag{typeof(Main.f), Float64}}(2.0,0.0,1.0,0.0), Dual{ForwardDiff.Tag{typeof(Main.f), Float64}}(3.0,0.0,0.0,1.0)])

You don't need to know what that is, you just need to pass it to the gradient operator.

julia> grad = zero(x);
julia> grad = gradient!!(f, grad, backend, x, extras)
3-element Vector{Float64}:
 2.0
 4.0
 6.0
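Since extras was built from the type and size of x rather than its values, it can be reused for other inputs of the same shape; this is where preparation pays off (a hypothetical sketch reusing the names defined above):

grad = similar(x)
for _ in 1:100
    x_new = rand(3)                                     # same type and size as x
    grad = gradient!!(f, grad, backend, x_new, extras)  # reuses the prepared objects
end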

Why, you ask? Because it is much faster, and allocation-free.

julia> @btime gradient!!($f, _grad, $backend, $x, _extras) evals=1 setup=(
@@ -23,30 +23,47 @@
  6.0

And we can run the same benchmarks:

julia> @btime gradient!!($f, _grad, $backend2, $x, _extras) evals=1 setup=(
            _grad=similar($x);
            _extras=prepare_gradient($f, $backend2, $x)
       );
  29.000 ns (0 allocations: 0 bytes)

Have you seen this? It's blazingly fast. And you know what's even better? You didn't need to look at the docs of either ForwardDiff.jl or Enzyme.jl to achieve top performance with both, or to compare them.

Testing

DifferentiationInterface.jl also provides some utilities for more involved comparisons between backends. They are gathered in a submodule called DifferentiationInterfaceTest, located in the same repository.

julia> using DifferentiationInterfaceTest

For testing, you can use test_differentiation as follows:

julia> test_differentiation(
            [AutoForwardDiff(), AutoEnzyme(Enzyme.Reverse)],  # backends to compare
           [gradient, pullback],  # operators to try
           [Scenario(f; x=rand(3)), Scenario(f; x=rand(3,3))];  # test scenarios
            correctness=AutoZygote(),  # compare results to a "ground truth" from Zygote
            detailed=true,  # print detailed test set
       );
Test Summary:                       | Pass  Total   Time
Differentiation tests - correctness |   80     80  10.2s
  ForwardDiff (forward)             |   40     40   4.9s
    gradient                        |   20     20   3.4s
      f: Vector{Float64} -> Float64 |   10     10   2.1s
      f: Matrix{Float64} -> Float64 |   10     10   1.2s
    pullback                        |   20     20   1.6s
      f: Vector{Float64} -> Float64 |   10     10   0.3s
      f: Matrix{Float64} -> Float64 |   10     10   1.2s
  Enzyme (reverse)                  |   40     40   5.3s
    gradient                        |   20     20   4.9s
      f: Vector{Float64} -> Float64 |   10     10   1.7s
      f: Matrix{Float64} -> Float64 |   10     10   3.2s
    pullback                        |   20     20   0.4s
      f: Vector{Float64} -> Float64 |   10     10   0.2s
      f: Matrix{Float64} -> Float64 |   10     10   0.2s

Benchmarking

Once you have ascertained correctness, performance will be your next concern. The interface of benchmark_differentiation is very similar to the one we've just seen, but this time it returns a data object.

julia> data = benchmark_differentiation(
           [AutoForwardDiff(), AutoEnzyme(Enzyme.Reverse)],
           [gradient, pullback],
           [Scenario(f; x=rand(3)), Scenario(f; x=rand(3,3))];
       );

The BenchmarkData object is just a struct of vectors, and you can easily convert it to a DataFrame from DataFrames.jl:

julia> df = DataFrames.DataFrame(pairs(data)...)
8×17 DataFrame
 Row │ backend                mode                 operator  variant           ⋯
     │ String                 Type                 Function  Function          ⋯
─────┼──────────────────────────────────────────────────────────────────────────
   1 │ ForwardDiff (forward)  AbstractForwardMode  gradient  value_and_gradien ⋯
   2 │ ForwardDiff (forward)  AbstractForwardMode  gradient  value_and_gradien
   3 │ ForwardDiff (forward)  AbstractForwardMode  pullback  value_and_pullbac
   4 │ ForwardDiff (forward)  AbstractForwardMode  pullback  value_and_pullbac
   5 │ Enzyme (reverse)       AbstractReverseMode  gradient  value_and_gradien ⋯
   6 │ Enzyme (reverse)       AbstractReverseMode  gradient  value_and_gradien
   7 │ Enzyme (reverse)       AbstractReverseMode  pullback  value_and_pullbac
   8 │ Enzyme (reverse)       AbstractReverseMode  pullback  value_and_pullbac
                                                             14 columns omitted

Here's what the resulting DataFrame looks like with all its columns. Note that the results may vary from the ones presented above (we use Chairmarks.jl internally instead of BenchmarkTools.jl, and measure slightly different operators).

| backend | mode | operator | variant | func | mutating | input_type | output_type | input_size | output_size | samples | time | bytes | allocs | compile_fraction | gc_fraction | evals |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| ForwardDiff (forward) | AbstractForwardMode | gradient | value_and_gradient!! | f | false | Vector{Float64} | Float64 | (3,) | () | 1995 | 3.06649e-8 | 32.0 | 1.0 | 0.0 | 0.0 | 922.0 |
| ForwardDiff (forward) | AbstractForwardMode | gradient | value_and_gradient!! | f | false | Matrix{Float64} | Float64 | (3, 3) | () | 2457 | 1.02336e-7 | 192.0 | 5.0 | 0.0 | 0.0 | 280.0 |
| ForwardDiff (forward) | AbstractForwardMode | pullback | value_and_pullback!! | f | false | Vector{Float64} | Float64 | (3,) | () | 2786 | 1.78479e-7 | 416.0 | 4.0 | 0.0 | 0.0 | 140.0 |
| ForwardDiff (forward) | AbstractForwardMode | pullback | value_and_pullback!! | f | false | Matrix{Float64} | Float64 | (3, 3) | () | 2619 | 7.38237e-7 | 2000.0 | 10.0 | 0.0 | 0.0 | 38.0 |
| Enzyme (reverse) | AbstractReverseMode | gradient | value_and_gradient!! | f | false | Vector{Float64} | Float64 | (3,) | () | 2861 | 6.09271e-7 | 192.0 | 9.0 | 0.0 | 0.0 | 48.0 |
| Enzyme (reverse) | AbstractReverseMode | gradient | value_and_gradient!! | f | false | Matrix{Float64} | Float64 | (3, 3) | () | 2784 | 6.10313e-7 | 192.0 | 9.0 | 0.0 | 0.0 | 48.0 |
| Enzyme (reverse) | AbstractReverseMode | pullback | value_and_pullback!! | f | false | Vector{Float64} | Float64 | (3,) | () | 3093 | 6.16894e-7 | 192.0 | 9.0 | 0.0 | 0.0 | 47.0 |
| Enzyme (reverse) | AbstractReverseMode | pullback | value_and_pullback!! | f | false | Matrix{Float64} | Float64 | (3, 3) | () | 8533 | 6.218e-7 | 192.0 | 9.0 | 0.0 | 0.0 | 15.0 |
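Since df is a regular DataFrame, the usual DataFrames.jl verbs apply afterwards; for example, a hypothetical follow-up (not part of the documented workflow) keeping the fastest run per backend and operator:

best = DataFrames.combine(
    DataFrames.groupby(df, [:backend, :operator]),
    :time => minimum => :best_time,  # smallest measured runtime, in seconds
)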