A minimal engine for automatic differentiation
Cascade is a lightweight C++ library for automatic differentiation and error propagation. It provides a streamlined engine to compute gradients of arbitrary functions and to propagate uncertainties from the inputs to the outputs. The library simplifies the implementation of gradient-based optimization algorithms and of error analysis in scientific computing and engineering applications.
Create a `build` folder in the root directory and `cd` into it. Build the library and the tests executable with:
```
cmake -DCMAKE_BUILD_TYPE=Release ..
cmake --build .
```
The library is built in `build/src` and the tests executable in `build/tests`.
To install the library, run:
```
cmake --build . --target install
```
If you want to install the library in a custom directory, set the install path first:
```
cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=/your/install/path ..
cmake --build . --target install
```
The installation folder has the following structure:
```
/your/install/path
├── bin
│   ├── example_covariances
│   ├── example_derivatives
│   ├── example_optimization
│   └── run_tests
├── include
│   ├── cascade.h
│   ├── functions.h
│   └── var.h
└── lib
    ├── libcascade_shared.so
    └── libcascade_static.a
```
To use Cascade in your project, simply include the `cascade.h` header file and link against the library.
`example_derivatives.cpp`:
```cpp
#include "cascade.h"

#include <iostream>

using namespace cascade;

int main()
{
    Var x = 2.5;
    Var y = 1.2;
    Var z = 3.7;

    Var f = pow(x, 2.0) * sin(y) * exp(x / z);

    // Propagate the derivatives downstream
    f.backprop();

    // Recover the partial derivatives from the leaf nodes
    double fx = x.derivative();
    double fy = y.derivative();
    double fz = z.derivative();

    std::cout << "Value of f: " << f.value() << std::endl;
    std::cout << "Gradient of f: (" << fx << " " << fy << " " << fz << ")" << std::endl;

    return 0;
}
```
Compile, link against the static library, and run:

```
g++ example_derivatives.cpp -o example_derivatives -I/your/install/path/include -L/your/install/path/lib -lcascade_static
./example_derivatives
```

Output:

```
Value of f: 11.4487
Gradient of f: (12.2532 4.45102 -2.09071)
```
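As a sanity check, the gradient of $f(x, y, z) = x^2 \sin(y)\, e^{x/z}$ can be derived by hand:

$$\frac{\partial f}{\partial x} = \left(2x + \frac{x^2}{z}\right) \sin(y)\, e^{x/z}, \qquad \frac{\partial f}{\partial y} = x^2 \cos(y)\, e^{x/z}, \qquad \frac{\partial f}{\partial z} = -\frac{x^3}{z^2} \sin(y)\, e^{x/z}.$$

Evaluating at $(x, y, z) = (2.5, 1.2, 3.7)$ gives approximately $(12.253,\ 4.451,\ -2.091)$, in agreement with the output above.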
`example_covariances.cpp`:
```cpp
#include "cascade.h"

#include <iostream>

using namespace cascade;

int main()
{
    // Create variables by providing their values and standard deviations (default to 0.0)
    Var x = {2.1, 1.5};
    Var y = {-3.5, 2.5};
    Var z = {5.7, 1.4};

    // Set the covariances between them (default to 0.0)
    Var::setCovariance(x, y, 0.5);
    Var::setCovariance(x, z, 1.8);
    Var::setCovariance(y, z, -1.0);

    // Compute a function of the variables
    Var f = (x + y) * cos(z) * x;

    bool changed = Var::setCovariance(f, x, 1.0);
    if (!changed)
    {
        std::cout << "Covariance involving a functional variable cannot be set" << std::endl;
    }

    // Computing covariances involving functional variables triggers backpropagation calls
    std::cout << Var::covariance(f, x) << std::endl;
    std::cout << Var::covariance(f, y) << std::endl;
    std::cout << Var::covariance(f, z) << std::endl;

    std::cout << Var::covariance(f, f) << std::endl;
    std::cout << f.sigma() * f.sigma() << std::endl;

    return 0;
}
```
Compile and run it in the same way:

```
g++ example_covariances.cpp -o example_covariances -I/your/install/path/include -L/your/install/path/lib -lcascade_static
./example_covariances
```

Output:

```
Covariance involving a functional variable cannot be set
-0.723107
12.8668
-3.87443
28.4044
28.4044
```
Cascade uses reverse-mode automatic differentiation, most commonly known as backpropagation, to compute exact derivatives of arbitrary piecewise differentiable functions. The key concept behind this technique is compositionality. Each function defines an acyclic computational graph whose nodes store the intermediate values of the operations that make up the result, and whose edges store the derivative of a parent node with respect to a child node. One can then efficiently apply the chain rule to the function by sorting the graph nodes in a topological order and letting the derivatives flow backwards, from the output to the inputs. Each node is given a name and is indexed following a topological order; the partial derivative of the function with respect to an input is accumulated by visiting the nodes in reverse order and applying the chain rule across every edge.
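For example, a simple function such as $f(x, y) = (x + y)\,\sin(x)$ (an illustrative choice) decomposes into the nodes

$$v_1 = x, \qquad v_2 = y, \qquad v_3 = v_1 + v_2, \qquad v_4 = \sin(v_1), \qquad v_5 = v_3\, v_4 = f,$$

already indexed in a topological order. Flowing the derivatives backwards from $v_5$ gives, for instance,

$$\frac{\partial f}{\partial x} = \frac{\partial v_5}{\partial v_3}\frac{\partial v_3}{\partial v_1} + \frac{\partial v_5}{\partial v_4}\frac{\partial v_4}{\partial v_1} = v_4 \cdot 1 + v_3 \cos(v_1) = \sin(x) + (x + y)\cos(x).$$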
Formally, if the graph nodes are denoted $v_1, \dots, v_N$ in topological order with $v_N = f$, the adjoint $\bar{v}_i = \partial f / \partial v_i$ of each node satisfies

$$\bar{v}_N = 1, \qquad \bar{v}_i = \sum_{j \,\in\, \mathrm{parents}(i)} \bar{v}_j \, \frac{\partial v_j}{\partial v_i},$$

where the sum runs over the parent nodes that directly depend on $v_i$. Visiting the nodes in reverse topological order guarantees that every $\bar{v}_j$ is fully accumulated before it is propagated further.
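In code, the backward pass amounts to a single reverse sweep over the sorted nodes. Below is a minimal, self-contained sketch of the idea, not Cascade's internal implementation; the `Node` layout and names are hypothetical:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical node layout: each node keeps pointers to the child nodes it
// was computed from and the local derivatives d(node)/d(child), matching the
// edge convention described above.
struct Node
{
    std::vector<Node*> children;
    std::vector<double> localDerivatives; // one entry per child
    double adjoint = 0.0;                 // accumulates df/d(this node)
};

// One reverse sweep over the nodes, which must already be sorted in
// topological order (inputs first, output last).
void backprop(std::vector<Node*>& topoOrder)
{
    topoOrder.back()->adjoint = 1.0; // seed: df/df = 1

    for (auto it = topoOrder.rbegin(); it != topoOrder.rend(); ++it)
    {
        Node* node = *it;
        for (std::size_t k = 0; k < node->children.size(); ++k)
        {
            // Chain rule: df/d(child) += df/d(node) * d(node)/d(child)
            node->children[k]->adjoint += node->adjoint * node->localDerivatives[k];
        }
    }
}
```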
If you have a set of $n$ random variables $x = (x_1, \dots, x_n)$ with known values and covariance matrix $\Sigma$, whose entries are $\Sigma_{ij} = \mathrm{Cov}(x_i, x_j)$, then whenever you compute a value $f(x)$ its variance follows, to first order, from the gradient of $f$:

$$\sigma_f^2 = \nabla f^{\,T}\, \Sigma \, \nabla f = \sum_{i,j=1}^{n} \frac{\partial f}{\partial x_i} \frac{\partial f}{\partial x_j} \, \Sigma_{ij}.$$
For uncorrelated variables one recovers the well-known formula for error propagation:

$$\sigma_f^2 = \sum_{i=1}^{n} \left( \frac{\partial f}{\partial x_i} \right)^2 \sigma_i^2.$$
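For instance, using the same API as in the examples above, this formula can be checked on a product of two uncorrelated variables. This is a sketch with illustrative values, and it assumes, as `example_covariances.cpp` suggests, that `f.sigma()` triggers the propagation internally:

```cpp
#include "cascade.h"

#include <cmath>
#include <iostream>

using namespace cascade;

int main()
{
    // Two uncorrelated variables, given as {value, standard deviation}
    Var x = {3.0, 0.1};
    Var y = {4.0, 0.2};

    Var f = x * y;

    // First-order propagation: sigma_f^2 = (df/dx)^2 sigma_x^2 + (df/dy)^2 sigma_y^2
    //                                    = (y sigma_x)^2 + (x sigma_y)^2
    double expected = std::sqrt(4.0 * 4.0 * 0.1 * 0.1 + 3.0 * 3.0 * 0.2 * 0.2);

    std::cout << "Propagated sigma:    " << f.sigma() << std::endl;
    std::cout << "Hand-computed sigma: " << expected << std::endl;

    return 0;
}
```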
The same idea can also be used to compute the covariance between two variables that are both functions of the same set of inputs:

$$\mathrm{Cov}(f, g) = \nabla f^{\,T}\, \Sigma \, \nabla g = \sum_{i,j=1}^{n} \frac{\partial f}{\partial x_i} \frac{\partial g}{\partial x_j} \, \Sigma_{ij}.$$
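This is the quantity `Var::covariance` returns in `example_covariances.cpp` above. As a hand check (the gradient values here are computed for illustration), take $g = x$: with $f = (x + y)\cos(z)\,x$, the gradient at the given point is $\nabla f = \big((2x + y)\cos z,\ x \cos z,\ -(x + y)\,x \sin z\big) \approx (0.584,\ 1.753,\ -1.619)$, so

$$\mathrm{Cov}(f, x) \approx 0.584 \cdot 1.5^2 + 1.753 \cdot 0.5 - 1.619 \cdot 1.8 \approx -0.723,$$

which matches the first value printed by the example.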
You can see that computing covariances between arbitrary piecewise differentiable functions reduces to a differentiation problem.