GSoC 2023
In mathematics and computer algebra, automatic differentiation (AD) is a set of techniques to numerically evaluate the derivative of a function specified by a computer program. Automatic differentiation is an alternative to symbolic differentiation and numerical differentiation (the method of finite differences). Clad is based on Clang, which provides the necessary facilities for code transformation. The AD library is able to differentiate non-trivial functions, find partial derivatives for trivial cases, and has good unit test coverage.
Vector mode support will facilitate the computation of gradients using forward mode AD in a single pass, without explicitly performing the differentiation n times for n function arguments. The major benefit of using vector mode is that computationally expensive operations do not need to be recomputed n times for n function arguments.
For example, if we want to compute df/dx and df/dy of a function f(x, y) using the forward mode AD in Clad, then currently we need to explicitly differentiate f two times. Vector mode will allow the generation of f_d(x, y) such that we will be able to get partial derivatives with respect to all the function arguments (the gradient) in a single call.
After successful completion of the project, the following code snippet should work as expected:
#include <clad/Differentiator/Differentiator.h>
#include <iostream>

double someComputationalIntensiveFn();

double fn(double x, double y) {
  double t = someComputationalIntensiveFn(); // should be computed only once
                                             // in the derived function.
  double res = 2 * t * x + 3 * t * x * y;
  return res;
}

int main() {
  // Differentiate fn with respect to both of its parameters in a single call.
  auto d_fn = clad::differentiate(fn, "x, y");
  double d_x = 0, d_y = 0;
  d_fn.execute(3, 5, &d_x, &d_y);
  std::cout << "Derivative of fn wrt x: " << d_x << "\n";
  std::cout << "Derivative of fn wrt y: " << d_y << "\n";
}
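For illustration, a hand-written equivalent of the single-pass derivative that vector mode is expected to produce could look like the sketch below. The function name fn_dvec and its signature are hypothetical; the actual generated interface will be defined as part of the project.

// Hypothetical hand-written equivalent of the vector-mode derivative of fn.
// Both partial derivatives are produced in one pass, and the expensive call
// is evaluated only a single time.
void fn_dvec(double x, double y, double *_d_x, double *_d_y) {
  double t = someComputationalIntensiveFn(); // evaluated once
  // res = 2 * t * x + 3 * t * x * y
  *_d_x = 2 * t + 3 * t * y; // d(res)/dx
  *_d_y = 3 * t * x;         // d(res)/dy
}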
Task ideas and expected results:
- Extend and generalize our ForwardModeVisitor to produce a single function with the directional derivatives;
- Add a new mode to the top-level Clad interface clad::differentiate for vector mode;
- Extend the unit test coverage;
- Develop tutorials and documentation;
- Present the work at the relevant meetings and conferences.
Necessary skills: intermediate C++; understanding of basic differential calculus; intermediate knowledge of Clang and LLVM.
Mentors: Vassil Vassilev (vgvassilev); Parth Arora (parth-07)
In mathematics and computer algebra, automatic differentiation (AD) is a set of techniques to numerically evaluate the derivative of a function specified by a computer program. Automatic differentiation is an alternative to symbolic differentiation and numerical differentiation (the method of finite differences). Clad is based on Clang, which provides the necessary facilities for code transformation. The AD library is able to differentiate non-trivial functions, find partial derivatives for trivial cases, and has good unit test coverage.
Clad currently only supports differentiation with respect to single-dimensional arrays. Support for differentiation with respect to pointers is limited as well. This project aims to add support for multi-dimensional arrays (and pointers) in Clad.
After successful completion of the project, the following code snippet should work as expected:
#include <iostream>
#include "clad/Differentiator/Differentiator.h"

double fn(double arr[5][5]) {
  double res = 1 * arr[0][0] + 2 * arr[1][1] + 4 * arr[2][2];
  return res * 2;
}

int main() {
  auto d_fn = clad::gradient(fn);
  double arr[5][5] = {{1, 2, 3, 4, 5},
                      {6, 7, 8, 9, 10},
                      {11, 12, 13, 14, 15},
                      {16, 17, 18, 19, 20},
                      {21, 22, 23, 24, 25}};
  double d_arr[5][5] = {};
  d_fn.execute(arr, d_arr);
  std::cout << "Derivative of fn wrt arr[0][0]: " << d_arr[0][0] << "\n"; // 2
  std::cout << "Derivative of fn wrt arr[1][1]: " << d_arr[1][1] << "\n"; // 4
  return 0;
}
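For reference, a hand-written function computing the same gradient is sketched below. The name fn_grad_manual and its signature are hypothetical; the exact shape of the code Clad should generate for multi-dimensional array parameters is part of the project's design work.

// Hypothetical hand-written gradient of fn: accumulates d(fn)/d(arr[i][j])
// into d_arr. Only three entries of arr affect the result, and because fn is
// linear in its inputs, the values of arr are not needed here.
void fn_grad_manual(const double arr[5][5], double d_arr[5][5]) {
  // fn(arr) = 2 * (1 * arr[0][0] + 2 * arr[1][1] + 4 * arr[2][2])
  d_arr[0][0] += 2 * 1; // 2
  d_arr[1][1] += 2 * 2; // 4
  d_arr[2][2] += 2 * 4; // 8
}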
Necessary skills: intermediate C++; understanding of basic differential calculus; intermediate knowledge of Clang and LLVM.
Mentors: Vassil Vassilev (vgvassilev); Parth Arora (parth-07)
Clad is an automatic differentiation (AD) clang plugin for C++. Given a C++ source code of a mathematical function, it can automatically generate C++ code for computing derivatives of the function. Clad has found uses in statistical analysis and uncertainty assessment applications.
Object-oriented paradigms (OOP) provide a structured approach for complex use cases, allowing for modular components that can be reused and extended. OOP also allows for abstraction, which makes code easier to reason about and maintain. Gaining full OOP support is an open research area for automatic differentiation codes.
This project focuses on improving support for differentiating object-oriented constructs in Clad. This will allow users to seamlessly compute derivatives of the algorithms in their projects that use an object-oriented model. C++ object-oriented constructs include, but are not limited to: classes, inheritance, polymorphism, and related features such as operator overloading.
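As a concrete illustration, the kind of user code this project targets might look like the sketch below. The class and functions are made up for illustration; which of these constructs Clad already handles, and how well, is exactly what the first task should establish.

// An illustrative class using a constructor, data members, and an overloaded
// operator -- all constructs that an AD tool must be able to differentiate.
class Complex {
public:
  double re, im;
  Complex(double r, double i) : re(r), im(i) {}
  Complex operator*(const Complex &other) const {
    return Complex(re * other.re - im * other.im,
                   re * other.im + im * other.re);
  }
};

// A function a user may want to differentiate with respect to x.
double squaredModulus(double x) {
  Complex c(x, 2 * x); // requires differentiating a constructor
  Complex sq = c * c;  // requires differentiating operator*
  return sq.re * sq.re + sq.im * sq.im;
}

// Intended usage: auto d = clad::differentiate(squaredModulus, "x");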
- Study the current object-oriented differentiable programming support in Clad. Prepare a report of missing constructs that should be added to support the automatic differentiation of object-oriented paradigms in both the forward mode AD and the reverse mode AD.
- Some of the known gaps are: differentiation of constructors, limited support for differentiating operator overloads, support for reference class members, and the lack of a way to specify custom derivatives for constructors.
- Add support for the missing constructs.
- Add proper tests and documentation.
Necessary skills: C++, Automatic Differentiation, Clang frontend
Mentors: Vassil Vassilev (vgvassilev); Parth Arora (parth-07)
If you have used Clad and you have a particular project proposal, please contact vgvassilev.
Clad is an automatic differentiation (AD) clang plugin for C++. Given a C++ source code of a mathematical function, it can automatically generate C++ code for computing derivatives of the function. Clad has found uses in statistical analysis and uncertainty assessment applications.
In scientific computing and machine learning, GPU multiprocessing can provide a significant boost in performance and scalability. This project focuses on enabling the automatic differentiation of CUDA GPU kernels using Clad. This will allow users to take advantage of the power of GPUs while benefiting from the accuracy and speed of automatic differentiation.
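To make the target concrete, a minimal example of the kind of code involved is sketched below. The kernel is illustrative, and the differentiation call shown in the comments is only a possible shape; how differentiation of kernels is exposed to users is part of what the project should define.

// An illustrative CUDA kernel computing an element-wise contribution.
__global__ void squareLoss(const double *x, double *out, int n) {
  int i = blockIdx.x * blockDim.x + threadIdx.x;
  if (i < n)
    out[i] = x[i] * x[i];
}

// Desired (illustrative) usage: differentiate the kernel and launch the
// generated derivative kernel from host code, e.g. something along the
// lines of
//   auto grad = clad::gradient(squareLoss);
//   grad.execute(/* launch configuration and kernel arguments */);
// The exact interface is to be designed as part of this project.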
- Research automatic differentiation of code involving CUDA GPU kernels. Prepare a report and an initial strategy to follow. This may involve brainstorming and the need for innovative solutions.
- Enable reverse-mode automatic differentiation of CUDA GPU kernels and calls to CUDA GPU kernels from the host code.
- Add proper tests and documentation.
Necessary skills: C++, Automatic Differentiation, CUDA, Clang frontend
Mentors: Vassil Vassilev (vgvassilev); Parth Arora (parth-07)
Clad is an automatic differentiation (AD) clang plugin for C++. Given a C++ source code of a mathematical function, it can automatically generate C++ code for computing derivatives of the function. Clad has found uses in statistical analysis and uncertainty assessment applications.
Automatic differentiation techniques involve computing partial derivatives of each intermediate variable encountered while automatically differentiating a function. Users are generally interested in the derivatives of a subset of the output variables with respect to a subset of the input variables. In this case, partial derivatives of many intermediate variables may not contribute to final derivatives and therefore can be ignored and not computed. Activity analysis finds intermediate variables whose partial derivatives contribute to the final required derivatives. It allows the AD tool to only compute the set of partial derivatives that are required. By not computing partial derivatives for such intermediate variables, both the memory requirement and the run time of the generated program can be reduced.
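As a small illustration of the idea (the function below is made up; it is not taken from Clad's test suite), consider:

#include <cmath>

double f(double x, double y) {
  double u = std::exp(y) * std::sin(y); // depends only on y
  double v = x * x;                     // depends only on x
  return v + u;
}

// If the user only requests df/dx, the intermediate variable u is inactive
// for that derivative: du/dx is identically zero, so activity analysis can
// skip generating and evaluating the corresponding derivative statements.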
- Research automatic differentiation activity analysis techniques. Prepare an activity analysis model report with an initial strategy to follow. This may involve brainstorming and the need for innovative solutions.
- Implement the proposed activity analysis mode.
- Add tests and documentation.
Necessary skills: C++, Automatic Differentiation, Program analysis techniques, Clang frontend
Mentors: Vassil Vassilev (vgvassilev); Parth Arora (parth-07)
If you are interested in working on a project, here is a list of things to do in order to maximize your chances of getting selected:
- Contact the mentors and express interest in the project. Make sure you attach your CV;
- Download the source code of the project, build it and run the demos;
- Start familiarizing yourself with the codebase;
- If you have questions you can always contact a mentor.
The mentors are interested in working with all candidates, but unfortunately the rules allow only one to be selected. There are a few tasks which give bonus points to a candidate's application:
- Submit a valid bug report -- this demonstrates that the candidate has completed steps 2 and 3 from the previous section.
- Fix a bug -- this demonstrates the technical skills of the candidate and shows they can work independently on the project. The mentors can suggest looking into these good first issues. Fixing one issue may be enough to become a successful candidate.
Good luck!