MatX has many features directly inspired by MATLAB and Python (NumPy/SciPy), making it straightforward to translate code from those high-level languages into C++. The table below gives users of either tool a guide to writing efficient MatX code by showing the syntax mapping between them. Most of these conversions can also be found in the unit tests or in the source code.
Some general rules to keep in mind about these three tools:
- The Python column assumes both NumPy and SciPy are available for certain library calls.
- Both Python and MATLAB use the term "multi-dimensional array". MatX calls these tensors.
- The MatX syntax for most operations shows only what would appear inside the `set` call. For example, multiplying two tensors might look like `set(a, a*b);`. Since operations can be chained, the MatX column lists only what would be on the right-hand side of the `set` call (see the sketch after this list).
- MATLAB uses 1-based indexing, while Python and MatX use 0-based indexing.
- MATLAB ranges include the end index (inclusive), while Python and MatX ranges exclude it (exclusive).
- MATLAB and Python may or may not make a copy of an array behind the scenes to improve performance. MatX makes this explicit by never making a copy unless the function call states that it copies.
- MATLAB uses column-major (FORTRAN) memory order, while Python and MatX use row-major (C) order. When converting optimized MATLAB scripts, it may be beneficial to transpose the data so that the fastest-changing dimension is the innermost dimension.
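
As a concrete illustration of the `set` convention and the 0-based indexing described above, below is a minimal sketch of an element-wise multiply in MatX. The `make_tensor` construction and the `.run(stream)` launch call are assumptions that depend on the MatX version in use; the table's MatX column lists only the expression that would appear on the right-hand side of the `set` call.

```cpp
#include <matx.h>
#include <cstdio>

using namespace matx;

int main() {
  cudaStream_t stream = 0;

  // Hypothetical 8x8 tensors; make_tensor is assumed to be available
  // (older releases construct tensors directly, e.g. tensor_t<float, 2>).
  auto a = make_tensor<float>({8, 8});
  auto b = make_tensor<float>({8, 8});

  // Fill the inputs on the host (assumes managed-memory tensors).
  for (index_t i = 0; i < 8; i++) {
    for (index_t j = 0; j < 8; j++) {
      a(i, j) = static_cast<float>(i);
      b(i, j) = static_cast<float>(j);
    }
  }

  // Element-wise multiply: the executable form of the set(a, a*b)
  // expression described above. The table's MatX column shows only the
  // right-hand side (a * b).
  (a = a * b).run(stream);
  cudaStreamSynchronize(stream);

  // 0-based indexing: first row, fifth column (MATLAB's a(1,5)).
  printf("a(0,4) = %f\n", a(0, 4));
  return 0;
}
```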
Operation | MATLAB | Python | MatX | Notes | Examples |
---|---|---|---|---|---|
Basic indexing | A(1,5) | A[0,4] | A(0,4) | Retrieves the element in the first row and fifth column | |
Tensor addition | A + B | A + B | A + B | Adds two tensors element-wise | |
Tensor subtraction | A - B | A - B | A - B | Subtracts two tensors element-wise | |
Tensor multiplication | A .* B | A * B | A * B | Multiplies two tensors element-wise | |
Tensor division | A ./ B | A / B | A / B | Divides two tensors element-wise | |
Tensor slice (contiguous) | A(1:5,2:6) | A[0:5,1:6] | A.Slice({0,1}, {5,6}); | Slices 5 elements of the outer dimension starting at the first element, and 5 elements of the inner dimension starting at the second element | |
Tensor slice (w/stride) | A(1:2:end,2:3:8) | A[::2,1:9:3] | A.Slice({0,1}, {matxEnd,8}, {2,3}); | Takes every second element of the outer dimension, starting at the first element and continuing to the end. In the inner dimension, starts at the second element, takes every third element, and stops at the 8th element | |
Cloning a dimension | repmat(reshape(A, [1 4 4]), [5 1 1]) | np.repeat(np.expand_dims(A, axis=0), 5, 0) | A.Clone<3>({5, matxKeepDim, matxKeepDim}) | Takes a 4x4 2D tensor and makes it a 5x4x4 3D tensor in which each slice along the new outer dimension is a copy of the two inner dimensions | |
Slice off a row or column | A(5,:) | A[4,:] | A.Slice<1>({4, 0}, {matxDropDim, matxEnd}) | Selects the fifth row and all columns of a 2D tensor and returns a 1D tensor | |
Permute dimensions | permute(A, [3 2 1]) | np.einsum('ijk->kji', A) | A.Permute({2,1,0}) | Reverses the order of the three axes | |
Get real values | real(A) | np.real(A) | A.RealView() | Returns only the real part of a complex-valued tensor | |
Matrix multiply (GEMM) | A * B | np.matmul(A, B) | gemm(A, B) | Computes the matrix multiplication of A * B | |
Compute matrix inverse | inv(A) | np.linalg.inv(A) | inv(A) | Computes the inverse of matrix A using LU factorization | |
1D FFT | fft(A) | np.fft.fft(A) | fft(A) | Computes the 1D fast Fourier transform (FFT) of the rows of A | |
1D IFFT | ifft(A) | np.fft.ifft(A) | ifft(A) | Computes the 1D inverse fast Fourier transform (IFFT) of the rows of A | |
2D FFT | fft2(A) | np.fft.fft2(A) | fft2(A) | Computes the 2D fast Fourier transform (FFT) of the matrices in the outer two dimensions of A | |
2D IFFT | ifft2(A) | np.fft.ifft2(A) | ifft2(A) | Computes the 2D inverse fast Fourier transform (IFFT) of the matrices in the outer two dimensions of A | |
Covariance | cov(A) | np.cov(A) | cov(A) | Computes the covariance of the rows of matrix A | |
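
To show how the slicing, cloning, and permuting rows above fit together, here is a hedged sketch using the member-function spellings from the table (`Slice`, `Clone`, `Permute`). The `make_tensor` call, the shapes, and host-side element access are assumptions about the MatX version being used; newer releases also provide equivalent free-function operators such as `slice()`, `clone()`, and `permute()`.

```cpp
#include <matx.h>
#include <cstdio>

using namespace matx;

int main() {
  // 8x8 input tensor (assumes make_tensor and managed-memory host access).
  auto A = make_tensor<float>({8, 8});
  for (index_t i = 0; i < 8; i++) {
    for (index_t j = 0; j < 8; j++) {
      A(i, j) = static_cast<float>(i * 8 + j);
    }
  }

  // Contiguous slice: rows 0-4 and columns 1-5, as in the
  // A.Slice({0,1}, {5,6}) table entry (MATLAB A(1:5,2:6)).
  auto s = A.Slice({0, 1}, {5, 6});

  // Strided slice: every second row, and every third column starting at
  // column 1 and stopping before column 8.
  auto st = A.Slice({0, 1}, {matxEnd, 8}, {2, 3});

  // Drop a dimension: the fifth row as a 1D tensor.
  auto row = A.Slice<1>({4, 0}, {matxDropDim, matxEnd});

  // Clone the 2D tensor into a 5x8x8 3D tensor that replicates it along
  // a new outer dimension.
  auto c = A.Clone<3>({5, matxKeepDim, matxKeepDim});

  // Reverse the axis order of the cloned tensor (giving 8x8x5).
  auto p = c.Permute({2, 1, 0});

  // All of the above are views; no data is copied.
  printf("s(0,0)=%.1f st(1,2)=%.1f row(3)=%.1f c(4,2,3)=%.1f p(3,2,4)=%.1f\n",
         s(0, 0), st(1, 2), row(3), c(4, 2, 3), p(3, 2, 4));
  return 0;
}
```

Note that `c(4, 2, 3)` and `p(3, 2, 4)` refer to the same element of A, illustrating that permuting only changes how indices map to the underlying memory.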
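
The transform rows (FFT, IFFT, matrix multiply) follow the same execution pattern. The sketch below is an assumption-heavy example: `make_tensor`, `cuda::std::complex`, and `.run(stream)` reflect one particular MatX version, and the matrix multiply is written as `matmul`, which newer releases use where the table shows `gemm`.

```cpp
#include <matx.h>
#include <cstdio>

using namespace matx;

int main() {
  cudaStream_t stream = 0;

  // A batch of four 16-point complex signals and two 8x8 real matrices.
  auto x = make_tensor<cuda::std::complex<float>>({4, 16});
  auto X = make_tensor<cuda::std::complex<float>>({4, 16});
  auto A = make_tensor<float>({8, 8});
  auto B = make_tensor<float>({8, 8});
  auto C = make_tensor<float>({8, 8});

  // Host-side initialization (assumes managed-memory tensors).
  for (index_t i = 0; i < 4; i++) {
    for (index_t j = 0; j < 16; j++) {
      x(i, j) = cuda::std::complex<float>(static_cast<float>(j), 0.0f);
    }
  }
  for (index_t i = 0; i < 8; i++) {
    for (index_t j = 0; j < 8; j++) {
      A(i, j) = (i == j) ? 1.0f : 0.0f;  // identity matrix
      B(i, j) = static_cast<float>(i + j);
    }
  }

  // 1D FFT along each row of x (the table's fft(A) entry).
  (X = fft(x)).run(stream);

  // Inverse FFT along each row (the table's ifft(A) entry); normalization
  // conventions may differ from MATLAB/NumPy.
  (x = ifft(X)).run(stream);

  // Matrix multiply C = A * B; the table lists this as gemm(A, B).
  (C = matmul(A, B)).run(stream);

  cudaStreamSynchronize(stream);
  printf("X(0,0) = %f, C(0,0) = %f\n", X(0, 0).real(), C(0, 0));
  return 0;
}
```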