In this paper, we investigate nonlinear, finite-dimensional, data-independent random Fourier feature expansions that approximate the popular Gaussian kernel. Combining these features with the recursive least squares algorithm, we develop the Random Fourier Feature Recursive Least Squares (RFF-RLS) algorithm, which shows significant performance improvements in simulations compared with several other online kernel learning algorithms, such as Kernel Least Mean Square (KLMS) and Kernel Recursive Least Squares (KRLS). Our results confirm that RFF-RLS achieves desirable performance at low computational cost. Because the randomization in random Fourier features generally introduces redundancy, we use Vector Quantization with Information Theoretic Learning (VQIT) to reduce the dictionary size. The resulting sparse dictionary matches the original data distribution well, and RFF-RLS with VQIT outperforms RFF-RLS without it.
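The core idea behind the feature map can be sketched briefly. The repository code is MATLAB, but for illustration here is a minimal, hypothetical Python version of the random Fourier feature construction for the Gaussian kernel (function and parameter names are my own, not from the repo): frequencies are drawn from the kernel's Fourier transform, and inner products of the resulting features approximate the kernel.

```python
import numpy as np

def rff_map(X, n_features, gamma, rng):
    """Map rows of X to random Fourier features whose inner products
    approximate the Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    # Frequencies sampled from the Fourier transform of the Gaussian kernel
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, n_features))
    # Random phases make a single cosine per frequency sufficient
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    Z = np.sqrt(2.0 / n_features) * np.cos(X @ W + b)
    return Z

# Compare the approximate and exact kernel matrices on a few points
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
Z = rff_map(X, 2000, 0.5, rng)
K_approx = Z @ Z.T
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
K_true = np.exp(-0.5 * sq_dists)
```

With a few thousand features the entrywise error of `K_approx` is typically on the order of a few percent, which is what makes the subsequent linear RLS in feature space a cheap stand-in for full kernel recursion.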
One of the iterative formulas of the RLS algorithm, given in the description of Algorithm 1 on page 3, was written incorrectly in the article. This document corrects the erroneous formula and publishes the code of the experiment.
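For reference, the standard RLS recursion that Algorithm 1 builds on (applied to the random Fourier feature vector in RFF-RLS) updates a gain vector, the weights, and the inverse correlation matrix at each sample. This is a generic sketch of textbook RLS in Python, not the repository's MATLAB code; names and the forgetting-factor default are illustrative.

```python
import numpy as np

def rls_step(w, P, z, d, lam=0.99):
    """One recursive least squares update with forgetting factor lam.
    z: feature vector, d: desired output, P: inverse correlation matrix."""
    Pz = P @ z
    k = Pz / (lam + z @ Pz)          # gain vector
    e = d - w @ z                    # a-priori error
    w = w + k * e                    # weight update
    P = (P - np.outer(k, Pz)) / lam  # rank-one inverse-correlation update
    return w, P
```

As a sanity check, on noiseless linear data with `lam=1.0` the recursion converges to the generating weights.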
The code is written in MATLAB.
If you use this code, please cite the following paper:
[1] Z. Qin, B. Chen, and N. Zheng, "Random Fourier feature kernel recursive least squares," in 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017, pp. 2881–2886.