time series #82

Open
atteson opened this issue Apr 20, 2018 · 3 comments
atteson commented Apr 20, 2018

I tried to extend this package for time series (in particular, Gaussian time series with long memory) by subtyping Stationary, but I couldn't work out the details. The comment at the top of Stationary indicated that 2 methods needed to be implemented, but it seems more were required, and I gave up after 4 or 5. Documentation on how to write a new kernel would be helpful here.

chris-nemeth (Member) commented

Thanks for raising this issue. We're currently working on a readthedocs page to provide more information on the package. There is a notebook which gives a time series example, but I think what you're looking for will require new kernels to be implemented. Could you give us some more information about what you're trying to do and perhaps we can help you create the kernel you need?

atteson (Author) commented Apr 23, 2018

Thanks for responding. What I have in mind is a stationary kernel where the covariance at lag n is given by something like:

gamma(n) = n == 0 ? sigma : lambda * n^-alpha

where sigma, lambda, and alpha are parameters which I'd ultimately like to optimize. I've written most of this as a stand-alone type for now (including calculation of derivatives for optimization, where ReverseDiff is used for the function above since I'd like to experiment with different functions), but I'd be happy to try to integrate it if you can help with how to develop a custom kernel.
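
To make this concrete, here is a minimal stand-alone sketch (not part of the original comment) of the autocovariance above, with the parameters packed into a vector so ReverseDiff can differentiate it; the parameter ordering and names are illustrative.

```julia
using ReverseDiff

# Autocovariance at integer lag n, with parameters p = [sigma, lambda, alpha]
gamma(n, p) = n == 0 ? p[1] : p[2] * float(n)^(-p[3])

# Gradient of the lag-n autocovariance with respect to the parameters
gamma_grad(n, p) = ReverseDiff.gradient(q -> gamma(n, q), p)

# Example: lag 3 with sigma = 1.0, lambda = 0.5, alpha = 0.3
p = [1.0, 0.5, 0.3]
gamma(3, p)       # 0.5 * 3^-0.3 ≈ 0.36
gamma_grad(3, p)  # [0.0, 3^-0.3, -0.5 * log(3) * 3^-0.3]
```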

chris-nemeth (Member) commented

We've developed the stationary kernels based on the weighted squared Euclidean distance, as this is common to most of the standard kernels. I think it would be difficult to fit your kernel into that framework, for example by following a similar style to the cov function we have for SE. The easiest way forward is probably to use the Const or Lin kernels as a template. Most of the functions here, such as get_params, will be straightforward, and the cov function will be the gamma function you've defined. For the gradients, you should only need the first function. A rough sketch of what that template might look like for your kernel is below.
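
The following is a rough, untested sketch (not from the package) of such a kernel type modelled on the Const/Lin template. The names cov and get_params come from the discussion above; set_params! and num_params, the exact signatures, and the use of raw (rather than log-transformed) hyperparameters are assumptions that should be checked against the Const and Lin source files.

```julia
using GaussianProcesses
import GaussianProcesses: cov, get_params, set_params!, num_params

# Hypothetical long-memory kernel following the gamma function above
mutable struct LongMemory <: GaussianProcesses.Kernel
    sigma::Float64   # variance at lag 0
    lambda::Float64  # scale of the power-law decay
    alpha::Float64   # decay exponent
end

# Covariance between two 1-D inputs, treating |x - y| as the lag
function cov(k::LongMemory, x::AbstractVector, y::AbstractVector)
    n = abs(x[1] - y[1])
    return n == 0 ? k.sigma : k.lambda * n^(-k.alpha)
end

get_params(k::LongMemory) = [k.sigma, k.lambda, k.alpha]
num_params(k::LongMemory) = 3

function set_params!(k::LongMemory, hyp::AbstractVector)
    k.sigma, k.lambda, k.alpha = hyp
    return k
end
```

Note that the package's own kernels typically work with log-transformed hyperparameters for optimization, which this sketch omits for clarity.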
