Numerical computation of Kolmogorov-Sinai entropy #6
Are you talking about Pesin's identity/inequality? I think in practice people just assume hyperbolicity and calculate the sum of the positive Lyapunov exponents.
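For reference, under Pesin's identity this is essentially a one-liner. A minimal sketch, assuming you already have a vector `λs` of Lyapunov exponents from somewhere; the name `pesin_entropy` is just illustrative, not an existing function in any package:

```julia
# Minimal sketch: KS entropy via Pesin's identity, i.e. the sum of the
# positive Lyapunov exponents. `λs` is assumed to be a precomputed vector
# of Lyapunov exponents; `pesin_entropy` is a hypothetical name.
pesin_entropy(λs) = sum(λ -> max(λ, zero(λ)), λs)
```

For example, for a spectrum like `[0.9, 0.0, -14.7]` this returns `0.9`.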
I am talking about the Kolmogorov-Sinai entropy: http://www.scholarpedia.org/article/Kolmogorov-Sinai_entropy. From what I recall from the lectures I have attended: you define a partition of size ε and compute the Shannon entropy H_n of orbit segments of length n. Then you compute the Shannon entropy H_{n+1} for length n+1 and define the KS entropy as KS = lim_{ε → 0} lim_{N → ∞} (1/N) Σ_{n=0}^{N-1} (H_{n+1} - H_n). I wish GitHub would allow LaTeX, god damn. This is from the book of Schuster, "Deterministic Chaos". I looked at the page you cited, but the word "Kolmogorov" is not even mentioned on the entire page, so I am sceptical as to whether there is some straightforward connection.
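As an illustration of that definition (not anything from the package), a naive partition-based estimate could look roughly like the sketch below. All names are hypothetical, and the approach blows up combinatorially for large block lengths, as pointed out further down in the thread:

```julia
# Naive sketch of the partition-based KS entropy estimate described above.
# `orbit` is a vector of state vectors (or numbers), `ε` the partition size,
# `N` the maximum block length. All names are illustrative.

# Coarse-grain each state into the index tuple of the ε-box containing it.
symbolize(orbit, ε) = [Tuple(floor.(Int, x ./ ε)) for x in orbit]

# Shannon entropy H_n of the distribution of length-n symbol blocks.
function block_entropy(symbols, n)
    counts = Dict{Any,Int}()
    for i in 1:(length(symbols) - n + 1)
        block = Tuple(symbols[i:(i + n - 1)])
        counts[block] = get(counts, block, 0) + 1
    end
    total = sum(values(counts))
    return -sum((c / total) * log(c / total) for c in values(counts))
end

# The sum over n of (H_{n+1} - H_n) telescopes to H_N (with H_0 = 0),
# so the estimate is simply H_N / N, for small ε and as large an N as the data allow.
ks_entropy_naive(orbit, ε, N) = block_entropy(symbolize(orbit, ε), N) / N
```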
Alright, after some searching, apparently they are connected, but I still don't see a straightforward way of computing it. Using just the sum of the positive Lyapunov exponents is not enough of course :P people can already do it with the
Oh, I didn't realize that the Scholarpedia entry on Pesin's formula didn't mention the KS entropy by name. Not a good reference indeed.
Why do you think so? I don't think you want to directly estimate the KS entropy, because of the combinatorial explosion of the pre-images of the partitions you have to track. Check out, e.g., Eq. (4.4) and the equation next to it from Eckmann & Ruelle (1985), and (A3) from Hunt & Ott (2015). I think this is also how people compute it in practice: e.g., Monteforte & Wolf (2010).
PS: Yeah, GitHub should support equations...
Hmm... But not impossible even without the Lyapunov exponents?
Just as an addition: a dynamical system does not need to be hyperbolic for Pesin's identity to hold, but hyperbolicity is indeed a sufficient condition.
@RainerEngelken Cool! Thanks for filling in that important detail. (BTW, I like your JuliaCon talk!)
Hi people! First of all, let me just say that I am very glad that anybody else besides myself wants to improve these libraries! A bit off-topic, but let me also say that unfortunately in the coming months I won't have any time to make improvements, because I need to focus on my PhD... :( However, I promise that if anybody wants to make a PR I will definitely review it and merge it asap; I can also make you collaborators.

Alrighty, now on-topic. I am not familiar with SRB and the papers Rainer cited, but they are much appreciated! If somebody wants to contribute something for this issue, they can take advantage of that advice. I think we can then say that this method is "low-priority", since Takafumi pointed out that in research people often use the sum of the positive λs?

@RainerEngelken how did you do it for the neural system you had? Did you compute the sum as well, or did you use a method that computes the entropy directly?
In addition, a warning/heads-up: if you want to do a PR, please consider doing it after this issue is closed: JuliaDynamics/DynamicalSystemsBase.jl#18, so that you don't have to rewrite anything again and again. It will be a massive change of the internals, so pretty much all code that doesn't use
Yeah, I think this issue is low-priority, too. The breaking changes propagated from DifferentialEquations.jl sound much more important. PS: Good luck on your PhD!
This paper has a straightforward numerical algorithm for calculating a quantity approximately equal to the KS entropy (which they call K_2).
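For what it's worth, here is a rough sketch of how such a correlation-sum based K_2 estimate typically looks. The function names, the fixed delay of one sample, and the max-norm distance are assumptions on my part, not the paper's notation or any package API:

```julia
# Rough sketch of a Grassberger-Procaccia style K₂ estimate from a scalar
# time series, using correlation sums of delay-embedded vectors
# (delay fixed to 1 sample here for simplicity). Hypothetical names throughout.

# Correlation sum C_m(ε): fraction of pairs of m-dimensional delay vectors
# whose Chebyshev (max-norm) distance is below ε.
function correlation_sum(x::AbstractVector, m::Int, ε::Real)
    n = length(x) - m + 1
    close_pairs = 0
    for i in 1:(n - 1), j in (i + 1):n
        d = maximum(abs(x[i + k] - x[j + k]) for k in 0:(m - 1))
        d < ε && (close_pairs += 1)
    end
    return close_pairs / (n * (n - 1) / 2)
end

# K₂ ≈ (1/Δt) * log(C_m(ε) / C_{m+1}(ε)), for small ε and large enough m.
k2_entropy(x, m, ε, Δt = 1.0) =
    log(correlation_sum(x, m, ε) / correlation_sum(x, m + 1, ε)) / Δt
```

In practice the double limit in ε and m is never reached; one looks for a plateau of the log-ratio over a range of ε and embedding dimensions.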
A solution for this issue exists in the book by Schuster and Just, equation 6.107.
EDIT: a solution also exists in the book by Kantz and Schreiber, section 11.4.5.
EDIT2: a solution also exists in P. Grassberger and I. Procaccia, "Estimation of the Kolmogorov entropy from a chaotic signal," Phys. Rev. A, vol. 28, no. 4, pp. 2591–2593, 1983. They show how to compute a quantity approximately equal to the KS entropy.
From @Datseris on October 2, 2017 16:36
Another commonly used entropy is the "KS-entropy".
I would be reluctant to point people to the original papers (Ya. G. Sinai, "On the Notion of Entropy of a Dynamical System," Doklady of Russian Academy of Sciences, (1959), 124, 768–771, and A. N. Kolmogorov, "Entropy per unit time as a metric invariant of automorphisms," Doklady of Russian Academy of Sciences, (1959), 124, 754–755), as they are veeery mathematical and were written before computers became a thing. I also do not know of any paper that treats the subject (I haven't searched for one!).
I think @RainerEngelken has done this, maybe he can help us a bit here with some comments.
Copied from original issue: JuliaDynamics/DynamicalSystems.jl#28