
I am having a tough time understanding what the difference between Shannon's source entropy, Kolmogorov–Sinai (KS) entropy, and topological entropy is. The document that I am following is:

http://www.maths.manchester.ac.uk/~cwalkden/magic/lecture08.pdf

Definitions say that Kolmogorov–Sinai entropy, also known as metric entropy, is the supremum of Shannon's entropy:

  • $h_{KS} = \sup \frac{1}{M}H$, where $M$ denotes the number of unique symbols and $H$ is the Shannon entropy.

Shannon's entropy is given by:

\begin{align} H = -\sum_{k=1}^M \mathsf{p}_k \ln(\mathsf{p}_k) \end{align}

is the entropy of a source which emits $M=2$ symbols, where $\mathsf{p}_k$ is the probability of finding symbol $k$. The source entropy of a dynamical system is given as:

\begin{align} H_\text{source} = \lim_{M \rightarrow \infty} \frac{1}{M} H \end{align}
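To check my reading of the limit: for a memoryless fair binary source, if I interpret the $H$ inside the limit as the entropy $H_M$ of blocks (words) of length $M$, then all $2^M$ words are equally likely and

\begin{align} H_M = -\sum_{w \in \{0,1\}^M} 2^{-M} \ln\!\left(2^{-M}\right) = M \ln 2, \qquad H_\text{source} = \lim_{M \rightarrow \infty} \frac{1}{M} H_M = \ln 2. \end{align}

Is this block-entropy reading of the formula the intended one?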

  • What is the meaning of the limit term? Are source entropy and Shannon's entropy the same thing?

  • If the sender sends 10 compound symbols at a time and each symbol can either be 0 or 1, then what is the entropy? As an example of how to calculate Shannon's entropy, consider an array A representing a message:

    A = [1 0 0 1 0 0 0 1 0 1 1 0 0 1 0 0 0 1 0 1];  % binary message
    
    N = length(A);                      % message length
    p_1 = sum(A==1)/N;                  % empirical probability of a 1
    p_0 = sum(A==0)/N;                  % empirical probability of a 0
    ShannonH = -p_1*log2(p_1) - p_0*log2(p_0)  % Shannon entropy in bits/symbol
    

In this example $M=2$, and the code outputs ShannonH = 0.9710. So, based on the formula above, the KS entropy would be the supremum of ShannonH/M = 0.9710/2 = 0.4855.
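Coming back to the compound-symbols bullet above, here is my attempt at a sketch that estimates the per-symbol block entropy $H_m/m$ from the empirical distribution of length-$m$ words (variable names and the integer encoding of words are my own; histcounts needs MATLAB R2014b or newer). With only 20 symbols, $m = 10$ would need far more data than this toy message provides, so I use $m = 2$:

    A = [1 0 0 1 0 0 0 1 0 1 1 0 0 1 0 0 0 1 0 1];
    m = 2;                                  % block length
    nBlocks = floor(length(A)/m);           % number of non-overlapping blocks
    blocks = reshape(A(1:nBlocks*m), m, nBlocks)';   % one length-m word per row
    codes = blocks * (2.^(m-1:-1:0))';      % encode each word as an integer
    p = histcounts(codes, -0.5:1:2^m-0.5) / nBlocks; % empirical word probabilities
    p = p(p > 0);                           % drop words that never occur
    H_m = -sum(p .* log2(p));               % block (word) entropy in bits
    ratePerSymbol = H_m / m                 % entropy per symbol

Is this the right way to treat compound symbols?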

Is this how KS entropy is calculated? I cannot understand what the supremum of the Shannon entropy is and how to calculate the KS entropy.

  • What is Topological entropy?

An example would be really helpful. Please correct me if anything is wrong or misleading. Thank you.

Srishti M
  • I think the supremum is taken over all finite messages, based on what I saw in the article. Edit: it said partitions. I guess the partitions are the different ways you can put the 0s and 1s for a certain length of a message? – Emil Oct 15 '17 at 20:22
  • @Emil: Yes, partitions means dividing the state space of the dynamical system; or, if we have a time series, then based on the mean of the time series we can assign symbols. But I cannot understand the meaning of infimum and supremum: are they the same as the minimum and maximum of a set? – Srishti M Oct 15 '17 at 20:56
  • @Srishti M: it is almost like the maximum but more general; you can find it on Wikipedia. It is the smallest upper bound, or something like that. – Emil Oct 16 '17 at 06:04

1 Answer


The clearest paper on the subject I've seen is Young's (e-print). Here's her summary ($h_\mu\equiv h_\text{KS}$):

Both $h_\mu$ and $h_\text{top}$ measure the exponential rates of growth of $n$-orbits:
- $h_\mu$ counts the number of typical $n$-orbits, while
- $h_\text{top}$ counts all distinguishable $n$-orbits.
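In symbols, the growth rates in that quote read (my paraphrase of the standard definitions, not a quote from Young):

\begin{align} h_\text{top} = \lim_{\epsilon \rightarrow 0} \limsup_{n \rightarrow \infty} \frac{1}{n} \ln N(n, \epsilon), \end{align}

where $N(n,\epsilon)$ is the maximal number of length-$n$ orbit segments that are pairwise distinguishable at resolution $\epsilon$. For $h_\mu$ one counts instead only the minimal number of such segments needed to cover a set of $\mu$-measure at least $1-\delta$ (Katok's entropy formula, valid for ergodic $\mu$ and any fixed $\delta \in (0,1)$).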

The Kolmogorov–Sinai entropy is a metric, measure-theoretic entropy (rate) based on the Shannon entropy, while the topological entropy is a combinatorial entropy that gives the growth rate of the number of allowed sequences. Also:

  • KS (metric) entropy: supremum over partitions for a given measure;
  • topological entropy: supremum over measures (see the example below).
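A standard illustration of both suprema (the usual textbook case, not an example from Young's paper) is the full shift on two symbols equipped with a Bernoulli($p$) measure:

\begin{align} h_\mu(p) = -p \ln p - (1-p) \ln(1-p) \le \ln 2 = h_\text{top}, \end{align}

with equality exactly at $p = 1/2$. Each invariant measure yields its own KS entropy, and the topological entropy is recovered as the supremum of these values over all invariant measures (the variational principle).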

Notice that we might need to infer from the adopted definitions or context whether one is referring to entropy proper or entropy rate.

With respect to explicit calculations, for a particular case you can check the post Topological entropy in Markov chains.
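For subshifts of finite type (the symbolic-dynamics counterpart of Markov chains), the topological entropy reduces to the logarithm of the spectral radius of the transition matrix. A minimal sketch of that computation (my own example, not code from the linked post):

    % Golden-mean shift: binary sequences in which the word "11" is forbidden.
    T = [1 1;      % from symbol 0 we may go to symbol 0 or 1
         1 0];     % from symbol 1 we may only go to symbol 0 (no "11")
    lambda = max(abs(eig(T)));   % spectral radius = (1+sqrt(5))/2
    h_top = log(lambda)          % = log of the golden ratio, about 0.4812 nats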

About the math terms: the supremum is a generalization of the maximum; and the limit is intended to give you the asymptotic entropy per symbol.
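A one-line example of the difference between supremum and maximum:

\begin{align} \sup \left\{ 1 - \tfrac{1}{n} \;:\; n = 1, 2, 3, \ldots \right\} = 1, \end{align}

even though no element of the set equals $1$: the set has no maximum, but it does have a least upper bound.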

stafusa