02/02/2018, 14:00 - 15:00 in MAP/0G/018
Boreland, Gareth (Queen’s University Belfast)
The concept of entropy is fundamental in Information Theory, and Shannon's source coding theorem for the discrete, memoryless source gives a natural interpretation of the entropy of a probability distribution on a finite set. We will summarise how we can define the Kolmogorov-Sinai entropy in the case of a source with memory, that is, where the letters sent in successive positions are not independently distributed. We recall how the Kolmogorov-Sinai Theorem and the Shannon-McMillan-Breiman Theorem allow the source coding problem to be solved for ergodic sources. Kolmogorov-Sinai entropy can be defined on any discrete dynamical system, and it is invariant under isomorphism.
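To make the starting point concrete, the Shannon entropy of a distribution P = (p_1, ..., p_n) on a finite set is H(P) = -Σ p_i log₂ p_i, the optimal average number of bits per symbol for a memoryless source. A minimal illustrative sketch (the function name and example values are ours, not from the talk):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(P) = -sum_i p_i * log2(p_i) of a finite distribution,
    with the convention 0 * log(0) = 0."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A uniform distribution on 4 symbols: optimal coding needs 2 bits per symbol.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A degenerate distribution carries no information.
print(shannon_entropy([1.0]))  # 0.0
```

By Shannon's source coding theorem, no lossless code for a memoryless source can beat this rate on average, which is the interpretation the Kolmogorov-Sinai entropy generalises to sources with memory.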
The graph entropy H(G,P) of a graph G with a probability distribution P on its vertices was introduced by János Körner in 1973 to solve the problem of source coding for a memoryless source over a finite alphabet whose symbols are not all mutually distinguishable. We will outline the problem that Körner solved, and then summarise some early results which attempt to generalise graph entropy to the setting of a source with memory, including a definition of graph entropy which is invariant under isomorphism and a version of the Kolmogorov-Sinai Theorem.
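For orientation, one standard characterisation of Körner's graph entropy (due to later work of Csiszár, Körner, Lovász, Marton and Simonyi; the talk may use a different but equivalent definition) is the convex programme

\[
H(G,P) \;=\; \min_{a \in \mathrm{VP}(G)} \; \sum_{i \in V(G)} p_i \log \frac{1}{a_i},
\]

where \(\mathrm{VP}(G)\) is the vertex packing polytope of \(G\), the convex hull of the characteristic vectors of its independent sets. When \(G\) is complete, every symbol is distinguishable from every other, the minimum is attained at \(a = P\), and \(H(G,P)\) reduces to the Shannon entropy \(H(P)\).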