
I am interested in knowing whether anyone here knows of books or notes on statistical mechanics from the information-theoretic viewpoint.

Additional Request from user83014

"Jaynes wrote a paper called 'Information Theory and Statistical Mechanics' (1957). I tried to read it but found it somewhat hard to follow, and I suspect that is because it is one of the earlier works; a modern text might be more refined, since we've had time to work the ideas out. On the other hand, the physics textbooks I've read don't explicitly discuss information-theoretic concepts like Shannon entropy.

"Is there a good modern introduction to the information theoretic view of entropy?"

sammy gerbil

3 Answers


I would highly recommend the two seminal papers by E. T. Jaynes,

http://journals.aps.org/pr/abstract/10.1103/PhysRev.106.620

and,

http://journals.aps.org/pr/abstract/10.1103/PhysRev.108.171

Also check out the book by E. T. Jaynes, which has a focus on the foundations in probability but is rather light on applications in physics:

http://www.amazon.com/Probability-Theory-E-T-Jaynes/dp/0521592712/ref=sr_1_1?ie=UTF8&qid=1454400478&sr=8-1&keywords=e+t+jaynes

Or for an excellent book-length historical overview with a focus on both physics and information theory, see his long paper "Where do we stand on maximum entropy?"

http://bayes.wustl.edu/etj/articles/stand.on.entropy.pdf
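The central idea of these Jaynes papers is that the Boltzmann distribution is simply the distribution that maximizes Shannon entropy subject to a constraint on the mean energy. As a minimal numerical sketch of that principle (my own illustration with made-up energy levels, not taken from the papers), one can find the Lagrange multiplier β by bisection and check that the resulting distribution beats any other distribution with the same mean energy:

```python
import math

def boltzmann(energies, beta):
    """Maximum-entropy distribution p_i proportional to exp(-beta * E_i)."""
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)  # partition function
    return [x / z for x in w]

def mean_energy(energies, probs):
    return sum(e * p for e, p in zip(energies, probs))

def entropy(probs):
    """Shannon entropy in nats; terms with p = 0 contribute nothing."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def solve_beta(energies, target_mean, lo=-50.0, hi=50.0):
    """Bisect for the beta whose Boltzmann distribution has the target
    mean energy (mean energy decreases monotonically with beta)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_energy(energies, boltzmann(energies, mid)) > target_mean:
            lo = mid  # mean too high -> need larger beta
        else:
            hi = mid
    return 0.5 * (lo + hi)

E = [0.0, 1.0, 2.0]        # hypothetical energy levels
beta = solve_beta(E, 0.8)  # constrain the mean energy to 0.8
p = boltzmann(E, beta)

# Any other distribution with the same mean energy has lower entropy,
# e.g. q = (0.6, 0.0, 0.4) also has mean energy 0.8:
q = [0.6, 0.0, 0.4]
assert entropy(p) > entropy(q)
```

This is of course only a toy for three levels; Jaynes's point is that the same variational argument, applied to the full microstate space, recovers the canonical ensemble.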

N. Virgo

Information Theory: A Tutorial Introduction by James V Stone, Sebtel Press 2016

Richly illustrated with accessible examples such as everyday games like '20 questions.' Online MATLAB and Python programs provide hands-on experience. Written in an informal style. An ideal primer for novices who wish to learn the essential principles and applications of information theory.

Ergodic theory and information by Patrick Billingsley, J Wiley 1965

Review by Arshag Hajian

Far from modern but very reader-friendly. Loose, free, somewhat controversial style. Uses many simple familiar examples and interesting side discussions to motivate notions and techniques. Coverage of the usual topics in information theory is very limited. Useful for a beginner in the field.

Elements of Information Theory by Thomas M. Cover & Joy A. Thomas, J Wiley, 2nd ed. 2006, 748 pp

Clear, thought-provoking instruction. Engaging mix of mathematics, physics, statistics, and information theory.

Covers all the essential topics in information theory: entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. Problem sets, summary at the end of each chapter, historical notes.

The ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.

Information Theory, Part I: An Introduction to the Fundamental Concepts by Arieh Ben-Naim, World Scientific 2017, 350 pp

"...Written for those using the concepts of information theory to study problems considered outside of usual realm of information theory."

Unlike many books, which refer to Shannon's measure of information (SMI) as "entropy," this book makes a clear distinction between the SMI and entropy. In the last chapter, entropy is derived as a special case of the SMI.

Friendly, simple language, full of practical examples. Emphasis is on the concepts and their meaning rather than on the mathematical details.
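As a concrete illustration of the distinction the book draws: the SMI is a pure number computed directly from a probability distribution, with no thermodynamics involved. A quick sketch (my own illustration, not taken from the book):

```python
import math

def smi(probs, base=2):
    """Shannon's measure of information H = -sum p_i log(p_i),
    in bits by default; terms with p = 0 contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly 1 bit; a biased coin carries less,
# and a certain outcome carries none.
print(smi([0.5, 0.5]))  # -> 1.0
print(smi([0.9, 0.1]))  # -> ~0.469
print(smi([1.0]))       # -> 0.0
```

Thermodynamic entropy, in Ben-Naim's presentation, only emerges when this measure is applied to a specific distribution (over microstates of a physical system) and multiplied by the appropriate constants.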

Entropy and Information Theory by Robert M. Gray (1st ed. 1990, 2nd ed. 2011)

Not an introduction but a thorough, formal development of Shannon's mathematical theory of communication.

Its practical goals are the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. But it is mostly devoted to tools and methods in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. It is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.


Acknowledgement: Mathematics SE: A good textbook to learn about entropy and information theory

sammy gerbil

Regarding the request from user83014 for a modern source, I can recommend the fairly recent publication "Entropy? Honest!" by Tommaso Toffoli, a professor at Boston University who has conducted most of his research in the field of cellular automata.

Other than that, Arieh Ben-Naim has published several popular-science books on the topic in recent years and has introduced some disruptive ideas in his arXiv publications. I have not yet looked at the books themselves: the tables of contents look interesting and promising, but he has published so many seemingly very similar books within the last 15 years that I am unsure which one to start with. I personally think that, sadly, his talk at MaxEnt 2011 was not particularly good: before watching it I expected it to make me want to get my hands on one of his books, but it really didn't.

2b-t