
This question is inspired by the following example given in Barnett's book on quantum information:

[Three screenshots of the example from Barnett's book]

(PDF here, starting at the bottom of page 5)

I am not a physicist, but I'll accept that the calculation of the increase in physical entropy associated with this procedure is correct. My confusion is about how the second law of thermodynamics can be saved by claiming that the process of measuring the position of the molecule produces an entropy increase of at least $k_B \ln 2$.
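For concreteness, here is my rough paraphrase of the bookkeeping in the single-molecule engine (these expressions are my own summary of how I read the example, not a quotation from the book): measuring which half of the box the molecule is in and then letting it expand isothermally against a piston extracts work
$$
W = Q = k_B T \ln 2
$$
from the heat bath, so the bath's entropy changes by
$$
\Delta S_\text{bath} = -\frac{Q}{T} = -k_B \ln 2 ,
$$
while the gas itself ends the cycle back in its original state. The total thermodynamic entropy therefore appears to drop by $k_B \ln 2$ per cycle, and the claim is that the measurement (or the erasure of its record) must generate at least that much entropy to compensate.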

My question is: Why can we just assume that "entropy" of information is comparable with physical entropy? The best explanation that I've been able to find is that they're defined in similar ways and satisfy the same basic properties. This seems no different from saying "decibels and earthquake intensity are both measured logarithmically and are therefore comparable", which of course makes no sense.
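To make the "defined in similar ways" point explicit, here is the comparison as I understand it (my own notation, not taken from the book):
$$
H = -\sum_i p_i \log_2 p_i \quad \text{(Shannon)},
\qquad
S = -k_B \sum_i p_i \ln p_i \quad \text{(Gibbs)},
$$
so formally $S = (k_B \ln 2)\, H$ whenever both are computed from the same probability distribution. What I don't see is why this formal identity justifies adding the information-theoretic quantity into a thermodynamic entropy balance, rather than treating the two as merely analogous.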

As I said above, my only physics background is a first-year course, so I'd appreciate it if answers could try to be accessible to someone who hasn't done thermodynamics...

  • (1/2) Also, there is a famous quote on this subject by Claude Shannon, who discovered (and named) entropy of information: – Rococo Jan 20 '19 at 23:34
  • (2/2) "My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage.'" – Rococo Jan 20 '19 at 23:34
