2

Is there a relation between Information and Entropy (as defined by Shannon) and Entropy in Physics, other than that they have similar formulas?

For example: Is information physical? Is it conserved for a closed system?

When Entropy increases in a closed system, is information lost (in some sense)? Can we answer as to where that information went (if it was ever there)?

Qmechanic
  • 201,751
  • Check out https://www.scottaaronson.com/blog/?p=3327. This is an incredible breakdown of the argument for, and definition of, "information is physical". There are loopholes, but they are extreme. Look at comments 15/27 too for a very simple argument. – J Kusin Sep 14 '20 at 03:43

3 Answers

1

Shannon entropy is a generalization of thermodynamic entropy: the Gibbs entropy $S = -k_B \sum_i p_i \ln p_i$ is just the Shannon entropy of the probability distribution over microstates, rescaled by Boltzmann's constant.
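A minimal numerical sketch of that correspondence (Python; the two-state distribution is an arbitrary illustrative choice, not anything from the answer above): for the same probabilities, the Gibbs entropy equals the Shannon entropy in bits times $k_B \ln 2$.

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical probability distribution over microstates
p = np.array([0.3, 0.7])

shannon_bits = -np.sum(p * np.log2(p))   # Shannon entropy, in bits
gibbs = -k_B * np.sum(p * np.log(p))     # Gibbs entropy, in J/K

print(f"Shannon entropy: {shannon_bits:.4f} bits")
print(f"Gibbs entropy:   {gibbs:.3e} J/K")
# The two differ only by the constant factor k_B * ln(2) per bit:
print(f"Gibbs / (k_B ln2 * Shannon) = {gibbs / (k_B * np.log(2) * shannon_bits):.6f}")
```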

Yes, information is physical.

If you have all the information about a system, you can use it to extract energy and do work by manipulating the system's particles out of thermal equilibrium. But acquiring that information requires some kind of computation, which costs energy and increases entropy, so the second law cannot be avoided.

This is essentially Maxwell's demon: take a system in thermal equilibrium and separate it into two parts with a door. If you have all the information about the system, you can open the door whenever a fast particle approaches and let it into one part, and let the slow particles into the other part. One part becomes hotter, and if you then get rid of the door you can extract energy from the resulting heat flow. But to get that information you need a lot of computation, and for that a lot of energy.
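A toy numerical sketch of the demon's sorting step (Python; the units, the speed distribution, and the median cutoff are all illustrative assumptions, not part of the answer above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy gas: speeds from a Maxwell-Boltzmann distribution (arbitrary units, m = 1)
n = 100_000
v = rng.normal(size=(n, 3))          # velocity components
speeds = np.linalg.norm(v, axis=1)

# The "demon" opens the door only for fast particles headed one way,
# so fast particles collect on the left and slow ones on the right.
threshold = np.median(speeds)        # illustrative choice of cutoff
left, right = speeds[speeds > threshold], speeds[speeds <= threshold]

# "Temperature" ~ mean kinetic energy per particle
print(f"left  side: {np.mean(left**2) / 2:.3f}")
print(f"right side: {np.mean(right**2) / 2:.3f}")

# The demon recorded ~1 bit (fast/slow) per particle. Erasing those n bits
# later dissipates at least n * k_B * T * ln(2) of energy, which is what
# rescues the second law (Landauer's principle, discussed in a later answer).
```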

Kugutsu-o
  • 856
1

Yes, information is physical.

Yes, information is conserved. But you have to remember that all information is conserved, not just the tiny part that is known to you. Microscopic information tends to be very sensitive and fragile.

Yes, entropy relates to information, though not directly in physics, because physical information is quite difficult to extract at the microscopic level. We cannot observe single atoms directly yet, because they are smaller than X-ray resolution; we can only observe disturbances in quantum-mechanical tunneling currents, which is an indirect approach.

Also, the nature of molecular physics lies in chaotic motion, which makes observing single atoms even more difficult. Add the mathematical chaos that is always present, and you will see that retrieving this information remains an impossible task even today. On top of that, quantum-mechanical chaos, also ever-present, breaks the link between you as the observer and the system you observe.

sanaris
  • 875
  • how does entropy relate to information? Is entropy proportional to information? If so, where does excess information go, given that entropy increases for a closed system for irreversible processes? – PhyEnthusiast Feb 02 '20 at 19:09
  • 2
    Without an explicit definition of information any statement about it is void and unverifiable. – GiorgioP-DoomsdayClockIsAt-90 Feb 02 '20 at 19:23
  • @GiorgioP If physics had a single, solvable definition, there would be no physics left after it got solved. – sanaris Feb 02 '20 at 19:24
  • 1
    @sanaris Still, it is difficult to do physics without relating things to at least a partial definition. If one takes Shannon's definition of information and information entropy, the exact meaning of statements like "information is physical" or "information is conserved" is not obvious. I am not saying that they are false, simply that they are not trivial consequences of Shannon's definition. If I am wrong, please give me a reference. – GiorgioP-DoomsdayClockIsAt-90 Feb 02 '20 at 19:33
  • We could store information as a single record of the exact coordinates of every atom of some body, written in the memory of some virtual logical machine that consumes no energy (which is also impossible, because like all machines it must consume energy to retrieve and change information, per Carnot's theorem). – sanaris Feb 02 '20 at 20:22
  • 1
    @sanaris Yes I know. But where is the connection with Shannon's definition which is a definition based on probabilities, not on individual configurations? – GiorgioP-DoomsdayClockIsAt-90 Feb 02 '20 at 21:03
1

Is there a relation between Information and Entropy (as defined by Shannon) and Entropy in Physics?

Yes, as far as I know, the relation is given by Landauer's principle, which states that in order to erase information it is necessary to dissipate energy: erasing a bit of information dissipates at least $kT \ln 2$ of energy, i.e. it creates at least $k \ln 2$ of entropy. This asserts the equivalence of logical irreversibility and thermodynamic (physical) irreversibility.

(taken from: The physics of forgetting: Thermodynamics of Information at IBM 1959–1982 by Aaron Sidney Wright https://www.mitpressjournals.org/doi/pdfplus/10.1162/POSC_a_00194)
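For scale, a quick back-of-the-envelope evaluation of the Landauer bound (Python; the room temperature of 300 K is an assumed value):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

E_bit = k_B * T * math.log(2)   # minimum heat dissipated to erase one bit
S_bit = k_B * math.log(2)       # corresponding minimum entropy increase

print(f"Energy per erased bit at {T:.0f} K: {E_bit:.3e} J")   # ~2.87e-21 J
print(f"Entropy per erased bit:            {S_bit:.3e} J/K")  # ~9.57e-24 J/K
```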

Is information physical?

Yes, information is physical. As far as I am aware, Swanson asserted that information can be defined per unit volume rather than per symbol, as in Shannon's information theory (sic).

(taken from: The physics of forgetting: Thermodynamics of Information at IBM 1959–1982 by Aaron Sidney Wright https://www.mitpressjournals.org/doi/pdfplus/10.1162/POSC_a_00194)

Is it (information) conserved for a closed system?

Yes. See this previous post (Where does deleted information go?) for more details.

When Entropy increases in a closed system, is information lost (in some sense)?

Yes; by Landauer's principle, erasing a bit of information creates at least $k \ln 2$ of entropy, dissipating at least $kT \ln 2$ of energy.

Can we answer as to where that information went (if it was ever there)?

Yes, we can. You can find the answer in this post: Where does deleted information go?