27

I recently attended a talk by Dr. Ravi Gomatam on 'quantum reality', in which the speaker suggested that conservation of energy is not a fundamental law but holds only conditionally, whereas conservation of information is fundamental. What exactly is the meaning of information? Can it be quantified? How is it related to energy?

4 Answers

16

If one measures lack of information by the entropy (as usual in information theory) and equates it with the entropy in thermodynamics, then the laws of thermodynamics say just the opposite: In a mechanically, chemically and thermally isolated system described in the hydrodynamic approximation, energy is conserved, whereas entropy can be created, i.e., information can get lost. (In an exact microscopic Hamiltonian description, the concept of information makes no sense, as it assumes a coarse-grained description in which not everything is known.)

The mainstream view in physics (aside from speculations about future physics not yet checkable by experiment) is that at the most fundamental level energy is conserved (being a consequence of the translation symmetry of the universe), while entropy is a concept of statistical mechanics that is applicable only to macroscopic bodies and constitutes an approximation, though at the human scale a very good one.
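To make the statistical notion above concrete, here is a minimal Python sketch (my illustration, not part of the original answer; the distributions are invented) of entropy as missing information, and of how a coarse-grained description averages over microscopic detail:

```python
import numpy as np

def shannon_entropy(p, base=2.0):
    """H(p) = -sum_i p_i log(p_i), with the convention 0 log 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

# Fine-grained description: 8 equally likely microstates -> 3 bits unknown.
micro = np.full(8, 1 / 8)

# Coarse-grained description: merge the microstates into 2 macrostate bins
# (6 microstates in the first bin, 2 in the second).
macro = np.array([6 / 8, 2 / 8])

print(shannon_entropy(micro))  # 3.000 bits
print(shannon_entropy(macro))  # ~0.811 bits
# The difference (~2.189 bits) is the microscopic detail that the
# coarse-grained description averages over.
```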

  • What you have said is just a statistical definition of information. Can physical information be defined in some other way (a quantum mechanical definition)? – Mar 08 '12 at 15:41
  • Information is a statistical concept, also in telecommunication engineering, say. It captures the scientific aspect of information, though not its subjective value for human beings. Maybe you can ask more specifically after having read http://en.wikipedia.org/wiki/Physical_information – Arnold Neumaier Mar 08 '12 at 16:00
  • Ramanujan, do you happen to know any online links to explain what Dr Gomatam means by the assertion that conservation of energy (and I'll assume this really means mass-energy) is not fundamental? That alone is a very unusual assertion, so it's not clear exactly what he intended. I looked at his home page, but nothing looked promising based on the titles. – Terry Bollinger Mar 08 '12 at 23:41
  • @ramanujan: it would be interesting to know the "conditions" under which the speaker says energy is conserved, or for that matter not conserved. I have recently asked a similar question on physics.SE, and got an answer that conservation of energy is fundamental, following from Noether's theorem, basically because of symmetry in space-time. – Vineet Menon Mar 09 '12 at 05:52
  • @Terry Hey, sorry if you have already seen this link. http://www.bvinst.edu/faculty/~gomatam.htm There is a link under lectures, 'Quantum reality - Why physicists don't understand it'; it seems he gave the same lecture at my institute – Mar 09 '12 at 12:11
  • All of this talk of conservation of energy makes sense only under the assumption that there is a finite energy that can be measured (in principle: that we can compute the total energy as a finite quantity). Or one could talk about local conservation of energy (not on the scale of the universe). – Apr 03 '13 at 06:50
  • @ArnoldNeumaier, you say that information is a statistical concept, but what does that mean given that all matter is energy and energy can only be described in terms of information? I see a distinction without a fundamental difference. – MetaStack Mar 21 '18 at 04:31
  • @LegitStack: Energy is never described in terms of information but in terms of a Hamiltonian. – Arnold Neumaier Mar 21 '18 at 13:38
  • "... and equates it with the entropy in thermodynamics then the laws of thermodynamics say just the opposite: In an isolated system, energy is conserved wheras entropy can be created, i.e., information can get lost." This is not correct. If a Hamiltonian system is isolated, its information entropy does not change, not even if the system changes macroscopic state into one with higher thermodynamic entropy.

    – Ján Lalinský Mar 21 '18 at 15:17
  • @ArnoldNeumaier my point is matter is a derivative of information; information isn't derived from matter. Thus by your statement "information is a statistical concept," you implied matter is more fundamental than the mere statistics we use to describe it. My point is that implication, that unspoken assumption isn't true, rather we use matter to discover what information underlies our observations. – MetaStack Mar 21 '18 at 15:57
  • @JánLalinský: Thanks; I made my statement more precise. – Arnold Neumaier Mar 21 '18 at 17:01
  • @LegitStack: ''matter is a derivative of information'' - maybe in popular science accounts of it but not in physics. – Arnold Neumaier Mar 21 '18 at 17:02
  • Still, if the system can exchange heat, in general its energy is not conserved. – Ján Lalinský Mar 21 '18 at 17:09
  • @JánLalinský: Indeed. I changed back to my previous formulation, which was correct. The point is that to talk sensibly about information one needs coarse-graining, i.e., a description in which some microscopic details are averaged over. Only then does the concept of information have any nontrivial formal meaning. – Arnold Neumaier Mar 23 '18 at 11:24
  • I do not see how coarse-graining, an artificial procedure of forgetting the details, is necessary for the concept of information entropy. The latter can be defined as a functional of probability distribution on phase space. – Ján Lalinský Mar 23 '18 at 13:07
  • @JánLalinský: Imposing a probability distribution means for a classical system (implied by your reference to phase space) having coarse-grained the system, as without coarse-graining the system is deterministic. – Arnold Neumaier Mar 23 '18 at 14:05
  • That is a strange use of the term "coarse-graining". Anyways, after this initial "coarse-graining" (introduction of probability distribution) is done, information entropy of isolated system does not change in time, not even if thermodynamic entropy does. The two are different things, so your claim that information gets lost as thermodynamic entropy increases is not very plausible. – Ján Lalinský Mar 23 '18 at 20:43
7

In contrast to @ArnoldNeumaier, I'd argue that the information content of the World could be constant: it almost certainly can't get smaller, and whether and how it gets bigger depends on the resolution of questions about the correct interpretation of what exactly happens when one makes a quantum measurement. I'll leave the latter (the resolution of quantum interpretation) aside, and instead discuss situations wherein information is indeed constant. See here for a definition of "information": the information in a thing is essentially the size in bits of the smallest document one can write that still uniquely defines that thing. For the special case of a statistically independent string of symbols, the Shannon information is the mean of the negative logarithms of the probabilities $p_j$ with which the symbols appear in an infinite string:

$H=-\sum_j p_j \log p_j$

If the base of the logarithm is 2, H is in bits. How this relates to the smallest defining document for the string is defined in Shannon's noiseless coding theorem.
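As an illustration (a sketch of mine with a made-up dyadic distribution, not from the original answer), the following Python snippet compares $H$ with the mean code-word length $L$ of an optimal (Huffman) prefix code; the noiseless coding theorem guarantees $H \le L < H + 1$:

```python
import heapq
from math import log2

def shannon_bits(probs):
    """H = -sum p log2 p: Shannon information per symbol, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Code-word lengths of an optimal (Huffman) binary prefix code."""
    # Heap entries: (probability, tie-break id, indices of merged symbols).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, i2, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1  # every merge adds one bit to these code words
        heapq.heappush(heap, (p1 + p2, i2, s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
H = shannon_bits(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"H = {H:.3f} bits/symbol, optimal mean code length L = {L:.3f}")
# For dyadic probabilities like these, L equals H exactly (1.75 bits).
```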

In the MaxEnt interpretation of the second law of thermodynamics, pioneered by E. T. Jaynes (also of Jaynes-Cummings-model fame, for the interaction of a two-level atom with one electromagnetic field mode), the wonted "observable" or "experimental" entropy $S_{exp}$ of a system (this is what the Boltzmann H formula yields) comprises what I would call the true Shannon information, or Kolmogorov complexity, $S_{Sha}$, plus the mutual information $M$ between the unknown states of distinguishable subsystems. In a gas, $M$ measures the predictability of the states of particles conditioned on knowledge of the states of other particles, i.e. it is a logarithmic measure of the statistical correlation between particles:

$S_{exp} = S_{Sha} + M$ (see this reference, as well as many other works by E. T. Jaynes on this subject)
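Here is a toy numerical check of this decomposition (my own sketch, not Jaynes's; the joint distribution is invented, $S_{Sha}$ is identified with the entropy of the joint distribution, and $S_{exp}$ with the entropy computed as though the subsystems were independent):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution of two correlated binary subsystems.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

S_sha = H(joint)                       # joint ("true") entropy
pA, pB = joint.sum(axis=1), joint.sum(axis=0)
S_exp = H(pA) + H(pB)                  # entropy ignoring correlations
M = S_exp - S_sha                      # mutual information, >= 0

print(f"S_exp = {S_exp:.3f}, S_Sha = {S_sha:.3f}, M = {M:.3f}")
# S_exp = S_Sha + M: treating the subsystems as independent overstates
# the missing information by exactly the correlation term M.
```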

$S_{Sha}$ is the minimum information in bits needed to describe a system, and is constant because the basic laws of physics are reversible: the World, therefore, has to "remember" how to undo any evolution of its state. $S_{Sha}$ cannot in general be measured and indeed, even given a full description of a system state, $S_{Sha}$ is not computable (i.e. one cannot compute the maximum reversible compression of that description). The Gibbs entropy formula calculates $S_{Sha}$ where the joint probability density function for the system state is known.

The experimental (Boltzmann) entropy stays constant in a reversible process and rises in a non-reversible one. Jaynes's "proof" of the second law assumes that a system begins with all its subsystems uncorrelated, and therefore $S_{Sha} = S_{exp}$. In this assumed state, the subsystems are all perfectly statistically independent. After an irreversible change (e.g. a gas is allowed to expand into a bigger container by opening a tap), the particles are subtly correlated, so that their mutual information $M > 0$. Therefore one can see that the observable entropy $S_{exp}$ must rise. This ends Jaynes's proof.

See also this answer for an excellent description of the entropy changes during an irreversible change. That question is also relevant to yours.

Energy is almost unrelated to information; however, there is a lower limit on the work one must do to "forget" information in a non-reversible algorithm: this is the Landauer limit, and it arises to uphold the second law of thermodynamics simply because any information must be encoded in a physical system's state: there is no other "ink" in the material world to write in. Therefore, if we wipe computer memory, the Kolmogorov complexity of the former memory state must get pushed into the state of the surrounding World.
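For a sense of scale, here is a back-of-envelope evaluation of the Landauer bound $k_B T \ln 2$ per bit (my own sketch; room temperature and the 1 GiB figure are just assumed examples):

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in SI since 2019)
T = 300.0           # assumed temperature in kelvin (about room temperature)

# Landauer limit: minimum heat dissipated when one bit is irreversibly erased.
E_bit = k_B * T * log(2)
print(f"Erasing 1 bit at {T:.0f} K dissipates at least {E_bit:.3e} J")

# Example: irreversibly wiping 1 GiB of memory.
bits = 8 * 2**30
print(f"Wiping 1 GiB dissipates at least {bits * E_bit:.3e} J")
```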

Afterword: I should declare bias by saying that I subscribe to many of the ideas of the MaxEnt interpretation, but disagree with Jaynes's "proof" of the second law. There is no problem with Jaynes's logic, but (in my opinion) after an irreversible change, the system's substates are correlated, and one has to describe how they become uncorrelated again before one can apply Jaynes's argument again. So, sadly, I don't think we have a proof of the second law here.

  • I think you've misunderstood Jaynes' proof. Jaynes doesn't say the system's components become uncorrelated, he says that (some of) the correlations become irrelevant for making predictions about the system's future behaviour, so you can safely forget about them. Thus we pretend that the system's components have become uncorrelated, even though we know they haven't really, because this allows us to do calculations that would be completely intractable otherwise. – N. Virgo Jul 16 '13 at 06:44
  • He makes the argument much more clearly in section 4 of this paper. – N. Virgo Jul 16 '13 at 06:47
  • @Nathaniel I'm trying not to get too far off the track here. I am quite familiar with the argument you cite above, but I still think it's begging the question (by bringing in further assumptions about what is and what is not "relevant" mutual information). I am not convinced there is an argument that fully resolves the Loschmidt paradox aside from saying that the second law is about boundary conditions of the universe. I understand that not everyone agrees on this point. Moreover, please understand that I would not call ANY of Jaynes's assumptions unreasonable. – Selene Routley Jul 16 '13 at 07:01
  • @Nathaniel indeed he is truly astounding both as a philosopher and in his intuition for the physical world. But I do believe that his assumptions are "moot" in the sense that their truth is ultimately an experimental fact. As you say, one has to find a way forward through intractable calculations. Please stay tuned for a question from me on these matters as I have been thinking about this stuff for a long time. – Selene Routley Jul 16 '13 at 07:03
  • Jaynes' argument for the second law is very explicitly based on an empirical fact. That fact is that we, as experimenters and engineers, are able to directly influence the initial conditions of an experiment, but we can't affect the final conditions except indirectly, by changing the initial conditions. Jaynes' argument says that given this asymmetry, the second law follows. However, this empirical fact is itself in need of an explanation, which Jaynes' argument can't help us with, and that's probably where you have to start thinking about the boundary conditions of the universe. – N. Virgo Jul 16 '13 at 16:55
  • @Nathaniel I'm glad we seem to agree then. One sometimes reads people citing Jaynes's work as a "mathematical proof" of the second law, independent of experiment, that explains the arrow of time. I don't believe that Jaynes himself ever claimed this status for his work - indeed he seems often to be "going the other way", i.e. beginning with physical reality and experiment and calling on these to shed light on the philosophical foundations of chance and randomness. This is why I baulk at calling his work a "proof" of the second law. – Selene Routley Jul 17 '13 at 04:10
  • I'm pretty sure there's a paper where Jaynes strongly argues against the view that Loschmidt's paradox can be proven without bringing in empirical evidence, but I can't seem to find it at the moment. (I've read most of his papers, so it can be hard to remember which one is which.) If I come across it I'll let you know. – N. Virgo Jul 18 '13 at 15:05
  • @Nathaniel I'd really appreciate this. I saw a statement somewhere on Wikipedia with words to the effect that "most physicists" would agree that Jaynes's "proof" is nothing more than a mathematical trick. I wanted to chime in at this point but I had nothing more than my own impressions of his work to argue with. I first stumbled on Jaynes's work whilst browsing idly outside what I was meant to be reading at the time - his unrelated work on the quantum optics of a two level atom coupled with a lone electromagnetic mode (Jaynes-Cumming model) and was utterly blown away by .... – Selene Routley Jul 18 '13 at 23:57
  • @Nathaniel ....his foundations of thermodynamics and probability works - wow, this guy can think and write clearly! So I'd like to have some primary source to argue against the view I've seen that he ever claimed a proof of the second law, so that I am at least trying to behave as a real historian, and can rather put forward the view that his works should be taken as an awesome review of the foundations of statistical thermodynamics that really strengthens our understanding. Now I can't find the reference on Wikipedia I wanted to challenge :( – Selene Routley Jul 19 '13 at 00:05
3

Energy is the relationship between information regimes. That is, energy is manifested, at any level, between structures, processes and systems of information in all of its forms, and all entities in this universe are composed of information. To understand information and energy, consider a hypothetical universe consisting only of nothingness. In this universe, imagine the presence of the smallest, most fundamental possible instance of deformation, which constitutes a particle in this otherwise pristine firmament of nothingness. Imagine there is only one instance of this most fundamental particle, and let us dub it a Planck-Particle (PP). What caused this PP to exist is not known, but the existence of the PP constitutes the existence of one Planck-Bit (PB) of information. Resist the temptation to declare that energy is what caused our lone PP to exist. In this analogy, as in our reality, the 'big' bang that produced our single PP is not unlike the big bang that caused our known universe, in that neither can be described in terms of any energy relationship or regime known to the laws of physics in this universe.

This PB represents the most fundamental manifestation of information possible in this universe. Hence, the only energy that exists in this conceptual universe will be described by the relationship (there's that word again) between the lone PP and the rest of the firmament of nothingness that constitutes its universe. Call this energy a Planck-quantum (PQ). Note that this PQ of energy exists only by virtue of the existence of the PP alone in relation to the surrounding nothingness. With only one PP there are few descriptions of energy that can be given. There is no kinetic energy, no potential energy, no gravity, no atomic or nuclear energy, no entropy, no thermodynamics, etc. However, there will be some very fundamental relationships, pertaining to the degrees of freedom defined by our PP compared to its bleak environment, that may be describable as energy.

Should we now introduce a second PP into our sparse universe, we may define further relationships and energy regimes within our conceptual growing universe, and formulate Nobel-worthy theories and equations which describe these relationships. Kinetic energy manifests as soon as the relationship of distance between our lonely PPs comes into existence. Likewise, energy as we know it describes the relationships manifested between information regimes, which are describable by the language of mathematics.

– debyton
-4

Electron and Information.

Information is transferred through EM waves. There is no EM wave without an electron. (H. Lorentz)

Information is the new atom or electron, the fundamental building block of the universe ... We now see the world as entirely made of information: it's bits all the way down. (Bryan Appleyard)

It is important to realize that in physics today, we have no knowledge of what energy is. We do not have a picture that energy comes in little blobs of a definite amount. It is not that way. (Richard Feynman)

An electron is a quantum of information. An electron is a keeper of information. Why? An electron has six (6) formulas: $$E=hf\qquad \text{and}\qquad e^2=\alpha\hbar c,$$ $$E=Mc^2\qquad \text{and}\qquad -E=Mc^2,$$ $$E=-\frac{me^4}{2\hbar^2}=-13.6\ \mathrm{eV}\qquad \text{and}\qquad E=\infty,$$ and obeys five (5) laws:

  • a) The law of conservation and transformation of energy/mass
  • b) The Heisenberg uncertainty principle
  • c) The Pauli exclusion principle
  • d) Fermi-Dirac statistics
  • e) The Maxwell/Lorentz EM laws

This means that in different interactions an electron must know six different formulas and must observe five laws. To behave in such different conditions, a single electron must itself be a keeper of information.

The laws of physics dictate that information, like energy, cannot be destroyed, which means it must go somewhere. (Michael Brooks, Book ‘ The big questions’. Page 195-196.)

This means that an electron (as a little blob of a definite amount of energy) never loses its information, even in different situations.

AccidentalFourierTransform
  • Unfortunately, this is mostly incoherent. By the time you catch yourself signing your answers with your full name in bold, it's time to step back and reevaluate things. – Nat May 13 '18 at 13:20