(With apologies that my physics background is minimal...)

Why is the term "information" used to describe a physical system, such as the movement of air through a jet engine?

If I understand correctly, this is in part because Shannon's information theory can be used to analyze things like entropy in a system, but I wonder why specifically the term "information" is used. To a layperson, it suggests data and communication, and seems quite metaphorical rather than descriptive.

EDIT: this post offers some answers related to my question, but they are either far too short (i.e., just one sentence) or math-heavy. John Forkosh's excellent answer below gets at more of the "why would we call it information" that I'm curious about.

  • The Wiki article you've linked defines "physical information" as a form of a more general definition of 'information'. I believe what you're asking about is a question of semantics. – GodotMisogi Oct 28 '18 at 11:01
  • @GodotMisogi – yes, I'm looking for someone to explain why the term "information" is the right term to describe the components of a physical system, when we think of information as abstract and non-physical. Why does mass, velocity, etc boil down to "information"... – JeffThompson Oct 28 '18 at 11:08
  • The wiki article on 'information' defines it as: "Information is any entity or form that provides the answer to a question of some kind or resolves uncertainty." Physics is an 'entity' or 'form' that attempts to 'provide answers' to 'the question' of how the universe works by defining quantities such as mass, velocity, etc. to characterise the attributes of a physical system. – GodotMisogi Oct 28 '18 at 11:14
  • "Why is the term "information" used to describe a physical system, such as the movement of air through a jet engine?" It isn't, and the Wiki link you gave doesn't say that it is. – alephzero Oct 28 '18 at 12:44
  • @alephzero – it's not in that article, but I had a long conversation with an engineering professor who designs jet engines for Rolls Royce and he said that's the term they use (and where this question originated) – JeffThompson Oct 29 '18 at 09:42
  • If the answers in the linked question aren't exactly what you're expecting, it may be better to set up a bounty there and mention in the description that you'd like a balance of significant detail but light math. –  Oct 29 '18 at 11:42

1 Answer

Shannon information is indeed closely related to thermodynamic entropy by some elaborate mathematical gobbledy-gook we needn't get into. Moreover, I think that's what's responsible for your confusion, because your comment clarification "Why does mass, velocity, etc boil down to 'information'" is barking up a somewhat different tree vis-a-vis the meaning of information, as follows.

Every physical system is mathematically characterized by what's very generally called a state, which evolves in time. When you ask "Why is the term 'information' used to describe a physical system", it's the state of the system that's being described. And that description typically consists of a collection of numbers specifying position, velocity, etc, just as your comment says. Information simply refers to all those numbers. Certainly, if you were driving along in your car, and I referred to your location and speed as "information", you wouldn't argue with that common-sense usage of the word. And in physics, there's nothing much more esoteric or profound about it.
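
To make that concrete, here's a minimal sketch of your car's state as nothing more than a few labelled numbers (the field names and values are my own invention, purely for illustration):

```
# Hypothetical sketch: a car's "state" is just a bundle of numbers.
car_state = {
    "position_m": 1250.0,   # where you are along the road, in metres
    "velocity_mps": 27.0,   # how fast you're going, roughly 60 mph
}
# Calling this dictionary "information" about the car is exactly
# the everyday, common-sense usage of the word.
```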

For your jet engine example, the system is the gases, and the information comprising its state is typically temperature, pressure, volume, and a few other quantities. And in this case, it's the Navier-Stokes equations https://en.wikipedia.org/wiki/Navier%E2%80%93Stokes_equations that describe the behavior of such jet engine gases. That is, the state always contains enough information that the system's subsequent behavior can be calculated using those equations. I notice you've got lots of stackoverflow posts, so maybe just think of it as "programmed" rather than "calculated". Then the state is all the input necessary so that a program can be written to model the flow of those gases. And we're just calling all that necessary input data "information".
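
Here's a toy Python sketch of that "programmed" view. The class and function names are my own invention, and the solver body is just a placeholder, not real Navier-Stokes code; the point is only that the state is the complete input the program needs:

```
from dataclasses import dataclass

@dataclass
class GasState:
    # The "information": everything the program needs to know.
    temperature_K: float
    pressure_Pa: float
    volume_m3: float

def step(state: GasState, dt: float) -> GasState:
    # A real solver would integrate the Navier-Stokes equations here;
    # this placeholder just returns the state unchanged.
    return state

state = GasState(temperature_K=900.0, pressure_Pa=3.0e6, volume_m3=0.5)
state = step(state, dt=1e-3)   # evolve the state forward in time
```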

So again, there's nothing esoteric or profound about it here. However, when you go beyond your comment "Why does mass, velocity, etc boil down to 'information'", and maybe start talking about thermodynamics (and statistical mechanics), or maybe about black holes and other stuff, then "information" typically takes on a more elaborate mathematical meaning beyond its everyday usage. And that's the tree we don't want to be barking up here.

Edit
--------

Re Jeff's "I'll dig into your other suggestions" comment, let me try to suggest, non-mathematically (and very briefly), the gist of the overall information$\sim$entropy idea, especially since that wikipedia article gets pretty mathematical almost immediately. I'll instead use algorithmic complexity https://en.wikipedia.org/wiki/Kolmogorov_complexity which can (in my opinion) be discussed less mathematically, and which is related to Shannon entropy, e.g., https://www.quora.com/What-is-the-relationship-between-Kolmogorov-complexity-and-Shannon-entropy

First, entropy measures "disorder", i.e., a low-entropy system is very ordered, whereas a high-entropy system is very disordered. Consider a wall of neatly-arranged bricks (low entropy) versus a jumbled pile of bricks (high entropy). The state (as per the above discussion) of the bricks would be a complete description of each brick's position. For the wall, a very short description suffices, since we can write one little formula that works for all the bricks. For the jumbled pile, however, there's no such brick-next-to-brick relationship, and we have to laboriously write out each individual brick's position.
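
As a rough illustration (the brick dimensions and counts are made up), one short rule generates every position in the wall, while the pile's positions have to be listed one by one:

```
import random

BRICK_W, BRICK_H = 0.20, 0.07   # assumed brick size, in metres

# Low entropy: one little formula describes every brick in the wall.
wall = [(col * BRICK_W, row * BRICK_H)
        for row in range(10) for col in range(20)]

# High entropy: no brick-next-to-brick relationship, so each of the
# 200 positions is an independent number we must write out explicitly.
random.seed(1)
pile = [(random.uniform(0.0, 4.0), random.uniform(0.0, 1.0))
        for _ in range(200)]
```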

So the state of the low-entropy ordered wall can be described much more concisely/compactly than the state of the high-entropy jumbled pile.

And that's related to algorithmic complexity as follows. Imagine a string of random characters (all from the lowercase a...z alphabet) "kduwygxostlqr..." and an equal-length easily-recognizable ordered string "abcdeabcdeabcde...". So which one has more "information"? Answer: use the zip program to compress both strings, and then the length of the resulting zip file measures the original string's information content. Clearly, the ordered string is more compressible, and hence contains less information.
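
You can try that experiment directly. Here's a sketch using Python's zlib in place of the zip program (the string lengths are arbitrary):

```
import random
import string
import zlib

# An ordered string and an equal-length random string.
ordered = ("abcde" * 200).encode()
random.seed(0)
rand = "".join(random.choices(string.ascii_lowercase, k=1000)).encode()

# The compressed length approximates each string's information content.
print(len(zlib.compress(ordered)))   # short: the pattern compresses well
print(len(zlib.compress(rand)))      # much longer: little redundancy to exploit
```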

So now the entropy$\sim$information relation should be obvious: the state that can be described more concisely is the one with lower entropy. And in both cases, it's the random brick pile and the random string that are high-entropy and high-information. Moreover, we can get lots more quantitative than "low" and "high". But that's where all the math illustrated in that wikipedia page comes into play.

  • Thanks, this is a much more intuitive definition than other posts I've found. I would be interested to hear a bit more about the more complex meaning arising from black holes, etc if you don't mind! – JeffThompson Oct 28 '18 at 13:35
  • @JeffThompson Your original remark "my physics background is minimal" (from which I inferred ditto with respect to math background) is why I explicitly avoided that. Otherwise, I'd just have written a comment referring you to stuff like https://en.wikipedia.org/wiki/Entropy_(information_theory) and https://en.wikipedia.org/wiki/Black_hole_information_paradox Or just google entropy information and black hole information for lots more hits. Only you'll be able to tell which one(s) best suit your background level and interests. –  Oct 29 '18 at 04:52
  • Got it – I've read a bit of Stephen Hawking's writing about black holes and information. I'll dig into your other suggestions more. – JeffThompson Oct 29 '18 at 08:49
  • @JeffThompson I hadn't guessed you'd be sufficiently interested to try reading that stuff, and added an edit above to try to introduce the Shannon-entropy$\sim$information idea less mathematically than that wikipedia page. (And I'm sure there must be some better-thought-out, more-intuitive/less-mathematical, discussions which google can cough up from some .edu sites.) –  Oct 30 '18 at 05:31