3

Disclaimer: I'm not a physics professional, so pardon me if the question is stupid/incomprehensible/generally doesn't make sense. I've googled it, but didn't find an answer.

Getting to the point, I would love to know: does a bigger, more complex system experience higher entropy than a simpler, smaller system?

Thanks in advance to anyone who would enlighten me on the subject.

Qmechanic
  • 201,751
Egil
  • 131
  • The two notions of entropy and complexity are quite different. In general, complexity for a system implies having more constitutive parts or more complicated dynamics. Entropy can be considered as a measure of “disorder” (in its general sense) in the system. So a complex system can be less disordered than a simpler system; in other words, entropy of the complex system can be less than that of the simpler system. Beyond this, a more detailed understanding of these technical terms is necessary. – AlQuemist Dec 15 '15 at 12:19
  • For a brief introduction to “entropy”, see < http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop.html >. For “complexity”, see: Bennet, C. H. “How to define complexity in physics, and why” (1990) < https://goo.gl/xr2xxZ > – AlQuemist Dec 15 '15 at 12:26
  • @PhilosophiæNaturalis: I think that's an answer ;) – ACuriousMind Dec 15 '15 at 13:54
  • Not quite! It is merely a hurried comment. @ACuriousMind The question is very broad and actually, difficult to answer briefly. – AlQuemist Dec 15 '15 at 13:56
  • @PhilosophiæNaturalis Well, you answered my question nevertheless, concisely and to the point. – Egil Dec 15 '15 at 14:00
  • @PhilosophiæNaturalis - I lean more towards the statistical mechanics explanation from microstates to phase space to whatnot. To me, a very complex system may still have very few accessible states that it can occupy. So you are completely correct that entropy and complexity are distinct. – Jon Custer Dec 15 '15 at 15:03
  • @JonCuster: Indeed, this question can be answered at different levels of elaboration and abstraction. Here a simple general (not technical) answer is expected by the querent — as far as I understood. – AlQuemist Dec 15 '15 at 15:06
  • @PhilosophiæNaturalis - no doubt. Just food for thought when you get around to writing up that answer! – Jon Custer Dec 15 '15 at 15:13
  • Nothing more complex than the concept of Entropy itself! ;-) Wanna start a bar room brawl in a physics faculty? Mention the E-word! – Gert Dec 15 '15 at 15:41
  • @Gert Ah, that would be a fine, scientific way to troll a community of educated people :) – Egil Dec 15 '15 at 15:48
  • Heh. I've seen a few conversations that seem to go, "What is entropy?" and then 6 people say, "Why, that's simple. Entropy is X." Unfortunately, with 6 people, there are usually 7 or 8 completely different values of X. But, I assure you, it's simple. – elifino Dec 16 '15 at 01:31

2 Answers

1

Complexity behaves as the time derivative of entropy. In a closed system, entropy and complexity initially increase together; in other words, the greater the disorder, the more difficult it is to describe the system. But things change later on: toward the end, as entropy approaches its final maximum, where disorder is also maximal, complexity diminishes. See my publication: https://doi.org/10.1016/j.techfore.2021.121457 or the pre-print: https://osf.io/6nwf9/

Chris
  • 17,189
  • 2
    Welcome to [Physics.SE]! It is generally good practice here to disclose that you're the author of articles you link to; self-links that don't disclose your connection to the material may be deleted as spam. – Michael Seifert Jul 08 '22 at 14:28
  • Your answer could be improved with additional supporting information. Please [edit] to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers in the help center. – Community Jul 08 '22 at 14:29
  • While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. - From Review – Miyase Jul 08 '22 at 15:03
0

The answer to this question is: yes, there is indeed a connection between entropy and complexity. However, there is no single answer. There are several formulations of entropy, and entropy alone is not sufficient to describe complexity. To be more precise, we can first distinguish between the Shannon entropy (usually denoted by H, or h for differential entropy), which quantifies the unevenness of a probability distribution $p(x)$, and the (von Neumann) entropy S, which is defined as a function of the density matrix. The Shannon entropy of a discrete random variable X is defined as
\begin{equation} H(X)=-\sum_{x \in X} p(x)\log_{2} p(x), \end{equation}
where the base of the logarithm fixes the unit of entropy. A fully localized probability distribution ($p = 1$ for a single outcome) gives the minimum $H(X) = 0$, reached for a constant random variable, i.e., one whose outcome is certain. On the other hand, $H(X)$ is maximal for a uniform distribution. In other words, the more unexpected or improbable an event is, the greater its information content. Take, for example, the prediction "the sun will not rise tomorrow". Should this prediction come true, its information content would be extremely high: we expect the sun to rise every day, so the probability of it not rising is practically zero.

Information and complexity are related in the sense that a complex structure requires a more extensive description than a simple one (which is thus characterized by less information). To describe a circle we need only one parameter, the radius, while more parameters are needed to characterize more complex geometric structures. This suggests a relationship between the amount of information needed to describe a structure and its complexity.

On the other hand, consider an ideal, isolated gas consisting of N molecules: its distribution in space does not follow any particular order. The system can be in any of its possible states, all with the same probability. In this case the distribution of molecules is uniform and the entropy is maximal, which corresponds to the maximum amount of information required to describe a completely disordered system. Can we then say that maximum disorder corresponds to maximum complexity? The answer is no.

Lopez-Ruiz, Mancini, and Calbet introduced a measure of complexity, known as the LMC complexity, which defines complexity as the product of the entropy H and a "disequilibrium" D. The latter measures how far the distribution is from equilibrium, where equilibrium means that no state is more probable than any other. With disequilibrium included, the LMC complexity vanishes for both fully ordered and fully random systems: D vanishes in the fully random case, while H = 0 in the fully ordered one. As a result, the measure C = HD is expected to reach its maximum somewhere between order and randomness. This definition agrees with what Theodore Modis wrote. However, this definition is not unique. In the field of systems engineering, K. Sinha defines complexity in a completely different way (see Sinha's thesis), although his definition also uses Shannon entropy. Complexity can also be defined algorithmically, in terms of Kolmogorov complexity: although it is not computable, it corresponds to the length of the shortest description one can give of a system. In this brief explanation I have taken the liberty of sacrificing some scientific rigor in the hope of giving a more intuitive picture.
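To make the LMC construction concrete, here is a minimal Python sketch (not taken from any of the references above; the normalization of H by $\log_2 N$ and the Euclidean form of D are assumptions following the usual LMC convention) that computes H, D, and C = H·D for a fully ordered, an intermediate, and a uniform distribution:

```python
# Illustrative sketch (assumed conventions, not from the cited papers):
# Shannon entropy H, LMC "disequilibrium" D, and LMC complexity C = H_norm * D.
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(X) = -sum p(x) log p(x); terms with p = 0 contribute 0."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def disequilibrium(p):
    """D = sum (p_i - 1/N)^2, the squared distance from the uniform distribution."""
    n = len(p)
    return sum((pi - 1.0 / n) ** 2 for pi in p)

def lmc_complexity(p):
    """LMC complexity C = H_normalized * D; vanishes for fully ordered and fully random p."""
    n = len(p)
    h_norm = shannon_entropy(p) / math.log2(n)  # normalized to [0, 1]
    return h_norm * disequilibrium(p)

for label, p in [("fully ordered", [1.0, 0.0, 0.0, 0.0]),
                 ("intermediate",  [0.7, 0.1, 0.1, 0.1]),
                 ("fully random",  [0.25, 0.25, 0.25, 0.25])]:
    print(f"{label:13s}  H={shannon_entropy(p):.3f} bits  "
          f"D={disequilibrium(p):.3f}  C={lmc_complexity(p):.3f}")
```

Running it shows that both the fully ordered and the fully random distributions give C = 0, while only the intermediate case has nonzero complexity.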
The conclusion is that there is no single definition of complexity and the various formulations do not allow us to establish a unique relationship between complexity and entropy.

Upax
  • 186