
Well, zero of course. Because

$S = -\text{tr}(\rho \ln \rho)$ and $\rho$ for a pure state gives zero entropy.

But... all quantum states are really pure states right? A mixed state just describes our ignorance about a particular system. So how can properties like entropy and temperature come out of ignorance of information? That doesn't make sense to me.
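(For concreteness, the zero-entropy claim is easy to check numerically. Here is a minimal sketch using NumPy; the function name and the eigenvalue cutoff are my own choices, and the entropy is computed from the eigenvalues of $\rho$ since $\lim_{p\to 0} p \ln p = 0$.)

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -tr(rho ln rho), evaluated via the eigenvalues of rho.

    Zero eigenvalues are dropped, since lim_{p->0} p ln p = 0.
    """
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]  # cutoff for numerically-zero eigenvalues
    return float(-np.sum(p * np.log(p)))

# Pure state |+> = (|0> + |1>)/sqrt(2): rho is a rank-1 projector
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(plus, plus)
print(von_neumann_entropy(rho_pure))   # 0 (up to rounding)

# Maximally mixed qubit: rho = I/2
rho_mixed = np.eye(2) / 2
print(von_neumann_entropy(rho_mixed))  # ln 2 ≈ 0.693
```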

Qmechanic
Nick
  • Where did you think entropy came from, classically? – By Symmetry Nov 09 '14 at 22:05
  • Entropy is not an easy concept! But the more you read and think about it, the more sense it makes. You could write an entire book discussing its meaning(s). –  Nov 09 '14 at 22:09
  • Classically? Basically the heat-engine way it was originally defined. – Nick Nov 09 '14 at 22:38
  • Your question stands at the border between quantum and classical mechanics, and nobody knows exactly where that border lies. E.g., for an object consisting of many particles, should we describe it by quantum or by classical mechanics? A classical object is in such rapid exchange of energy and particles with its environment that we can't even say, at any given moment, how many particles it has or what energy it possesses.

    So, is your mixed state a quantum superposition of states of a many-particle object whose phases we don't know, or are we dealing with a classical many-body object?

    – Sofia Nov 09 '14 at 23:22
  • @Nick You should read Jaynes: http://scholar.google.com/scholar?q=jaynes+information+theory&btnG=&hl=en&as_sdt=0%2C5 – joshphysics Nov 10 '14 at 00:30
  • Isn't a pure state equivalent to a "microstate" in statistical mechanics? I think one can only talk about the entropy of the macrostate of which a particular microstate is a part (and most macrostates consist of large sets of microstates, although in principle there can be macrostates consisting of a single microstate). Would "macrostates" in quantum statistical mechanics normally be mixed states, consisting of different statistical ensembles of pure states? I'm not sure, but if so, the same idea could apply: you could only talk about the entropy of the mixed state the pure state is part of. – Hypnosifl Nov 10 '14 at 03:07
  • Quantum open systems aren't in pure states. It has nothing to do with ignorance, but with quantum correlations with the surroundings. – juanrga May 17 '19 at 17:01

5 Answers


I think it is a mistake, in this case, to think of entropy as "a description of our ignorance." Rather, I would suggest that you think of entropy as a well-defined, objective property provided that you specify which degrees of freedom in the universe are inside and outside of your system. The content of this statement isn't really different, but it emphasizes that entropy is an objective property and not observer-dependent.

If your included list is "everything" (or at least everything that has ever interacted together in the history of your system), then what you said is true: if you started out with a pure state it will always remain so, and there isn't much thermodynamics to speak of.

The basic question of thermodynamics (and, more broadly, statistical mechanics) is what happens in any other case - most typically, the case in which the degrees of freedom you specify are continuously coupled to external degrees of freedom, so that your system is open. Somewhat amazingly, there is a general answer to this question for many such arrangements.

More concretely, in classical thermodynamics one of the important things about entropy and temperature is that they tell you how much work you can extract from a system. So one way to reframe your question is: "How can properties like the maximum extractable work come out of ignorance of information?" But it is easy to think of situations where this is the case. As a toy model, imagine a sailor trying to navigate a sailboat through a storm, with the wind changing wildly and rapidly. If he somehow knows beforehand exactly when and how the wind will shift, he will have a much easier time moving in the direction he wants.

Ultimately, a similar game is being played on a microscopic level when one speaks, for example, of the maximum efficiency possible in a heat engine. The explicit connection is made by Landauer's principle, which is the direct link you're looking for between the included degrees of freedom (or, if you insist, "knowledge") and work. This principle was inspired by the famous thought experiment of Maxwell's demon, which is a microscopic equivalent of my weather-predicting sailor.
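For a sense of the scale this connection sets, one can evaluate the Landauer bound $k_B T \ln 2$ - the minimum heat dissipated per bit erased - at room temperature (a back-of-envelope sketch; the temperature value is just an illustrative choice):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # an illustrative "room temperature", K

# Landauer bound: minimum heat dissipated when one bit is erased
E_min = k_B * T * math.log(2)
print(E_min)  # ≈ 2.87e-21 J
```

This is roughly twenty orders of magnitude below everyday energy scales, which is why the "knowledge" contribution to work is invisible macroscopically but decisive for a Maxwell's demon.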


Rococo
    "Rather, I would suggest that you think of entropy as a well-defined, objective property provided that you specify which degrees of freedom in the universe are inside and outside of your system." Bingo. – DanielSank Nov 12 '14 at 05:45
    @Nick, regarding your frying pan question (sorry, I don't have enough reputation to comment under that question), you may be interested in a recent proposal to modify the standard definition of entropy to apply to closed systems. However, see also this rebuttal, which argues the conventional view: that pure states have an entropy of 0 and a temperature that is ill-defined. – Rococo Nov 12 '14 at 06:14
    I like the idea behind your first sentence, but it does not really follow that entropy is not observer-dependent (it is observer-dependent!). Rather, you are defining a "standard observer" who has access to a given set of observables (or technically, a thermodynamic state space). Then, for all standard observers, the entropy is uniquely defined. However, you can still imagine observers with access to more information (e.g. Maxwell demons) for whom the observed entropy is different. – Mark Mitchison Nov 21 '14 at 12:14
  • Hi @MarkMitchison- I'm not sure we disagree in any deep way, but there were two things I wanted to emphasize: – Rococo Dec 01 '14 at 22:13
  • The definition of entropy does not in any way require the notion of an observer, but it does require a specification of the subsystem considered, in order to obtain the density matrix.

  • An observer may measure a different entropy depending on which aspects of the system he/she considers. Concretely, for a system of two entangled particles, the entropy of either particle considered on its own differs from the entropy of the full entangled state.

  • – Rococo Dec 01 '14 at 22:25
  • I agree that the notion of a "standard observer" who can only make measurements of some given complexity is an important one, and a crucial part of the connection between use of entropy in thermodynamic and informational contexts. But this is an additional structure beyond what I am discussing in my answer, and is not the sense in which I am calling entropy "objective." – Rococo Dec 01 '14 at 22:29
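The entangled-pair point raised in these comments can be checked directly: a globally pure Bell state has zero entropy, while either reduced single-particle state is maximally mixed. A minimal NumPy sketch (the reshape-based partial trace used here is a standard construction; the helper names are my own):

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy -tr(rho ln rho) via eigenvalues."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]  # drop numerically-zero eigenvalues
    return float(-np.sum(p * np.log(p)))

# Bell state |psi> = (|00> + |11>)/sqrt(2) on two qubits
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho = np.outer(psi, psi)

# Reduced state of particle A: trace out particle B.
# Index rho as (iA, iB, jA, jB) and sum over iB == jB.
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(entropy(rho))    # 0: the global state is pure
print(entropy(rho_A))  # ln 2: the subsystem is maximally mixed
```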