7

In quantum mechanics, we talk about (1) vectors, (2) states, and (3) ensembles (e.g., a beam in a particle accelerator). Suppose we want to translate this into mathematical definitions. If I'd never heard of the von Neumann density matrix, I'd approach this problem as follows. Two vectors can represent the same state if they differ only by a phase, so we should define states as equivalence classes of vectors that differ by a phase. However, I would not see any reason to go a second step and define a further level of equivalence-classing, in which a hydrogen atom in its ground state is considered to be equivalent to a beam of hydrogen atoms in their ground states.

Von Neumann is obviously a lot smarter than I am, and his notion of a density matrix appears to be universally accepted as the right way to describe a state. We use the same density matrix to describe one hydrogen atom or a beam of them. Can anyone offer any insight into why there seems to be no useful notion of state that works the way I'd have thought, rather than the way von Neumann did it?

Does it matter whether we're talking about classical QM or QFT? Do we not want to distinguish states from ensembles because in QFT particles can be created and annihilated, so fixing the particle number is not really what we want to do in defining the notion of a state?

Related: https://mathoverflow.net/q/117125/21349

  • Marginal comment: Not every (normalized) vector represents a state — there are super-selection sectors. – Diego Mazón Aug 08 '13 at 21:19
  • related: http://physics.stackexchange.com/q/56545/ –  Aug 09 '13 at 00:03
  • A sunbeam consists of a large number of photons. Each one might be polarised, and yet the beam as a whole might not be polarised. This is the situation studied by Quantum Stat Mech and the density matrix. Incoherent mixtures. – joseph f. johnson Nov 29 '15 at 22:13

3 Answers

4

The need for something like the density matrix formalism should be kept conceptually distinct from the need to accommodate the creation and annihilation of particles. In fact, the Hilbert space of a quantum field theory is a Fock space: an infinite direct sum of Hilbert spaces, each of which corresponds to a fixed number of particles, so it allows for pure states representing any number of particles one wishes.

The fundamental need for the density matrix formalism (or something like it) is that pure states can only accommodate ensembles that are prepared in a restricted way. In quantum statistical mechanics, one can show via some simple examples (like an unpolarized beam of particles) that for systems whose state has been prepared in a certain way, there does not exist a pure state that can reproduce the statistics of all measurements on the system. One instead needs a mathematical object representing the state that allows for more general statistical mixtures of pure states. This is precisely what the density matrix does for you.
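The unpolarized-beam argument can be checked numerically. A minimal numpy sketch (taking the Pauli matrices as the polarization observables, and a single qubit as a stand-in for one beam particle): every pure state has a Bloch vector of length 1, i.e. it is fully polarized along *some* axis, so no pure state reproduces the statistics of an unpolarized beam, while the mixed state $I/2$ does.

```python
import numpy as np

# Pauli matrices: polarization observables along x, y, z.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def bloch_vector(rho):
    """Expectation values (<sx>, <sy>, <sz>) for a 2x2 density matrix."""
    return np.real([np.trace(rho @ s) for s in (sx, sy, sz)])

# Any pure state |psi> has a Bloch vector of length 1: it is fully
# polarized along some axis, so it cannot describe an unpolarized beam.
rng = np.random.default_rng(0)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)
rho_pure = np.outer(psi, psi.conj())
print(np.linalg.norm(bloch_vector(rho_pure)))  # length 1, up to rounding

# The maximally mixed state I/2 gives expectation 0 along every axis,
# matching the statistics of an unpolarized beam.
rho_mixed = np.eye(2) / 2
print(bloch_vector(rho_mixed))  # [0. 0. 0.]
```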

I feel like you may already know all of this, however, in which case it's not actually clear to me what the question is asking.

Edit, August 8, 2013.

There is a rather nice, careful discussion of all of this terminology in Quantum Mechanics, A Modern Development by Ballentine. Here is how some of these terms are defined in that text:

A state is identified with the specification of a probability distribution for each observable.

The density operator is then identified as an object which mathematically represents a state. This state can either be pure or mixed in the standard way.

A state preparation procedure is any repeatable process that yields well-defined probabilities for all observables.

An ensemble is the conceptual infinite set of similarly prepared systems.

Note. I think that the usage of the word "similarly" here, as opposed to "identically," is deliberate because we want to emphasize that state preparation procedures that aren't identical can still prepare a system in the same state. There is, for example, more than one way to prepare a harmonic oscillator so that it's in thermal equilibrium with density operator $\rho = e^{-\beta H}/\operatorname{tr}(e^{-\beta H})$. In this sense, I think the answer to the question

Does a density matrix correspond one-to-one with a method of preparing an object?

is no. However, as far as I can tell, the "ensemble" concept described here really doesn't add much to the concept of "state" except as perhaps a way of interpreting the probability distributions that are being identified with states.
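The many-to-one relationship between preparation procedures and density matrices can be seen in the standard two-qubit-free example: flipping a fair coin and preparing $|0\rangle$ or $|1\rangle$ yields exactly the same density matrix as flipping a fair coin and preparing $|+\rangle$ or $|-\rangle$. A small numpy sketch:

```python
import numpy as np

# Two different preparation procedures:
#   (a) flip a fair coin, prepare |0> or |1>
#   (b) flip a fair coin, prepare |+> or |->
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

def mixture(probs, states):
    """Density matrix of a statistical mixture: sum_i p_i |psi_i><psi_i|."""
    return sum(p * np.outer(s, s.conj()) for p, s in zip(probs, states))

rho_a = mixture([0.5, 0.5], [ket0, ket1])
rho_b = mixture([0.5, 0.5], [plus, minus])

# Both procedures give the same density matrix (I/2), so no measurement
# can distinguish them: the map from preparations to states is many-to-one.
print(np.allclose(rho_a, rho_b))  # True
```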

As far as I can tell, however, there isn't anywhere near complete uniformity in the usage of all of these terms as all of this is a pretty delicate business.

I also found the following SE post that is related and that you might find illuminating:

Is the density operator a mathematical convenience or a 'fundamental' aspect of quantum mechanics?

joshphysics
    Sorry if the question is unclear -- it probably reflects the lack of clarity in my understanding. Maybe a pure state is the most natural realization of my vaguely defined notion of "state?" So then talking about a "state" might require that we've fixed a basis...? Does a density matrix correspond one-to-one with a method of preparing an object? If so, then maybe my "ensemble" corresponds to "method of preparation"...? –  Aug 09 '13 at 00:01
  • @BenCrowell I added an edit that is hopefully illuminating to some extent and perhaps more relevant to the question. I would also highly recommend Ballentine's discussion of states etc. in beginning of chapter 2 to which I refer in the edit. – joshphysics Aug 09 '13 at 05:15
1

I was re-reading von Neumann's tome, in which he recapitulates his views on his own invention, the density matrix, as well as Dirac's treatment of it in the second edition of his Principles and the explanation in Landau--Lifschitz (second edition); Landau, note, had independently invented it too.

Von Neumann's rationale is that if our information about the state of a system is incomplete, then all we know are the probabilities of the results of measurements made on it. As usual, by "state" he means a specification of a method of preparation of the system. But this time, the method is not specified as completely as possible: it need not always produce the same pure state. Note well: von Neumann does not assert that a macroscopic system cannot be in a pure state. Never in his life did he assert this, as far as I know. The whole point of what he is doing is to suppose that our knowledge of its state is incomplete.


Next he states some physically reasonable axioms for the laws those probabilities ought to obey. (Technically, he prefers to speak of expectation values of observables instead of the probabilities of the results of a measurement of those observables, but these are equivalent by the usual tricks he had already developed about projection-valued measures.) Then he proves the mathematical theorem that there exists a matrix (or operator) $U$ such that the expectation of an observable $Q$ is given by $\operatorname{tr}(UQ)$. He proves uniqueness, too: different mappings from observables to expectations yield distinct $U$'s. He also characterises the $U$'s that arise from mappings in this way. Such a $U$ he calls a density matrix (or operator).
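One direction of von Neumann's theorem is easy to verify numerically: for any statistical mixture $\{(p_i, |\psi_i\rangle)\}$, the density matrix $U = \sum_i p_i |\psi_i\rangle\langle\psi_i|$ reproduces every expectation value via the trace formula. A numpy sketch (the dimension and number of states are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def random_state(n):
    """A random normalized vector in C^n."""
    v = rng.normal(size=n) + 1j * rng.normal(size=n)
    return v / np.linalg.norm(v)

n = 3
states = [random_state(n) for _ in range(4)]
p = rng.dirichlet(np.ones(4))          # probabilities summing to 1

# Density matrix U = sum_i p_i |psi_i><psi_i|
U = sum(pi * np.outer(s, s.conj()) for pi, s in zip(p, states))

# A random Hermitian observable Q
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
Q = (A + A.conj().T) / 2

# Expectation computed state by state vs. von Neumann's trace formula
direct = sum(pi * np.real(s.conj() @ Q @ s) for pi, s in zip(p, states))
via_trace = np.real(np.trace(U @ Q))
print(np.isclose(direct, via_trace))  # True
```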

Thus, the density matrix represents what we know about a system when our knowledge is incomplete.

He also motivates the density matrix by supposing that, for example, we knew the probabilities that the system was in one or another pure state. This knowledge he calls a mixture, and calls such a probability mixture "a mixed state". He shows that there exists a density matrix $U$ which yields the expectation values of all observables applied to that mixture. He is aware that different mixtures yield the same density matrix.

Landau--Lifschitz take a slightly different point of view. They consider a subsystem, which is not a closed system, of a large, macroscopic system. For example, an unpolarised light beam which has been produced by the sun. The joint system is quite macroscopic, but all our quantum measurements are on the subsystem of the light beam and ignore all the quantum numbers of the sun. L--L like to say that a macroscopic system cannot be in a pure state. They show that all expectation values of quantum measurements on the joint system which ignore the quantum numbers of the sun can be found by tracing out over the ignored variables, using von Neumann's formula for the appropriate $U$.
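The L--L tracing-out construction can be illustrated with a two-qubit toy model (a numpy sketch; a single qubit stands in for the ignored "sun" degrees of freedom): the joint system is in a pure entangled state, yet tracing out the environment leaves the subsystem in the maximally mixed state.

```python
import numpy as np

# A joint pure state of (beam, environment): (|00> + |11>)/sqrt(2).
# The joint system is in a pure state.
psi = np.zeros(4, dtype=complex)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho_joint = np.outer(psi, psi.conj())

# Tracing out the second (environment) subsystem gives the operator
# that reproduces all measurements made on the beam alone.
rho_beam = np.trace(rho_joint.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(rho_beam)  # I/2: maximally mixed, although the joint state is pure

# Purity tr(rho^2) = 0.5 < 1 confirms the reduced state is mixed.
print(np.real(np.trace(rho_beam @ rho_beam)))
```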

L--L also include the same motivation von Neumann included, using a statistical mixture of a finite number of pure states, but later explicitly warn against thinking that the density matrix represents a probabilistic mixture (synonym, statistical mixture) of pure states. They call their motivation "purely formal".

L--L include a profound physical discussion of what the quantum pure states of a macroscopic system would look like, how they would behave, and what their energy levels would look like. You must read both von Neumann and Landau. The former is logically precise, writes clearly, etc., but never has any physical intuition. The latter spews out profound physical insights unpredictably, but writes sloppily, unintelligibly, contradicts himself, etc.

When reading, pay careful attention to the difference between saying "the probability that the result of measuring an observable $Q$ will be $q_i$" or "the probability that the system, upon measuring $Q$, will be found to be in the state $|q_i\rangle$", which are both correct, precise, and accurate, and (the, IMHO, incorrect) "the probability that the system was in the state $|q_i\rangle$". But it is only the long debate on Quantum Measurement that has taught us this distinction.

1

Von Neumann did not intend the density matrix to be the description of a state, and it is still not universally accepted as such. It describes a mixed state, not a pure state. It is the functional equivalent of an ensemble in Classical Stat Mech, but it is better not to think of it as an ensemble. It does not correspond one-to-one with a method of preparing an object. It is still essentially statistical in nature, just Quantum Statistical, not classical ensemble-based statistical.