5

I’m reading Isham’s Lectures on Quantum Theory, and in Chapter 5, General Formalism of Quantum Theory, Section 5.1.2, he states the following rule.

Rule 1. The predictions of results of measurements made on an otherwise isolated system are probabilistic in nature. In situations where maximum amount of information is available, [emphasis mine] this probabilistic information is represented mathematically by a vector in a complex Hilbert space $\mathcal{H}$ that forms the state space of the quantum theory. In so far as it gives the most precise predictions that are possible, this vector is to be thought of as the mathematical representative of the physical notion of ‘state’ of the system.

Question: Why can we represent states with a vector in Hilbert space only in situations where the maximum amount of information is available? And what is even meant by "maximum amount of information"?

Atom
  • 1,931
  • 2
    As an aside, I understood that states are rays rather than vectors in a complex Hilbert space. From chapter 2 of the first book of Weinberg's The Quantum Theory of Fields: "Physical states are represented by rays in Hilbert space. ... A ray is a set of normalized vectors (i.e., $(\Psi, \Psi) = 1$) with $\Psi$ and $\Psi'$ belonging to the same ray if $\Psi' = \xi\Psi$, where $\xi$ is an arbitrary complex number with $|\xi|=1$" – Alfred Centauri May 14 '20 at 15:27
  • Since the statistical vs. quantum mechanical probability is mentioned (which is correct in a basic sense), keep in mind the nuance around this distinction: https://physics.stackexchange.com/q/98703/20427 –  May 14 '20 at 15:44
  • I just found out that Isham actually shines light on this point in the following section 5.1.3, Some Comments on Rules 1,2 and 3. See comment number 6. – Atom May 14 '20 at 18:11

3 Answers

4

To understand what it means to have maximum knowledge, maybe it helps to consider a situation in which you don't have that knowledge. Suppose I gave you a qubit and told you that I threw a fair coin and prepared $|0\rangle$ if it came up heads, or $|1\rangle$ if it came up tails. You would then describe the state with the density matrix

$$\rho = \frac{1}{2} (|0\rangle \langle 0| + |1\rangle \langle 1|) \tag{1} $$

Observables measured on this state involve two kinds of uncertainty: first, you don't know whether I gave you a $|0\rangle$ or a $|1\rangle$; second, there are quantum fluctuations. You have the maximum knowledge possible when all uncertainty in your measurement outcomes is due solely to quantum fluctuations.

There is also a technical definition for this. The state (1) has nonzero von Neumann entropy $S = -\operatorname{Tr}[\rho \ln \rho] \neq 0$. Pure states, i.e. those that can be represented by a single vector in Hilbert space, always have zero entropy. This entropy is thus a measure of how much knowledge about the state you are missing.
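As a quick numerical check (a sketch using NumPy; the comparison state $|+\rangle$ is my own illustrative choice, not from the answer), the mixed state (1) has entropy $\ln 2$, while any pure state has entropy zero:

```python
import numpy as np

# Basis states |0> and |1> as column vectors
ket0 = np.array([[1.0], [0.0]])
ket1 = np.array([[0.0], [1.0]])

# Mixed state from Eq. (1): equal classical mixture of |0> and |1>
rho_mixed = 0.5 * (ket0 @ ket0.T + ket1 @ ket1.T)

# A pure state |+> = (|0> + |1>)/sqrt(2), for comparison
plus = (ket0 + ket1) / np.sqrt(2)
rho_pure = plus @ plus.T

def von_neumann_entropy(rho):
    """S = -Tr[rho ln rho], computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop zeros: 0 * ln 0 = 0 by convention
    return -np.sum(evals * np.log(evals))

print(von_neumann_entropy(rho_mixed))  # ln 2 ~ 0.693: maximal ignorance for a qubit
print(von_neumann_entropy(rho_pure))   # 0.0: pure state, maximum knowledge
```

The entropy is basis-independent because it depends only on the eigenvalues of $\rho$.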

curio
  • 1,027
2

Probably the author means that a state in which you know everything you can know about the system is represented by such a vector (or rather a ray, as Alfred Centauri remarked).

When there is uncertainty in the state beyond that, it can be considered classical uncertainty (whereas quantum uncertainty is the uncertainty in experimental outcomes that remains even when we know the exact quantum state vector). The state is then a classical probability distribution over this vector space, which can be represented by a density operator. These are the objects of study (the states) in quantum statistical mechanics.
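A point raised in the comments below can be checked directly: distinct probability distributions over state vectors can yield the same density operator, so the density operator carries less information than the underlying distribution. A minimal NumPy sketch (the two ensembles are my own illustrative choices):

```python
import numpy as np

ket0 = np.array([[1.0], [0.0]])
ket1 = np.array([[0.0], [1.0]])
plus  = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

# Ensemble A: |0> or |1>, each with probability 1/2
rho_A = 0.5 * (ket0 @ ket0.T + ket1 @ ket1.T)

# Ensemble B: |+> or |->, each with probability 1/2
rho_B = 0.5 * (plus @ plus.T + minus @ minus.T)

# Different ensembles, identical density operator
print(np.allclose(rho_A, rho_B))  # True
```

Since all measurement statistics follow from $\rho$, no experiment can distinguish the two preparations.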

doetoe
  • 9,254
  • Quantum states are not described by probability distributions over a vector space. They are described by density operators. – Norbert Schuch May 14 '20 at 15:48
  • @NorbertSchuch Is that not what I wrote? – doetoe May 14 '20 at 16:36
  • You write "the state is a classical probability distribution over this vector space". That's just an interpretation. The correct description is a density matrix, which has far less information and highly ambiguous when interpreted as a probability distribution over pure states. – Norbert Schuch May 14 '20 at 16:52
  • @NorbertSchuch Yes, several probability distributions give rise to the same density operator. When doing your calculations with any of them, you'd get the same outcomes, so both descriptions are equivalent. I'm not at all saying that there is an advantage in encoding this state in any other way than in a density operator (though I like the probabilistic interpretation). – doetoe May 14 '20 at 17:09
1

Instead of a state vector, you can also represent the state of a quantum system by a density matrix. If you have a basis $\{|n\rangle\}$ and a state vector $|\Psi\rangle=\sum_n c_n|n\rangle$, you can calculate the density matrix as follows: $$\rho=|\Psi\rangle\langle\Psi|=\sum_{m,n}c_m c_n^*|m\rangle\langle n|$$ Expectation values can be calculated using $$\langle \hat A\rangle=\operatorname{tr}(\rho \hat A)$$ Since we can calculate expectation values, this density matrix contains at least as much information as the state vector. But density matrices are more general objects: they can also describe experimental uncertainty. A general density matrix has the form $$\rho=\sum_i p_i|\psi_i\rangle\langle\psi_i|=\sum_{m,n}d_{m,n}|m\rangle\langle n|$$ Here $p_i$ is the probability that the system is in $|\psi_i\rangle$. This is not a quantum uncertainty but represents our experimental uncertainty about the system: it could be in state $\psi_1$, or it could be in state $\psi_2$, etc. For such a mixture it is in general impossible to find $c_m, c_n$ such that $d_{m,n}=c_m c_n^*$.

When it is possible to write $\rho=|\Psi\rangle\langle\Psi|$ we have what's called a pure state. This corresponds to maximal knowledge because the system is for sure in the state $|\Psi\rangle$.
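The two formulas above can be verified numerically. In this sketch (the coefficients $c_n$ and the observable, chosen here as Pauli-$Y$, are my own examples) we build $\rho=|\Psi\rangle\langle\Psi|$, evaluate $\langle\hat A\rangle=\operatorname{tr}(\rho\hat A)$, and confirm purity via $\operatorname{tr}(\rho^2)=1$:

```python
import numpy as np

# State vector |Psi> = sum_n c_n |n> in a 2-dim basis (example coefficients)
c = np.array([1.0, 1.0j]) / np.sqrt(2)   # c_0, c_1
rho = np.outer(c, c.conj())              # rho_{mn} = c_m c_n^*

# Example observable: the Pauli-Y matrix
A = np.array([[0, -1j], [1j, 0]])

# Expectation value <A> = tr(rho A)
expval = np.trace(rho @ A).real
print(expval)  # 1.0: this |Psi> is the +1 eigenstate of Y

# Purity check: tr(rho^2) = 1 exactly when rho = |Psi><Psi| is pure
print(np.isclose(np.trace(rho @ rho).real, 1.0))  # True
```

For a proper mixture such as $\rho=\tfrac12(|0\rangle\langle 0|+|1\rangle\langle 1|)$, the same purity check gives $\operatorname{tr}(\rho^2)=\tfrac12<1$.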

  • It's a bit weird to say that the density matrix contains more (information). It contains equal or less information than a pure state as evident from the fact that the entropy of a pure state is zero whereas that of a generic density matrix is $\geq 0$. I guess what you meant was that it is a more general object. I agree with that, of course, but to say that it contains more information is not correct. –  May 14 '20 at 16:00
  • @DvijD.C. I have to agree that information is a bit of a misnomer in this situation. I updated my answer. – AccidentalTaylorExpansion May 14 '20 at 16:04