51

I have a radioactive particle in a box, prepared so as to initially be in a pure state

$\psi_0 = 1\,\theta_U + 0\,\theta_D$

(U is Undecayed, D is Decayed). I put a Geiger counter in the box.

Over time $t$, the theory says that the state should evolve into a pure state that is a superposition of Undecayed and Decayed, with the Decayed contribution growing ever larger:

$\psi_t = a\,\theta_U + b\,\theta_D$

Eventually the counter will 'click', indicating that the particle has Decayed. Now I know that the state is 100% Decayed.

However, before this happened, the silence of the counter also indicated that the particle hadn't Decayed yet. So all the time up to that point, I also knew that the state was 100% Undecayed.

But this contradicts what the theory suggests (a superposition with a nonzero contribution from the Decayed state after some time), so I'm guessing it's an incorrect way of analysing the experiment.

I want to know where the mistake lies.

In other words, it seems to me the Geiger counter is always measuring the state of the particle. Silence means Undecayed, click means Decayed. So the particle would never actually Decay since I continuously know its state is

$\psi_t = 1\,\theta_U + 0\,\theta_D$

which means its chance of decaying would be perpetually zero (the quantum Zeno effect, I've heard?).

How do I deal with this constant 'passive' measuring?

Qmechanic
  • 201,751
Juan Perez
  • 2,949
  • 4
    Great question. Similar to this: https://physics.stackexchange.com/q/232502/ but I look forward to the answer. – JPattarini Sep 24 '19 at 16:35
  • 3
    Decay is spontaneous. In the way it is semi-classically modelled it is indeed not clear what kind of measurement has to be taken into account. I asked a couple of questions along these lines: https://physics.stackexchange.com/q/258104/109928 and https://physics.stackexchange.com/questions/258256/how-can-quantum-tunnelling-lead-to-spontaneous-decay Anyway, it seems the Geiger counter is not part of the picture, it simply registers the fact that a decay did occur. – Stéphane Rollandin Sep 24 '19 at 17:20
  • 3
    Maybe the detector here is not actually the thing collapsing the wavefunction of the nucleus. The detector is collapsing the wave function of the emitted gamma ray (or whatever). What actually collapses the nucleus is its interaction with whatever causes it to decay in the first place (the vacuum, stray radiation, other particles?) – KF Gauss Sep 25 '19 at 04:31
  • 7
    The Geiger counter only detects that the radioactive atom has decayed because the atom ejects a particle that passes through the Geiger tube (and even if the ejected particle does pass through the tube, it's not 100% guaranteed that it will actually be detected). So I don't think it's fair to say that the Geiger counter is continuously measuring the state of the atom. – PM 2Ring Sep 25 '19 at 04:42
  • 1
    But surely the cat knows – TaW Sep 25 '19 at 15:57
  • 1
    Not clicking only tells you the decay products did not reach the detector yet, doesn't it? – lalala Sep 25 '19 at 20:15
  • 1
    @lalala Even then, it tells you something, gives you some minimal information. Or I could change the experiment: instead of a box, the particle is placed directly inside the detector tube. – Juan Perez Sep 25 '19 at 20:56
  • 2
    Nothing profound going on here; it's just another case of wrongly treating "collapse" as a physical change/event. It's not. – R.. GitHub STOP HELPING ICE Sep 26 '19 at 01:21
  • Who's observing the Geiger counter :-( – copper.hat Sep 26 '19 at 17:24

6 Answers

20

Good question. The textbook formalism in Quantum Mechanics & QFT just doesn't deal with this problem (as well as a few others). It deals with cases where there is a well-defined moment of measurement, at which a variable with a corresponding Hermitian operator ($x$, $p$, $H$, etc.) is measured. However, there are questions which can be asked, like this one, which stray outside of that structure.

Here is a physical answer to your question in the framework of QM: look at the position wave function of the decayed particle, $\psi(x)$ (*if it exists: see the bottom of the post if you care). When this wave function "reaches the detector" (though it probably has some nonzero value in the detector the entire time), the Geiger counter registers a decay. From this you get a characteristic decay time. This picture is good intuition, but also an inexact/insufficient answer, because the notion of "reaches the detector" is only heuristic and classical. A full quantum treatment of this problem should give us more: a probability distribution in time, $\rho(t)$, for when the particle is detected. I will come back to this.

So what about the Zeno effect? Based on the reasoning you gave, the chance of decaying is always zero, which is obviously a problem! Translating your question to position space, $\psi(x)$, your reasoning says the wave function should be projected to $0$ in the region of the detector at every moment in time that the particle hasn't been found. And in fact you're right: doing this does cause the wave function to never arrive at the detector! (I actually just modeled this as part of my thesis.) This result is inconsistent with experiment, so we can conclude: continuous measurement cannot be modeled by a straightforward projection inside the detector at every instant in time.
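To see this freezing concretely, here is a minimal toy sketch (my own illustration with hypothetical numbers, not the thesis model mentioned above): a two-level system whose Undecayed state coherently leaks into Decayed, projected back onto Undecayed after every short silent interval. Because the leaked probability per step is quadratic in the step length, projecting ever more often freezes the state:

```python
import numpy as np

# Toy two-level stand-in for the decay problem (hypothetical numbers):
# |U> couples to |D> with strength omega, so an *unwatched* state
# decays as P_D(t) = sin^2(omega * t).
omega = 1.0
T = np.pi / 2          # unwatched, P_D(T) = sin^2(pi/2) = 1: fully Decayed

def survival_with_projections(n_meas):
    """Evolve for total time T, projecting back onto |U> after each of
    n_meas short steps (each projection = 'the counter stayed silent').
    The amplitude to remain in |U> after one step of length dt is
    cos(omega*dt); silent-interval survival probabilities multiply."""
    dt = T / n_meas
    return np.cos(omega * dt) ** (2 * n_meas)

for n in [1, 10, 100, 10000]:
    print(n, survival_with_projections(n))
# survival probability -> 1 as the time between projections -> 0
```

The unwatched system would be fully Decayed at time $T$; under ever-finer projections the survival probability tends to 1, which is exactly the freeze described above.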

A note, in response to the comments of Mark Mitchison and JPattarini: this "constant projection" model of a continuous measurement can be rescued by choosing a nonzero time between measurements, $\Delta t \neq 0$. Such models can give reasonable results, and $\Delta t$ can be chosen based on a characteristic detector time, but in my view such models are still heuristic, and a deeper, more precise explanation should be aspired to. Mark Mitchison gave helpful replies and linked sources in the comments for anyone who wants to read more on this. Another way to rescue the model is to redefine the projections to be "softer", as in the sources linked by JPattarini.
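By contrast, here is a sketch of the finite-$\Delta t$ picture (again my own toy with made-up numbers): if each silent interval clicks with probability $\approx \gamma\,\Delta t$, i.e. linear rather than quadratic in $\Delta t$, the predicted survival is essentially $e^{-\gamma T}$ and, for $\gamma\,\Delta t \ll 1$, does not depend on the choice of $\Delta t$:

```python
import math

# Hypothetical decay rate gamma and total observation time T.
gamma, T = 0.5, 4.0

def survival(dt):
    """Finite-dt measurement model: every dt the counter either clicks
    (probability ~ gamma*dt) or projects the state back onto Undecayed.
    Survival probabilities of the silent intervals multiply."""
    n = int(round(T / dt))
    return (1.0 - gamma * dt) ** n

for dt in [0.5, 0.1, 0.01, 0.001]:
    print(dt, survival(dt))
print("exp(-gamma*T) =", math.exp(-gamma * T))
# for gamma*dt << 1 the result is ~exp(-gamma*T), independent of dt
```

The linear-in-$\Delta t$ click probability is what distinguishes this regime from the quadratic short-time behavior in the previous sketch, which is what produces the Zeno freeze.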

Anyway, despite the above discussion, there is still a gaping question: If continuous projection of the wave function is wrong, what is the correct way to model this experiment? As a reminder, we want to find a probability density function of time, $\rho(t)$, so that $\int_{t_a}^{t_b}\rho(t)dt$ is the probability that the particle was detected in time interval $(t_a, t_b)$. The textbook way to find a probability distribution for an observable is to use the eigenstates of the corresponding operator ($|x\rangle$ for position, $|p\rangle$ for momentum, etc) to form probability densities like $|\langle x | \psi \rangle|^2$. But there is no clear self-adjoint "time operator", so textbook quantum mechanics doesn't give an answer.
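As a hypothetical example of what such a $\rho(t)$ would buy us: if the detection-time density were the ideal exponential $\rho(t) = \Gamma e^{-\Gamma t}$, the probability of detection in any window $(t_a, t_b)$ follows by integration (the rate below is made up):

```python
import math

# Hypothetical detection-time density: rho(t) = Gamma * exp(-Gamma * t).
Gamma = 0.3

def detect_prob(t_a, t_b):
    """P(detected in (t_a, t_b)) = integral of rho(t) dt over the window,
    which in closed form is exp(-Gamma*t_a) - exp(-Gamma*t_b)."""
    return math.exp(-Gamma * t_a) - math.exp(-Gamma * t_b)

# sanity checks: rho integrates to 1, and probabilities over a
# partition of a window add up to the probability for the whole window
print(detect_prob(0.0, math.inf))                          # -> 1.0
print(detect_prob(0, 1) + detect_prob(1, 5), detect_prob(0, 5))
```

Any candidate method for deriving $\rho(t)$ should at minimum deliver a density with these properties; the disagreement between methods is about the shape of $\rho(t)$ itself.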

One non-textbook way to derive such a $\rho(t)$ is the "finite $\Delta t$ approach" mentioned in the note above, but besides this there are a variety of other methods which give reasonable results. The issue is, they don't all give the same results (at least not in all regimes)! The theory doesn't have a definitive answer on how to find such a $\rho(t)$ in general; this is actually an open question. Predicting "when" something happens in Quantum Mechanics (or the probability density for when it happens) is a weak point of the theory, which needs work.

If you don't want to take my word for it, take a look at Gonzalo Muga's textbook Time in Quantum Mechanics, which is a good summary of different approaches to time problems in QM that are still open to be solved today in a completely satisfactory way. I am still learning more about these approaches, but if you are curious, the one I found cleanest so far uses trajectories in Bohmian Mechanics to define when the particle arrives at the detector.

That said, the measurement framework in QM in general is just imprecise, and I would be very happy if a new way of understanding measurement were found which gives a higher level of understanding of questions like this one. (Yes, I am aware of decoherence arguments, but even they leave questions like this unanswered, and even Wojciech Zurek, the pioneer of decoherence, does not argue that decoherence fully solves the problems with measurement.)

(*note from 2nd paragraph): Sure, you can in principle use the position representation to get a characteristic decay time like this, but it might not be as easy as it sounds, because QFT has issues with position-space wave functions, and you'd need QFT to describe the annihilation/creation of particles. Thus even this intuition doesn't always have mathematical backing.

doublefelix
  • 6,872
  • 1
  • 23
  • 52
  • 1
    It's true that I've always thought of QM variables becoming defined only when measured by a detector (an electron "does not have a position" until measured). But as Macro Ocram says, this particle can decay without having to interact with a detector. So the variable "did it decay?" may have a definite value even if not measured. Perhaps such a "variable" is not a proper "QM observable" with a corresponding hermitian operator, and thus the theory can't study it properly? I'm not sure if that's what you were aiming at. – Juan Perez Sep 24 '19 at 21:39
  • I edited the answer to make it clearer, let me know if you still have questions. And yes you can have a particle "decay" without interacting with a detector, in the sense that the "decayed" term is much bigger than the "hasn't decayed" term, but it's almost never true that you'd have exactly 0 for either one of those terms for an unstable particle in a real-life situation. – doublefelix Sep 25 '19 at 00:24
  • Good answer, however it seems this answer calls the accepted answer to a similar question here into doubt: https://physics.stackexchange.com/q/232502/ – JPattarini Sep 25 '19 at 00:48
  • @doublefelix "Predicting "when" something happens in Quantum Mechanics ... is a major weak point of the theory" - Although I admittedly haven't read the text you suggest, this statement seems objectionable. Schrodinger's equations tell you exactly how the state of a system evolves through time, there's no ambiguity there. I'm sure you can ask all sorts of tricky questions (especially when you start trying to inject "wavefunction collapse" into quantum), but it seems to me that the basic quantum is solid here. – aquirdturtle Sep 25 '19 at 02:47
  • @aquirdturtle I believe the statement wasn't about the wavefunction evolution (which is deterministic), but rather about continuous observation of the wavefunction (which is nondeterministic for the observer as well as fundamentally uncertain because of the time-energy uncertainty). – John Dvorak Sep 25 '19 at 05:18
  • 4
    Speaking of which - modelling continuous measurement as infinitely many measurements in sequence would allow you to measure time with infinite precision, which would require infinite energy via the uncertainty principle. Perhaps we could model continuous measurement as a sequence of anytime-within-interval measurements where the interval length is determined by the energy of the measurement process. – John Dvorak Sep 25 '19 at 05:22
  • A continuous measurement is not described by a projection at every moment in time. It is defined as a sequence of measurements (in general a POVM rather than a projection) occurring at a rate $(\Delta t)^{-1}$, where $\Delta t$ is much smaller than the decay time but much larger than the correlation time of the environment (i.e. the time taken for the emitted particle to irreversibly propagate beyond the nucleus' domain of influence). The latter condition is crucial to avoid entering the Zeno regime. Note that this is a measurement on the output field and not the nucleus itself. – Mark Mitchison Sep 25 '19 at 10:06
  • 2
    All of this has been well known since the 90s and the corresponding theory is routinely tested in quantum optics laboratories (for example). It is extremely misleading to claim that there is somehow something mysterious or unknown about the description of continuous measurements. It's textbook material, see for example Wiseman & Milburn's book or this review, or just Google to find many more references. – Mark Mitchison Sep 25 '19 at 10:06
  • 1
    OP's question asked about a model in which projection happens at every moment in time, and it is true that in such a model the particle is never detected. @MarkMitchison (and probably also the link from JPattarini ) is talking about a different (more fruitful) model for continuous measurement which chooses a time between measurements $\Delta t$. This model can yield reasonable results, though it depends on a parameter $\Delta t$ which doesn't have physical grounds (why would nature measure in particularly that interval?). I'm going to edit the answer to include that now. – doublefelix Sep 25 '19 at 11:32
  • 1.I do not see the OP insist that measurements occur at every moment in time, in the sense that you have interpreted. 2.The $\Delta t$ is not some fundamental property of "nature", it is determined by the physical processes occurring in the detector. In particular, a photomultiplier or Geiger counter requires some time $\Delta t$ to reset itself after each click. This time will be different for each kind of detector. The point is that, so long as this $\Delta t$ satisfies the conditions I wrote above, the description of the continuous measurement does not depend on the parameter $\Delta t$. – Mark Mitchison Sep 25 '19 at 12:01
  • 3
    1. The last few lines of OP's question pretty clearly suggest a continuously measuring device, $\Delta t=0$; otherwise mentioning the Zeno paradox would be irrelevant, and there would be no confusion. 2. While I think the "finite $\Delta t$ approach" is one of the better ones, in my subjective view the choice of $\Delta t$ requires some loose interpretation: choosing $\Delta t$ = "minimal time in between two successive counts" is not clearly the same thing as "time in between successive measurements when no particles are found". – doublefelix Sep 25 '19 at 12:28