
So I'm quite confused by an answer: https://physics.stackexchange.com/a/510626/150174

Let $|\psi\rangle$ be the initial state, and let $U_t=e^{-i Ht}$ be the evolution operator, assuming a time-independent Hamiltonian. I will also assume for simplicity that we are working in a discrete basis. If you want to work with continuous variables, you can replace sums with integrals and you should mostly be fine.

Suppose we start at $t=0$, and measure the state at times $\{t_k\}_{k=1}^N$, letting it evolve freely in the intermediate times.

Measuring at $t=t_1$ gives the outcome $x$ with probability $p(x,t_1)=|\langle x|U_{t_1}|\psi\rangle|^2$, and a post-measurement state $|x\rangle$. Write the coefficients of $|\psi\rangle$ in the basis of $|x\rangle$ as $|\psi\rangle=\sum_x c_x |x\rangle$, and define the kernel of the evolution as $K(x,y;\delta t)\equiv\langle x|U_{\delta t}|y\rangle$.

Finally, let us define $\Delta_k\equiv t_k- t_{k-1}$. We can then write $p(x,t_1)$ (assuming a discrete set of possible outcomes) as

$$p(x,t_1)=\left|\sum_y K(x,y;\Delta_1)c_y\right|^2.$$
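
For concreteness, here is a minimal numerical sketch of this first-measurement probability (entirely my own construction: numpy/scipy, a randomly chosen Hermitian $H$ on a 4-dimensional space, and hypothetical variable names throughout):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
d = 4                                   # dimension of the (assumed finite) Hilbert space

# Toy Hamiltonian: a random Hermitian matrix (purely illustrative)
A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
H = (A + A.conj().T) / 2

def U(t):
    """Evolution operator U_t = exp(-i H t), with hbar = 1."""
    return expm(-1j * H * t)

# Initial state |psi> = sum_x c_x |x>, normalized, in the measurement basis
c = rng.normal(size=d) + 1j * rng.normal(size=d)
c /= np.linalg.norm(c)

# The kernel K(x, y; dt) = <x|U_dt|y> is just the matrix of U_dt in this basis,
# so p(x, t1) = |sum_y K(x, y; Delta_1) c_y|^2 is a matrix-vector product.
t1 = 0.7                                # arbitrary first measurement time
p1 = np.abs(U(t1) @ c) ** 2
print(p1, p1.sum())                     # outcome probabilities; the sum is 1
```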

Because we don't know the post-measurement state after the first measurement, we now need to switch to a density-matrix formalism to take this classical uncertainty into account. We therefore write the post-measurement state as:

$$\rho_1=\sum_x p(x,t_1) \mathbb P_x, \text{ where } \mathbb P_x\equiv |x\rangle\!\langle x|.$$
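
In the sketch above, $\rho_1$ is then just a diagonal matrix with the outcome probabilities on the diagonal:

```python
# Post-measurement state rho_1 = sum_x p(x, t1) |x><x|: diagonal in the
# measurement basis, with the outcome probabilities on the diagonal.
rho1 = np.diag(p1)
print(np.trace(rho1).real)              # Tr(rho_1) = 1
```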

At time $t_2$, before the second measurement, the state is therefore given by

$$\tilde\rho_2=\sum_x p(x,t_1)\, U_{\Delta_2}\mathbb P_x U_{\Delta_2}^\dagger,$$

which then results in an outcome $x$ with probability $p(x,t_2)=\sum_y |K(x,y;\Delta_2)|^2 p(y,t_1)$, and a post-measurement state

$$\rho_2=\sum_{x}p(x,t_2) \,\mathbb P_x = \sum_{x,y} |K(x,y;\Delta_2)|^2 \Big|\sum_z K(y,z; \Delta_1)c_z\Big|^2 \, \mathbb P_x.$$
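
Continuing the sketch, one can check numerically that evolving $\rho_1$ and reading off the diagonal agrees with a purely classical update of $p(\cdot,t_1)$ through the matrix $|K(x,y;\Delta_2)|^2$, which is doubly stochastic because $U_t$ is unitary (again, the names and the value of $\Delta_2$ are my own choices):

```python
Delta2 = 0.3                            # arbitrary second waiting time
U2 = U(Delta2)

# Evolve the mixed state, rho~_2 = U rho_1 U^dagger, then measure:
# p(x, t2) = <x| rho~_2 |x> is the diagonal of the evolved state.
rho2_pre = U2 @ rho1 @ U2.conj().T
p2 = np.diag(rho2_pre).real

# Equivalent classical update: p(x, t2) = sum_y |K(x, y; Delta_2)|^2 p(y, t1)
T2 = np.abs(U2) ** 2
assert np.allclose(p2, T2 @ p1)

rho2 = np.diag(p2)                      # post-measurement state rho_2
```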

You can keep going and compute the state at each successive measurement time $t_k$. If this reminds you of Feynman's path integral formulation, it's because it kind of is. The difference is that here you break the interference at every measurement time, and so the final state is determined by a mixture of quantum interference and classical probabilities.

Now define, for ease of notation, $q_k\equiv p(x,t_k)$ for the fixed outcome $x$ of interest. What is the probability of finding this specific $x$ for the first time at the $k$-th measurement? It equals the probability of not finding it in the previous measurements and finding it at the $k$-th, that is,

$$(1-q_1)(1-q_2)\cdots (1-q_{k-1})q_k.$$
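
As a sketch of this first-detection formula, continuing the toy example above (the waiting times $\Delta_k$ are arbitrary choices of mine, and I follow the expression as written, with each $q_k$ taken from the update rule):

```python
x0 = 0                                  # the fixed outcome x we are watching for
Deltas = [0.3, 0.5, 0.4]                # Delta_2, ..., Delta_N (arbitrary)

q = [p1[x0]]                            # q_1 = p(x0, t_1) from above
p = p1
for dt in Deltas:
    p = np.abs(U(dt)) ** 2 @ p          # classical update between measurements
    q.append(p[x0])
q = np.array(q)

# First-detection probabilities (1 - q_1)...(1 - q_{k-1}) q_k for k = 1..N
survival = np.cumprod(1 - q)            # prod_{j <= k} (1 - q_j)
first = np.concatenate(([q[0]], survival[:-1] * q[1:]))
print(first)
```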

Note that with this formalism you can also answer other questions about the probability of finding a given result once or more at specific combinations of times. For example, the probability of measuring $x$ at least once will be given by

$$1-\prod_{k=1}^N (1-q_k).$$

I don't know if there is a nice way to write these expressions in general. Maybe there is, if you write the probabilities back in terms of the kernels, but I haven't tried, and the post has already gotten a bit too long.
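
Continuing the same sketch, this at-least-once probability is also the sum of the first-detection probabilities above, which gives a quick consistency check:

```python
# 1 - prod_k (1 - q_k) telescopes into the sum of first-detection probabilities
at_least_once = 1 - np.prod(1 - q)
print(at_least_once, first.sum())       # the two agree
```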

I mean, it starts off with nice pure states, which have $0$ entropy, and then it switches to density matrices describing mixed states, which have non-zero entropy.
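
To see this numerically in the toy example above: since each $\rho_k$ is diagonal in the measurement basis, its von Neumann entropy $S(\rho)=-\operatorname{Tr}(\rho\log\rho)$ reduces to the Shannon entropy of $p(\cdot,t_k)$ (the helper below is my own, not from the answer):

```python
def entropy(p, eps=1e-15):
    """Shannon entropy -sum p log p, dropping zero entries."""
    p = p[p > eps]
    return -np.sum(p * np.log(p))

# Pure initial state: S = 0. After each measurement the state is mixed,
# so the entropy is generically non-zero.
print(entropy(p1), entropy(p2))
```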

Question

Is there some kind of second law of thermodynamics here too, then, for the measurement of time through this procedure? And what are the temperature and heat associated with it? I was thinking that using the Shannon entropy definition would help here.

  • Density matrices are just the most general states in quantum mechanics. They have no more to do with thermodynamics than a general probability distribution does. Shannon/von Neumann entropy $\neq$ thermodynamic entropy. But if you want to find connections between time measurements and thermodynamics, you can have a look at one of our papers, here. – Mark Mitchison Oct 28 '19 at 15:42
  • @MarkMitchison This is so funny: I was working on something that has made me ask a whole load of questions, and one of my friends even prompted me to read precisely that paper. And here I am talking to the author of that very paper :P – More Anonymous Oct 28 '19 at 15:46
  • @MarkMitchison An on-topic question would then be: can't I calculate the entropy of the density matrix as non-zero when it's a mixed state? – More Anonymous Oct 29 '19 at 19:26
  • The von Neumann entropy of a mixed state is non-zero, by the definition of a mixed state. Is that your question? – Mark Mitchison Oct 30 '19 at 14:08
  • @MarkMitchison I guess it boils down to how to show there is (or isn't) a second law, or heat, or temperature in the context I'm talking about. – More Anonymous Oct 30 '19 at 16:38
  • @MarkMitchison Also, the whole debate over whether information entropy = thermodynamic entropy is part of a broader debate which I haven't seen carried out in this context: https://physics.stackexchange.com/q/263197/150174 – More Anonymous Oct 31 '19 at 14:46
