In quantum mechanics, we are usually told that the absolute phase angle of a wave function has no physical meaning, and that only phase differences matter. That is, if we take a wave function ψ and add 90° to its phase everywhere, it still represents the exact same physical situation. But I'm having a hard time pinning down exactly how that connects to the classical notion of a phase angle in low-frequency radio waves.
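To make the claim concrete, here is the standard argument as I understand it: multiplying the whole wave function by a global phase factor leaves every measurable quantity unchanged, because probabilities and expectation values always pair ψ with its complex conjugate,

$$
\psi' = e^{i\theta}\psi
\quad\Rightarrow\quad
|\psi'|^2 = |e^{i\theta}|^2\,|\psi|^2 = |\psi|^2,
\qquad
\langle \psi' | \hat{A} | \psi' \rangle = \langle \psi | \hat{A} | \psi \rangle .
$$

Taking θ = 90° = π/2 is exactly the "add 90° to the entire thing" case, so nothing observable changes.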
For radio waves, the phase angle is pretty obvious. I can just set up an antenna to receive a signal and plot the received field against time. There's a phase where the EM field is changing rapidly (i.e. in the middle of a sine wave), and a phase where it's changing slowly (i.e. at the extremes of a sine wave). This notion of a phase angle is also perfectly good at explaining any interference patterns involving radio waves. Importantly, the absolute angle is physically measurable: 0° is distinguishable from 90°.
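Concretely, if I model the received signal as a classical field (ignoring noise and antenna details), the absolute phase φ shows up directly in the recorded trace,

$$
E(t) = E_0 \cos(\omega t + \varphi),
$$

so by comparing the waveform against a local clock I can read off φ itself, not merely the difference in φ between two signals.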
Compare that to quantized visible light, where we do interference experiments with a single-photon light source. We talk about an abstract wave function, and we use its complex phase angle to determine the probability of when and where we will detect a photon. Why is it that in this case we accept that the absolute phase angle has no physical meaning? Maybe there is some experiment we haven't yet thought of that would distinguish the rising and falling edges of a photon?
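As far as I can tell, in the single-photon case only the relative phase ever appears. In a two-path interference experiment, for instance, the detection probability is

$$
P \;\propto\; |\psi_1 + \psi_2|^2 \;=\; |\psi_1|^2 + |\psi_2|^2 + 2\,\mathrm{Re}\!\left(\psi_1^{*}\psi_2\right),
$$

and the cross term depends only on the phase difference between the two paths; shifting both paths by the same amount changes nothing detectable.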
I am just having trouble connecting the dots between the two notions of a phase angle. As we move up in frequency from radio waves to visible light, at what point exactly do we lose this phase information?