When a photon detector detects a photon, is it an instantaneous process (because a photon can be thought of as a point particle), or does the detection require a finite amount of time that depends on the wavelength of the photon?
EDIT: I guess what I am wondering is: if a photon has a wavelength and travels at a finite speed, then would a photon with a wavelength of 300,000,000 m interact with the detector for 1 s? Or does the uncertainty principle instead say that for a photon with a wavelength of 300,000,000 m (and therefore energy E), it cannot be known when it hit the detector to an accuracy better than 1 s? Or is it more like this: suppose a stream of photons with wavelengths of 300,000,000 m moves towards the detector, arriving at a rate of 10 photons/second, and the detector's shutter is open for 1 s at a time; then it would record 10 photon hits (all of the photons). But if the shutter is only open for 0.5 s at a time, would it record 2.5 hits on average?
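For concreteness, here is the arithmetic behind the 1 s figure I keep using (taking c ≈ 3×10^8 m/s, and assuming the photon's period is the relevant "interaction time", which is exactly the assumption I'm asking about):

$$\nu = \frac{c}{\lambda} \approx \frac{3\times 10^{8}\ \text{m/s}}{3\times 10^{8}\ \text{m}} = 1\ \text{Hz}, \qquad T = \frac{1}{\nu} = 1\ \text{s}, \qquad E = h\nu \approx 6.6\times 10^{-34}\ \text{J},$$

and the time–energy uncertainty relation I have in mind is $\Delta E \, \Delta t \gtrsim \hbar/2$.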
EDIT2: I'm not interested in the practical functioning of the detector or in amplification delays. I'm looking at an ideal case (suppose the photon is 'detected' the instant an electron is released from the first photomultiplier plate). This is a question about the theory of the measurement, not the practical implementation.