Imagine the light racing out from a distant sun: the beams of light shoot away in a circular pattern (spherical, actually), remembering that light comes in photons, or packets of energy.
So how is it that we do not see "gaps" in the light coming from distant stars, when these "rays" should have gaps between them that grow wider as the distance increases?

- Do you see any "gaps" in the light from an ordinary light bulb? – my2cts Apr 16 '19 at 20:21
- @my2cts I would assume the OP is asking about very large distances and is coming from the point of view that the area the light could reach is not showered by a continuous distribution of photons – BioPhysicist Apr 16 '19 at 20:26
- Stars emit a lot of photons. And even then, look at how few stars you can see in the night sky: there are billions to be seen in our galaxy alone, and yet you can only see a few thousand (individual star systems; large collections, including the "band" of the Milky Way or entire galaxies, can also be visible, but that still fits, since there are even more photons coming from those :)). Why? Because for most stars, too few photons arrive in any given time to form an image in your brain. You need to use a telescope and/or a long exposure to see more. – Luaan Apr 17 '19 at 08:56
2 Answers
You are right that single-photon detection is a discrete event. But you are under the false assumption that these "rays" are discretely distributed.
Ideally, a photon has an equal probability of being emitted into any solid angle around the star; that is, the emission follows a uniform probability distribution with respect to solid angle. There are no single rays, evenly spaced around the star, along which the photons must travel.
For a water analogy: it is not as if the star were a spherical shower head where photons can only be released from discrete locations. So, even though you might record a different random pattern of photon detection events at different angular positions relative to the star, you will still always see photons (neglecting stars that are so far away that their light never reaches us because of the expanding universe).
Of course, if you are far enough away you will detect fewer and fewer photons. However, this is not restricted to certain "rays"; it is true at any angle once the distance is large enough.
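To put a rough number on "fewer and fewer photons", here is a minimal back-of-the-envelope sketch. The luminosity, wavelength, and pupil size are assumed round values for illustration, not figures from this answer:

```python
import math

# Rough estimate: how many photons from a Sun-like star pass through a
# dark-adapted pupil each second, at a few distances.
# All constants below are illustrative assumptions.

L_STAR = 3.8e26          # luminosity of a Sun-like star, W (assumed)
WAVELENGTH = 550e-9      # representative visible wavelength, m (assumed)
H = 6.626e-34            # Planck constant, J*s
C = 3.0e8                # speed of light, m/s
LIGHT_YEAR = 9.46e15     # metres per light-year

def photons_per_second(distance_ly, pupil_radius_m=3.5e-3):
    """Photon arrival rate through a circular pupil at the given distance."""
    photon_energy = H * C / WAVELENGTH      # energy per photon, J
    emission_rate = L_STAR / photon_energy  # photons emitted per second
    d = distance_ly * LIGHT_YEAR
    sphere_area = 4 * math.pi * d**2        # photons spread over this sphere
    pupil_area = math.pi * pupil_radius_m**2
    return emission_rate * pupil_area / sphere_area

for dist in (10, 100, 1000):
    print(f"{dist:>5} ly: {photons_per_second(dist):.0f} photons/s")
# roughly 3.6e5, 3.6e3, and 36 photons/s respectively
```

Under these assumptions, even at 1000 light-years a pupil still catches tens of photons per second. The rate thins out with distance, but because the emission is uniform over solid angle, it never develops angular gaps.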

- The question is being asked by somebody who thinks in terms of light rays that get farther apart as they travel away from the source, and that is not a wrong way to think about it. So, very few photons going into a vast space ---> big gaps. And indeed photons from a given star arrive very spread out in time, less than one per second into a human eye. – Kphysics Apr 16 '19 at 21:13
- @Kostas But the sparsity of photons is not due to not being along the correct ray. Nevertheless, I have added something to my answer about being far away from a star. – BioPhysicist Apr 16 '19 at 21:25
- The faintest stars we can see deliver 140 photons per second to our eyes. – Keith McClary Apr 17 '19 at 03:12
- @KeithMcClary That link ends with 14 photons per second, not 140. With the 0.1 s window mentioned in the link, this fits with the detection limit I learned in the 1980s, which is 1 to 2 photons. – amI Apr 17 '19 at 08:11
- @amI Actually it's not 100% clear, but right before that 14-photon statement it says about a brighter star: "Following the same calculation, only ~90 photons are available within the shutter time [...]", the shutter time given as 0.1 s earlier. To me it is implied that the 14 photons for the dimmest star arrive within that shutter time as well. – smcs Apr 17 '19 at 09:30
- Poisson statistics says that if there are on average 14 photons per unit of time (0.1 s seems to be the right unit of time for the visual system), the count will fluctuate with a standard deviation of about ±4 photons, as in the sketch below. This is responsible for the fact that stars twinkle but planets don't. – Kphysics Apr 17 '19 at 21:40
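A minimal sketch of the Poisson point in the comment above. The 14-photons-per-0.1-s figure is taken from that comment; everything else is illustrative:

```python
import math
import random

# With an average of 14 photons per 0.1 s integration window (the figure
# quoted in the comment), the count per window fluctuates with standard
# deviation sqrt(14) ~ 3.7 photons.

MEAN_PHOTONS = 14  # average photons per 0.1 s window (from the comment)

def poisson_sample(lam):
    """Draw one Poisson-distributed count (Knuth's method; fine for small lam)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

counts = [poisson_sample(MEAN_PHOTONS) for _ in range(10_000)]
mean = sum(counts) / len(counts)
std = (sum((c - mean) ** 2 for c in counts) / len(counts)) ** 0.5
print(f"mean ~ {mean:.1f}, std ~ {std:.1f}  (theory: 14 and {math.sqrt(14):.1f})")
```

A window-to-window scatter of roughly ±4 photons around a mean of 14 is exactly the fluctuation the comment describes.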
Very good question. Here is a more QM-flavoured explanation. For your case, it is almost as if you treated the Sun as a single atom surrounded by an electron field, as per QM.
Now, the wavefunction of the electron describes the probability distribution of finding the electron at a certain position in space around the nucleus.
You might think the electron can only occupy a certain discrete number of positions. As per QM, the answer is no. In simple words, the electron sits at a certain energy level around the nucleus, but within that energy level it could be anywhere.
Since the atomic system (the electron) emits the photons, and the electron could be anywhere within that energy level, how would you tell where the electron is at the moment of emission?
You might imagine that the electron could only take certain fixed positions around the nucleus and emit photons from those positions. In reality, the electron's position is described by the wavefunction, which is continuous. Simply said, the electron could be anywhere within that energy level.
So, in your case, if you look at just a single atom emitting photons and view it from far away, the photons will be continuously distributed in direction; there will be no gaps between them.
Now, the Sun is made of a whole lot of atoms, so the same applies analogously: the photons will be distributed continuously.
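To make the "continuous, not discrete" point concrete, here is a standard textbook illustration (my own addition, not part of the original answer): the hydrogen 1s ground state has a position probability density that is nonzero at every radius, and an isotropic emitter assigns the same probability to every direction,

$$ \lvert\psi_{100}(\mathbf r)\rvert^2 = \frac{1}{\pi a_0^{3}}\,e^{-2r/a_0}, \qquad \frac{\mathrm dP}{\mathrm d\Omega} = \frac{1}{4\pi}. $$

Neither density singles out a discrete set of positions or directions, which is the sense in which the emitted photons show no angular gaps.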

- I'm skeptical of this answer, because the wavefunction of the electron is only spread out as long as it is not measured, and we can't measure the wavefunction directly. – Allure Apr 17 '19 at 06:52
- @Allure What I am saying is that it is continuous; that is why photons are emitted continuously from across the electron field, as per QM. – Árpád Szendrei Apr 17 '19 at 07:26