Let's put a lot of material points at coordinates $(0,0)$.
Let’s give them completely random velocities.
Let's evolve this system for some time $t_0$.
Let's look at the system from the point of view of a randomly chosen material point, and call this point E.
From its perspective, every other point is moving away from E, and the recessional velocity of each point is proportional to its distance from E.
You can see this easily if you consider the points on the half-axis between $(0,0)$ and the current position of E: a point that stayed at $(0,0)$, one that got twice as far as E, or one that got half as far. Whichever point you pick, its velocity relative to E turns out to be proportional to its distance from E. This also holds for all other material points, not just the ones lying on that half-axis, and it doesn't depend on which point we chose as E.
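Here is a minimal numerical sketch of this argument (the uniform velocity distribution and the parameter values are arbitrary choices for illustration, not the original simulation): points coast from the origin for a time $t_0$, and the ratio of relative speed to distance from a randomly chosen E comes out the same for every point.

```python
import numpy as np

rng = np.random.default_rng(0)
N, t0 = 10_000, 1.0

vel = rng.uniform(-1.0, 1.0, size=(N, 2))  # random, independent vx and vy
pos = vel * t0                              # everyone started at (0, 0)

e = rng.integers(N)                         # pick an arbitrary point E
rel_pos = pos - pos[e]                      # positions as seen from E
rel_vel = vel - vel[e]                      # velocities as seen from E

dist = np.linalg.norm(rel_pos, axis=1)
speed = np.linalg.norm(rel_vel, axis=1)

# every other point recedes from E with speed proportional to its distance
mask = dist > 0
ratio = speed[mask] / dist[mask]
print(np.allclose(ratio, ratio[0]))         # True: same constant for all points
```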
So $v = H D$ (where $D$ is the distance of a point from E, $v$ is its recessional velocity relative to E, and $H$ is the same constant for every point, with all positions and velocities measured in the $(0,0)$ rest frame).
You can also calculate that $t_0 = 1/H$.
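To see where $t_0 = 1/H$ comes from, here is the one-line calculation (with $\vec u$ the velocity of an arbitrary point and $\vec v_E$ the velocity of E, both measured in the $(0,0)$ rest frame; these symbols are just shorthand introduced here):

$$
D = \left|\vec u - \vec v_E\right| t_0,
\qquad
v = \left|\vec u - \vec v_E\right|
\;\;\Longrightarrow\;\;
v = \frac{D}{t_0} = H D
\quad\text{with}\quad
H = \frac{1}{t_0}.
$$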
Is this model sufficient to explain the Hubble law? Is it just that galaxies used to be close together and have been moving away from there at constant random velocities ever since?
What astronomical observations (apart from the CMB) can't be explained by this model (plus gravity, which might have slightly decelerated the galaxies early on, making the relation diverge from linearity a bit for very distant galaxies)?
Here are some screenshots from a simulation I made while thinking about this.
The point chosen as E is not at the epicenter (which is the middle of the square; it's a square because I chose the random velocities by picking the horizontal and vertical components independently, and the distribution only influences the point density, not the observed recessional velocities at $t_0$). The lines illustrate recessional velocities.
And here's exactly the same system, just a bit later. You can no longer tell that you are not at the epicenter, because you can't see far enough in all directions.
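For reference, here is a rough sketch of how a picture like that can be produced (an illustrative reconstruction, not the original simulation; the parameters, the uniform velocity distribution, and the way E is picked are all assumptions): points expand from the origin, and arrows show velocities relative to a point E chosen away from the epicenter.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
N, t0 = 400, 1.0

vel = rng.uniform(-1.0, 1.0, size=(N, 2))   # independent vx, vy -> square cloud
pos = vel * t0
e = np.argmax(pos[:, 0])                     # pick an E far from the epicenter

rel_vel = vel - vel[e]                       # recessional velocities as seen from E
plt.scatter(pos[:, 0], pos[:, 1], s=5)
plt.quiver(pos[:, 0], pos[:, 1], rel_vel[:, 0], rel_vel[:, 1],
           angles='xy', scale_units='xy', scale=4, width=0.002)
plt.scatter(*pos[e], color='red', zorder=3, label='E')
plt.gca().set_aspect('equal')
plt.legend()
plt.show()
```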