13

In this (in my opinion) intriguing speech, Rupert Sheldrake tells the story of the drop in the measured value of $c$ between 1928 and 1945. Recounting his visit to the Head of Metrology of the Physics Lab in Teddington, he summarizes the metrologist's reply roughly as follows (I paraphrase):

$c$ cannot change, it is a constant! We explain the drop you are talking about with "intellectual phase locking". Anyway, we have now solved the problem. We fixed the speed of light by definition in 1972. It might still change, but since we define the meter from $c$, we would never know.

Is that true? If $c$ changed, would we be able to see it? And how does science explain the famous drop in the measured value of $c$?

usumdelphini

2 Answers

14
  1. The speed of light was defined at its present value in 1983, not 1972.
  2. We could know that $c$ changed because $\alpha\propto1/c$ (fine structure constant) and we have better ways of determining $\alpha$ than $c$ (see the expression just after this list)
    1. Not actually true: we cannot determine whether physical constants with units change, cf. this Physics.SE Q&A
  3. "Official" science uses error bars when measuring things, "unofficial" scientists ignore these crucial components.

[Figure: historical measurements of $c$ with their reported error bars]

(based on data from Wikipedia and Henrion & Fischhoff 1986 (NB: PDF)). The relevant section of Henrion & Fischhoff reads:

A related measure [to the chi-squared statistic], the Birge ratio, $R_B$, assesses the compatibility of a set of measurements by comparing the variability among experiments to the reported uncertainties. It may be defined as the standard deviation of the normalized residuals: $$R_B^2=\sum_i h_i^2/(N-1)$$ Alternatively, the Birge ratio may be seen as a measure of the appropriateness of the reported uncertainties... If $R_B$ is much greater than one, then one or more of the experiments has underestimated its uncertainty and may contain unrecognized systematic errors... If $R_B$ is much less than one, then either the uncertainties have, in the aggregate, been overestimated or the errors are correlated.

According to Henrion & Fischhoff, the Birge ratio over the range 1875-1941 was 1.47, while the range 1947-1958 had a ratio of 1.32; the combined ranges give $R_B=1.42$. This means that essentially all of the data taken prior to the 1960s underestimated its errors. Since then, we have improved (a) our experiments, to reduce the errors, and (b) our ability to account for errors correctly.
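
To make the statistic concrete, here is a minimal Python sketch (my addition) of the Birge ratio as defined in the quote. It assumes the normalized residuals $h_i$ are taken with respect to the inverse-variance weighted mean, a common convention that the quoted passage does not spell out; the measurement values below are made up for illustration:

```python
import numpy as np

def birge_ratio(values, sigmas):
    """Birge ratio R_B with R_B^2 = sum_i h_i^2 / (N - 1), where the
    normalized residuals h_i = (x_i - mean) / sigma_i are computed
    against the inverse-variance weighted mean (assumed convention)."""
    values = np.asarray(values, dtype=float)
    sigmas = np.asarray(sigmas, dtype=float)
    weights = 1.0 / sigmas**2
    mean = np.sum(weights * values) / np.sum(weights)
    h = (values - mean) / sigmas
    return np.sqrt(np.sum(h**2) / (len(values) - 1))

# Made-up measurements of c (km/s) with reported 1-sigma uncertainties:
vals = [299792.5, 299793.1, 299792.8]
sigs = [0.4, 0.5, 0.3]
print(birge_ratio(vals, sigs))                   # ~0.7: scatter matches error bars
print(birge_ratio(vals, [s / 5 for s in sigs]))  # ~3.4: error bars underestimated
```

Shrinking the reported error bars without changing the scatter drives $R_B$ well above one, which is exactly the pre-1960s pattern described above.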

Kyle Kanos
    Your PDF link gives 404. – Ruslan Jan 17 '18 at 09:22
  • BTW, doesn't using the quantity in the quote assume that the speed of light is not changing? – Ruslan Jan 17 '18 at 09:58
  • @Ruslan: I updated the document link. I will get back to you about the Birge ratio assuming $c$ is constant; I don't believe it's a requirement, but I'll look more into it. – Kyle Kanos Jan 17 '18 at 11:06
  • "We could know that $c$ changed because $\alpha\propto1/c$ ... and we have better ways of determining $\alpha$ than $c$" No, a change in the fine structure constant is not necessarily a change in $c$. It is not possible to tell whether a universal constant with units changes over time. – Ben Crowell Jan 11 '19 at 14:54
  • @BenCrowell fixed. – Kyle Kanos Jan 11 '19 at 15:00
3

We fixed the speed of light by definition in 1972.

Already by 1960, J.L. Synge (Relativity: The General Theory, Ch. III §2) taught:

"For us time [or rather, duration] is the only basic measure. Length (or distance [or, indeed, quasi-distances]), in so far as it is necessary or desirable to introduce it, is strictly a derived concept, and will be dealt with later in that spirit. "

In 1983, on the other hand, the 17th CGPM defined (effectively) the SI unit of "speed", i.e. the ratio of SI base units "m/s", as $1/299792458$ of the speed of light (in vacuum). However, the value of the speed of light (in vacuum) itself is of course unaffected by any particular definition of units.
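
As a toy illustration (my own sketch, not part of this answer): once $c$ is fixed exactly, a "length measurement" reduces to a time measurement, e.g. in laser ranging. The function below is hypothetical:

```python
# Under the 1983 definition, c is exact by convention; only the
# round-trip travel time is actually measured.
C_EXACT = 299_792_458  # m/s, exact (17th CGPM, 1983)

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to a reflector inferred from a measured round-trip
    light travel time; all uncertainty lives in the clock, none in c."""
    return C_EXACT * t_seconds / 2.0

# A round trip of ~6.671 microseconds corresponds to about 1 km:
print(distance_from_round_trip(6.671e-6))  # ~999.96 m
```

This is the sense in which length is a derived concept here: the meter is whatever distance light traverses in $1/299792458$ s.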

It [the speed of light (in vacuum)] might still change,

That appears doubtful. As long as reference is made to the same definition of "light (in vacuum)", and as long as "length" is strictly understood as a derived measure (with the same derivation or definition used consistently), the notion of "speed of light (in vacuum)" plainly remains unchanged.

A note in consideration of the answer already posted by Kyle Kanos:
Of course, the speed of light (in vacuum) being constant (by definition of "length", and thus of "speed") does not preclude the electromagnetic coupling (referenced to vacuum) of some particular given charged particles from being found to have a different value in different trials;
nor, for instance, the length of some particular given "platinum-iridium bar" from being found to have a different value in different trials.

And how does science explain the famous drop in the measured value of $c$?

My own assessment (which is hereby public and open for comments/responses):
It appears doubtful that people who claim to have "measured the speed of light (in vacuum)" (rather than, for instance, having measured distances between identifiable parts of experimental equipment, or whether those parts were at rest with respect to each other in the first place, or having measured the index of refraction in a particular experimental region) were able to assign any finite range of systematic uncertainty (or confidence intervals) to their reported results at all. Any such "drop" therefore appears insignificant, and one may not strictly speak of such reports as "measurements of $c$" in the first place.

user12262