
I have two parts to this question.

First, I understand that the meter is defined as the distance light travels in 1/299,792,458 of a second. But how is this distance actually measured? The second obviously comes from an atomic clock, but Wikipedia makes it appear that the distance is determined by counting wavelengths. For a wavelength count to be useful you must calculate the physical length of the wavelength, which depends on the speed of light, which is defined in meters, and to increase the accuracy of the measurement NIST recommends using a laser of a specific wavelength (which is again stated in meters). Does this not make the actual measurement of the length of the meter circular? Or does the fact that the speed of light is a constant defined in meters per second overcome the appearance of circular logic? The way I see the circularity: if the speed of light ever changed, the length of the meter would also change, which would make it impossible for us to know the speed of light had changed without referencing it back to an older physical object.

Further, the measurement is done in a vacuum, but we cannot actually create a perfect vacuum, so I assume there is an allowed pressure range for the vacuum, and pressure measurements are themselves based on the definition of the meter (I would think this adds further circularity to laboratory measurements performed in air and corrected for refraction).

Second, how is this measurement of the meter actually used to calibrate physical objects? That is, if I buy a meter stick that has been calibrated against the national standard, how do they actually compare the length of the stick to the wavelength measurements?

– OSUZorba

2 Answers


The length of a meter bar can be measured using a HeNe laser. That laser is chosen because we are very good at stabilizing the frequency it outputs, which means that if we measure the frequency before the test, it will remain steady during the test. A cesium clock can be used to determine the frequency of the light with great precision, since the second is defined by such clocks.

Once you have a good measurement of the frequency, you have a good measurement of the wavelength (assuming a reasonable medium; a vacuum is best, since the speed of light is defined in a vacuum and thus contributes no uncertainty). With a wavelength in hand, you can do interferometry.
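As a concrete illustration of that frequency-to-wavelength step, here is a minimal Python sketch. The 473.612 THz figure is approximately the frequency of a stabilized 633 nm HeNe line and is used here only as an illustrative number, not as a recommended value.

    # Sketch: turning a measured optical frequency into a vacuum wavelength.
    # The speed of light is exact by definition; the laser frequency below is
    # an approximate value for a stabilized 633 nm HeNe line, used only as an example.

    C = 299_792_458.0          # speed of light in vacuum, m/s (exact by definition)

    f_laser = 473.612e12       # optical frequency in Hz (approximate, from a frequency measurement)
    wavelength_vac = C / f_laser   # vacuum wavelength in metres

    print(f"vacuum wavelength = {wavelength_vac * 1e9:.3f} nm")   # roughly 632.99 nm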

In one such approach, they start with a mirror at one end of the bar and call that position 0. They then slowly move the mirror toward the other end of the bar, counting the number of fringes (light/dark cycles) that occur. An automated fringe counter does this -- no human has to count the roughly 3.1 million fringes that occur from one end to the other. Such counting is relatively straightforward even for simple equipment: if the mirror travels at around 1 mm/s, the fringe signal falls comfortably within the audible spectrum (about 3.1 kHz), where even low-end audio equipment can process it.
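Those fringe-count and audio-frequency figures follow from the wavelength alone. A short sketch of the arithmetic, assuming a 633 nm HeNe beam and the usual two-pass Michelson geometry in which one fringe corresponds to half a wavelength of mirror travel:

    # Sketch of the fringe-counting arithmetic for a Michelson-style setup.
    # Moving the mirror by half a wavelength changes the round-trip path by one
    # wavelength, so each lambda/2 of travel produces one full light/dark fringe.

    wavelength = 632.991e-9                  # m, approximate HeNe vacuum wavelength
    fringe_spacing = wavelength / 2          # mirror travel per fringe

    bar_length = 1.0            # m, nominal length being measured
    mirror_speed = 1e-3         # m/s, ~1 mm/s scan speed

    fringes_total = bar_length / fringe_spacing        # about 3.16 million fringes
    fringe_rate_hz = mirror_speed / fringe_spacing     # about 3.16 kHz, comfortably audio-band

    print(f"{fringes_total:.3e} fringes, {fringe_rate_hz:.0f} Hz fringe rate")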

And, as always, we never try to create a bar that is exactly 1 meter long. Instead, we create a bar that is very close to a meter and then measure it. The bar they measured came out to 1.00000105 m, which would be recorded in the metrology certifications used to calibrate other bars against this one.
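Reading the measurement back out is just the inverse step: the counted fringes (plus any fractional fringe the electronics resolve) times half the wavelength gives the bar's length. A sketch of that step, with the fringe count below invented purely so the result lands near the 1.00000105 m figure quoted above:

    # Sketch: converting a fringe count back into a length.
    # The count below is made up purely to land near the 1.00000105 m figure quoted above.

    wavelength = 632.991e-9                 # m, approximate HeNe vacuum wavelength
    fringe_spacing = wavelength / 2

    fringe_count = 3_159_605.9              # counted fringes, incl. a fractional fringe (illustrative)
    measured_length = fringe_count * fringe_spacing

    print(f"measured length = {measured_length:.8f} m")   # about 1.00000105 m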

– Cort Ammon

Practical measurements of distance were done by counting wavelengths; a fairly simple off-the-shelf interferometer will measure to 1 nm. For this you do need to know the wavelength of the light used (in nm), which comes from the definition of the meter. Historically that wavelength was measured against the old meter standard - so in that sense it is circular.

It is probably still how most measuring devices are actually calibrated: it is much easier to compare two lengths (a reference and the test part) with an interferometer than to measure the time of flight along a gauge block.

The new definition of the meter, measuring distances purely by time of flight, doesn't depend on the wavelength at all; it depends only on the second, which is itself set by the atomic properties of the atoms in your clock, together with the fixed speed of light.
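A minimal sketch of that time-of-flight idea: under the current definition, a distance is just the exact speed of light multiplied by a measured propagation time. The time below is an illustrative one-way figure, not a real measurement; a real setup would more likely time a round trip and divide by two.

    # Sketch: distance from time of flight under the current metre definition.
    # c is exact; all the measurement uncertainty lives in the timing.

    C = 299_792_458.0           # m/s, exact by definition

    flight_time = 3.3356e-9     # s, illustrative one-way propagation time (~1 m of travel)
    distance = C * flight_time

    print(f"distance = {distance:.6f} m")   # about 0.999988 m for this made-up time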

Since the second is so accurately defined and the speed of light is fixed by definition, any new, more accurate measurement that would once have refined the value of the speed of light now shows up as a change in the realized length of the meter instead - but these changes are so small that they have no practical implications.
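To put a rough number on "so small": a shift of, say, 1 m/s in the value used for c (far larger than any real measurement uncertainty) would change the realized meter by only a few nanometres. A sketch of that scale estimate, with the 1 m/s perturbation purely illustrative:

    # Rough scale estimate: how much the realized metre would shift if the value
    # used for the speed of light changed by delta_c. The 1 m/s figure is only an
    # illustrative (and unrealistically large) perturbation.

    C = 299_792_458.0      # m/s, exact by definition
    delta_c = 1.0          # m/s, hypothetical change

    relative_shift = delta_c / C                 # fractional change in the metre
    shift_per_metre = relative_shift * 1.0       # metres of change per metre of length

    print(f"relative shift = {relative_shift:.2e}")          # about 3.3e-09
    print(f"about {shift_per_metre * 1e9:.1f} nm per metre")  # about 3.3 nm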

It is done in vacuum because the speed of light in air depends on the air density, and therefore on temperature - typically about 1 ppm per degree C. You don't need a perfect vacuum: once you get down to around a millionth of atmospheric pressure, the difference between the speed of light in 'almost no air' and in vacuum is tiny, and you can even correct for it if you know the pressure.
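For the size of that residual-air correction, here is a rough sketch assuming the refractivity of air, n - 1 (about 2.7e-4 for visible light near standard conditions), scales linearly with pressure at fixed temperature. A real lab would use the Edlén or Ciddor equations rather than this crude ideal-gas scaling.

    # Rough sketch of how much residual air changes an optical length measurement.
    # Assumes refractivity (n - 1) scales linearly with pressure at fixed temperature
    # (ideal-gas approximation); real corrections use the Edlen/Ciddor equations.

    N_MINUS_1_STD = 2.7e-4      # approximate refractivity of air at ~1 atm, visible light
    P_STD = 101_325.0           # Pa, standard atmospheric pressure

    def refractivity(pressure_pa: float) -> float:
        """Approximate (n - 1) of air at the given pressure, fixed temperature."""
        return N_MINUS_1_STD * pressure_pa / P_STD

    # Residual pressure of one millionth of an atmosphere:
    p_residual = P_STD * 1e-6
    error = refractivity(p_residual)            # fractional length error vs. true vacuum

    print(f"fractional error = {error:.1e}")    # about 2.7e-10, i.e. ~0.27 nm over a metre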

  • Thanks for your detailed response. The only thing it is missing is the answer to how they physically calibrate measuring devices to this standard, since at the end of the day people need a calibrated ruler to actually perform measurements. – OSUZorba Sep 07 '19 at 04:14