24

Why is the speed of light defined as $299792458$ $m/s$? Why did they choose that number and no other number?

Or phrased differently:

Why is a metre $1/299792458$ of the distance light travels in a sec in a vacuum?

Qmechanic
  • 201,751
Xplane
  • 421
  • 8
    It's arbitrary, but historically, since a meter was already an accepted length, they defined the speed of light to keep the meter the same. – Mitchell Apr 30 '11 at 22:08
  • 11
    Because it's written 10 in base-299792458. You see, in base-299792458, 1m is the length of your stick, and 10m is the speed of light. That's a beautiful coincidence, right? – Lie Ryan May 01 '11 at 07:04
  • 7
    Isn't the speed of light = 1? –  May 01 '11 at 21:35
  • Because the meter is defined as the length of the path travelled by light in vacuum in $1⁄299,792,458$ of a second. Metre – Pratik Deoghare May 28 '11 at 13:06
  • 2
    duplicate of http://physics.stackexchange.com/q/3644/4552 –  May 18 '13 at 18:50
  • 3
    The speed of light is exactly $299792458$ $m/s$ because the Devil altered it from its natural value of $300000000$ $m/s$ as a demonstration of his increasing power. See, for example, the end of Section 2 here: http://library.msri.org/books/sga/from_grothendieck.pdf – WillO Jan 24 '16 at 00:14
  • 1
    Change the numerical constant and replace "second" with "metre" in this answer https://physics.stackexchange.com/a/243162/104696 – Farcher Jun 16 '17 at 12:56
  • 1
    Metrology exists primarily to serve commerce rather than science or some desire for nice round numbers. This means any changes in the standard must necessarily be backwards compatible. – David Hammen Jun 16 '17 at 14:12
  • @Mitchell if it's arbitrary, why don't we choose 1 m/s as the speed of light? – Qian Chen Aug 26 '21 at 21:01

6 Answers

44

The speed of light is 299 792 458 m/s because people used to define one meter as 1/40,000,000 of the Earth's meridian - so that the circumference of the Earth was 40,000 kilometers.

Also, they used to define one second as 1/86,400 of a solar day, so that the day could be divided into 24 hours, each containing 60 minutes of 60 seconds each.

In our Universe, it happens to be the case that light moves at such a speed that in 1 second, defined above, it travels approximately 299,792,458 meters, defined above. In other words, during one solar day, light changes its position by $$ \delta x = 86,400 \times 299,792,458 / 40,000,000\,\,{\rm circumferences\,\,of\,\,the\,\,Earth}.$$ The number above is approximately 647,552. Try it. Instruct light to orbit along the surface of the Earth and you will find that between two noons, it completes 647,552 orbits. Why is it exactly this number? Well, it is because of how the Earth was created and evolved.
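As a sanity check, the arithmetic above is easy to reproduce (a rough sketch that ignores the distinction between solar and sidereal days):

```python
# Rough sanity check of the orbit count quoted above.
seconds_per_day = 86_400          # one solar day, in seconds
c = 299_792_458                   # speed of light, in metres per second
earth_circumference = 40_000_000  # old definition: 40,000 km, in metres

# Earth circumferences covered by light in one solar day.
orbits = seconds_per_day * c / earth_circumference
print(round(orbits))  # 647552
```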

If the Earth hadn't hit a big rock called Megapluto about 4,701,234,567.31415926 years ago, it would have been a few percent larger and would be rotating with a frequency smaller by 1.734546346 percent, so 647,552 would be replaced by 648,243.25246 - but because we did hit Megapluto, the ratio eventually became what I said.

(There were about a million of similarly important big events that I skip, too.)

The Earth's size and speed of spinning were OK for a while but they're not really regular or accurate so people ultimately switched to wavelengths and durations of some electromagnetic waves emitted by atoms. Spectroscopy remains the most accurate way in which we can measure time and distances. They chose the new meter and the new second as a multiple of the wavelength or periodicity of the photons emitted by various atoms - so that the new meter and the new second agreed with the old ones - those defined from the circumference of the Earth and from the solar day - within the available accuracy.

For some time, people would use two independent electromagnetic waves to define 1 meter and 1 second. In those units, they could measure the speed of light and found that it was 299,792,458 plus or minus 1.2 meters per second or so. (The accuracy was not that great for years; the error of 1.2 meters per second was the final accuracy achieved in the early 1980s.)

Because the speed of light is so fundamental - adult physicists use units in which $c=1$, anyway - physicists decided in the 1980s to redefine the units so that both 1 meter and 1 second are defined using the same type of electromagnetic wave. 1 meter was defined as 1/299,792,458 of a light second which, once again, agreed with the previous definition based on two different electromagnetic waves within the available accuracy.

The advantage is that the speed of light is now known exactly, by definition. Up to the inconvenient numerical factor of 299,792,458 - which is otherwise convenient for communicating with ordinary and not so ordinary people, especially those who have been trained to use meters and seconds based on the solar day and the meridian - it is as robust as the $c=1$ units.

Luboš Motl
  • 179,018
  • I'm still so confused. The meter is based on the speed of light and vice versa. It's cyclical, making them completely arbitrary. They're historically based on the earth's circumference, but we know that's inconsistent. So are we basing both measurements on an inconsistent measurement? To me it seems like we have a cyclical measurement based on an inaccurate measurement, which really doesn't help. Is there a reliable constant that the meter and speed of light are based on? – David Hobs Apr 09 '16 at 00:18
  • 1
    David, one meter is a unit, a purely human convention. Be sure that the speed of light, which is a fact about Nature, doesn't "depend" on the choice of one meter or anything man-made in general. Even without humans, there would be light and it would move by a speed. Nature doesn't need humans or their inventions and conventions for that. It's just complete nonsense that there is something inaccurate about the definition of 1 meter or the speed of light we know. – Luboš Motl Apr 10 '16 at 08:46
  • 2
    One meter is just defined in such a way that the speed of light in SI is totally accurately known, 299 792 458. Now, with this well-defined meter (assuming that you are OK with the definition of 1 second and you can reconstruct it), one may measure distances. The question is how accurately we may measure distances in terms of 1 meter as it is currently defined. The accuracy is amazingly good because when we switched to the new definition, the accuracy of $c$ was already good - and we could measure $c$ in terms of the old meter - so the accuracy hasn't "dropped". – Luboš Motl Apr 10 '16 at 08:48
  • @LubošMotl can you please tell me where you say "assuming that you are OK with the definition of 1 second and you can reconstruct it". So you agree, that the second is still defined by the Cesium atom right? So the second is not defined by the speed of light. – Árpád Szendrei Aug 12 '18 at 05:41
  • Yes, a second is defined through the atom and the speed of light doesn't appear in the definition. I've never claimed otherwise. – Luboš Motl Aug 28 '18 at 05:17
  • We could've easily made it 299792460, though. There'd only be a 0.00000067% difference in the definition of the meter. Plus, that way, we'd have a speed of light divisible by 3, 4, and 5. Which...I suppose isn't that important, but still. – Math Machine Jul 15 '21 at 04:23
  • The error margin when the decision was made was just 1.2 meters per second, I remember it very well, so yours would be almost 2 sigma off. Surely this error is considered more important than some unimpressive numerological coincidence. – Luboš Motl Jul 16 '21 at 07:17
17

The number is arbitrary. We could have chosen it to be 1 m/s, but that would have been a little awkward, as we'd be going around telling people to go left at the next corner, 0.00000084 meters down the road.
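To make the awkwardness concrete, here is a toy conversion (a sketch; in the hypothetical unit system with $c = 1$ m/s, one new "meter" equals 299,792,458 old meters, and 252 m is just an illustrative distance to the corner):

```python
c = 299_792_458  # old metres per new "metre" if we set c = 1 m/s

def to_new_units(old_metres):
    """Convert a distance in old metres to the hypothetical c = 1 unit."""
    return old_metres / c

# A corner roughly 252 old metres down the road:
print(f"{to_new_units(252):.8f}")  # 0.00000084
```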

It used to be that the meter was defined in terms of two scratches on a stick. Therefore, we could measure the speed of light experimentally. Before redefining things they measured as accurately as they could and then used the result as the new official standard.

Defining the speed of light means that the meter is no longer defined in terms of scratches on a stick. If it were, we'd have the second, the meter, and the speed of light all defined independently, while really they are dependent. That means that future, more accurate measurements could show that our system was inconsistent. As a result, the meter is now simply the distance that light moves in 1/299792458 seconds.

I think the main motivation behind this is that speed of light is always the same everywhere (for a local measurement) and is fairly easy to measure. Before the redefinition, the accuracy in the measurement of the speed of light was getting to be so good that it was limited by the accuracy with which we could measure a meter.

Many precise distance measurements are actually made using light rather than meter sticks. See, for example, LIDAR, in which light is bounced off a target and you measure how long it takes to come back. Precise distance measurements (or measurements of a change in distance) can also be made with light using interferometry. See LIGO, for example.

Mark Eichenlaub
  • 52,955
16

It's a rather nice historical full circle.

The meter was originally defined as 1/10,000,000 of the distance from the equator to the north pole - through Paris - as Luboš explained above.

This was dreamed up by some guys in the French revolution with the idea that units of physics should depend only on the universe (or at least Paris - which is pretty much the same thing) and not on historic details like the length of a king's arm or foot.

This also meant that you could independently arrive at your own impartial measurement of the unit - there was no master metre in a vault somewhere. Thus making all men, and their measurements, equal.

Like many of the French revolution's ideas this didn't quite work out exactly as planned - the survey was off - and soon the demands of industry meant that the only sufficiently accurate standard was a master metre in a vault somewhere.

By redefining the metre in terms of the speed of light - which is universal (even outside Paris) - we are going back to that original noble principle.

Extra Note: You don't need to go to the north pole. It's easy to measure the latitude (the angle north/south of the equator) of a place with just a telescope; you can then measure the distance on the ground between two places north/south of each other and get a definition of the metre. This led to the discovery that the Earth isn't a sphere, so the simple procedure doesn't actually work, but it did lead to a whole new science of measuring the Earth.

2

For historical reasons. Before the current definition, a meter was defined as the length of some piece of material, the prototype meter bar. Then the speed of light was measured based on that length, and it turned out to be 299792458 lengths of the bar per second. This was then used to redefine the meter as the distance light travels in vacuum in 1/299792458 seconds, so that the old and new definitions match as closely as possible.

noah
  • 10,324
  • In between the prototype metre bars (actually, three of them: first a provisional brass prototype bar made in 1795, then the first official platinum prototype bar made in 1799, and finally an improved platinum-iridium prototype bar made in 1889) and the current definition, there was a 23-year period (1960 to 1983) where the metre was defined as a specific number of wavelengths of a specific frequency of light. – David Hammen Jun 16 '17 at 14:19
2

Clearly calculations would be easier if we had $c\stackrel{def}{=}3\times10^8{\rm m\,s^{-1}}$ and other physical constants would be convenient values: the freespace electric constant would be $\frac{1}{36\,\pi}\times 10^{-9}{\rm F\,m^{-1}}$ and the freespace wave impedance would be $120\pi\,\Omega$.
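The "convenient values" above are easy to check numerically (a sketch using the pre-2019 convention $\mu_0 = 4\pi\times10^{-7}\rm\,H\,m^{-1}$ and the hypothetical rounded $c$):

```python
import math

c = 3e8                   # hypothetical rounded speed of light, m/s
mu0 = 4 * math.pi * 1e-7  # pre-2019 exact magnetic constant, H/m

eps0 = 1 / (mu0 * c**2)   # freespace electric constant, F/m
Z0 = mu0 * c              # freespace wave impedance, ohms

print(eps0)  # equals (1/(36*pi)) * 1e-9, about 8.84e-12 F/m
print(Z0)    # equals 120*pi, about 376.99 ohms
```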

However, a primary goal in revising metrology standards is to cause the smallest possible disruption to former practice when any changes are made. Revised systems of measurement have to be as backwards compatible as possible. There was already a definition of the meter in place, and that ultimately came from the physical meter bar, as described in noah's answer. Changing the meter so that the speed of light shifted from $299792458{\rm\,m\,s^{-1}}$ to $300\,000\,000{\rm\,m\,s^{-1}}$ would be a change of about 700 parts per million ($7\times 10^{-4}$), and that would cause major disruption to current practices in science, industry and just about every human endeavor.
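The 700 parts-per-million figure is straightforward to verify (a sketch of the hypothetical rounding of $c$):

```python
c_actual = 299_792_458   # defined speed of light, m/s
c_rounded = 300_000_000  # hypothetical "nice" value, m/s

# Relative change the meter would undergo if c were redefined as c_rounded.
relative_shift = (c_rounded - c_actual) / c_actual
print(f"{relative_shift:.2e}")  # 6.92e-04, i.e. about 700 parts per million
```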

0

The metre was defined (by a physical platinum bar in Paris) two hundred years - eight generations - before the speed of light was found to be a good universal constant. The length of a metre was used in commerce, the sheet metal industry, clothing fabric inventory, etc. So, when we discovered that the speed of light was a good universal constant, we wanted to relate the length that light travels in 1 second to the metre, but we didn't want to change the original definition of the metre too much. The number 299,792,458 is a good compromise.