63

Why is a second equal to the duration of 9,192,631,770 periods of radiation corresponding to the transition between two hyperfine levels of the ground state of the caesium-133 atom?

Why is the number of periods so complicated?

It could be any simple number, why is it exactly 9,192,631,770?

Řídící
  • 6,725
A. Vats
  • 757

3 Answers

119

That number, 9192631770, was chosen to make the new definition of the second as close as possible to the older, less precise definition. This means that, except for the most precise measurements, instruments calibrated before the new second was defined would not have to be recalibrated.
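
As a rough illustration of that backward compatibility, here is a back-of-the-envelope sketch. The figure of 9 192 631 770 ± 20 cycles per (ephemeris) second comes from the pre-1967 caesium measurements and is an assumption added here, not a claim from this answer:

```python
# Sketch, assuming the caesium hyperfine frequency had been measured against
# the old (ephemeris) second as roughly 9_192_631_770 +/- 20 cycles per second.
measured_cycles = 9_192_631_770   # best estimate, cycles per old second
uncertainty = 20                  # quoted measurement uncertainty, cycles

# Defining the new second as exactly the measured integer keeps it equal to
# the old second to within the measurement uncertainty:
fractional_shift = uncertainty / measured_cycles
print(f"worst-case fractional change of the second: {fractional_shift:.1e}")
# ~2.2e-09, far smaller than anything pre-1967 instruments could resolve
```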

Mark H
  • 24,002
Farcher
  • 95,680
87

It's a definition of a unit, which is an arbitrary choice. In the past we used to define a second as 1⁄86,400 of a mean solar day, and later as "the fraction 1/31,556,925.9747 of the tropical year for 1900 January 0 at 12 hours ephemeris time", but both are pretty poor ways of measuring time: the Earth's motion in the solar system is subject to perturbations and to changes in the planet's mass distribution (winds in the atmosphere, ocean currents, and even large earthquakes alter the length of a day, although the earthquake contribution is tiny compared to the "noise" from the former).

When we invented atomic clocks we had better ways of defining the basic unit of time. The currently accepted definition is "9192631770 cycles of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom". This definition, too, has shortcomings. We now have better atomic clocks than those one can build with cesium atoms, so it can be expected that the definition will change as soon as the national and international bodies responsible for these definitions decide to act on the availability of better clocks.

In the past we also used to define the meter in terms of the wavelength of red-orange light from an optical line of krypton-86. That made the speed of light a measured quantity. On the other hand, one of our best-tested physical facts is that the speed of light is a constant, so we should treat it as one in the way we define our units. Hence we now define the speed of light as a simple numerical constant and the meter as the distance light can traverse in a given time. The definitions of meter and second are therefore linked for the future by one constant factor.
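
To make that linkage concrete, here is a minimal sketch; the exact value of c is the defined SI constant, and the rest is just unit arithmetic added for illustration:

```python
# The speed of light is an exact, defined constant in the SI (since 1983).
c = 299_792_458  # metres per second, exact by definition

# The metre is then whatever distance light covers in 1/c seconds:
metre_travel_time = 1 / c
print(f"1 metre = distance light travels in {metre_travel_time:.9e} s")

# Equivalently, once the second is fixed by the caesium transition,
# one second of light travel pins down a distance of exactly c metres:
print(f"light travels {c} m in one second")
```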

If we could measure distances with higher precision than we can measure time (we can't, and it is unlikely that we ever will), then we would make a new physical definition for the meter and use the constant defined value of the speed of light to derive one second as the time light takes to traverse a certain distance.

If relativity holds strictly, then the two ways of defining distance/time are equivalent and we can always choose the definition that is the most precise and reproducible.

CuriousOne
  • 16,318
  • 5
    Did you intentionally avoid the "why" (i.e., backwards compatibility) as it had been addressed in other answers? If so, you should have mentioned so. +1 anyway for a clear conceptual explanation and anecdotal information. – Tamoghna Chowdhury Mar 13 '16 at 14:42
  • 3
    @TamoghnaChowdhury: Good point! No, I didn't do that intentionally. I guess it was too "obvious" to mention, but you are correct, it should have been there. – CuriousOne Mar 13 '16 at 18:32
  • 1
    While correct and informative, this answer does not in fact address the question as posed. – Emilio Pisanty Mar 13 '16 at 19:11
  • 4
    @EmilioPisanty: It does address where the "second" originally came from. I wasn't aware that the OP was struggling on such a basic level as to not understand that one would want a new definition to be backward compatible. At the same time, as a now-deleted false answer showed, not everybody is aware that units, at the end of the day, are arbitrary. To me the latter is a much more important fact than the compatibility. – CuriousOne Mar 13 '16 at 19:15
  • 4
    Thence "correct" and "informative". But this still fails to address "why is it exactly 9192631770?". – Emilio Pisanty Mar 13 '16 at 19:18
  • @EmilioPisanty: It fails to address that because I am not in the heads of the Conférence générale des poids et mesures. I can't address that question. It would take a historian of science to say what made them choose exactly ...770. Have you been there? Did you hear and read the arguments? :-) Don't get me wrong... but even as a working physicist back in the day I would have been just as happy with ...771 or ...769. Have you ever done an experiment where the difference would have mattered? – CuriousOne Mar 13 '16 at 19:25
  • Luckily, that's what the scientific literature is for, particularly the large output of such conferences documenting their decision-making process in metrological journals. I don't mean to pick a fight - as I said, nothing in this answer is wrong - but there is an answer to the question as posed. – Emilio Pisanty Mar 13 '16 at 19:31
  • @EmilioPisanty: By the site rules you are welcome to vote the question as off-topic and send it to history of science. You are also welcome to browse the science literature and find the information that you think is missing. To me, as a physicist, the EXACT choice is completely irrelevant. To me, as a physicist, any choice of units is ultimately irrelevant. To a metrologist sitting on the committees it may mean the start of a decade long dog-fight for a digit in the 10th or 11th place. :-) – CuriousOne Mar 13 '16 at 19:35
  • Minor comment, not worthy of a downvote: The parenthetic remark (large earthquakes measurably alter the length of a day) is incorrect. Change measurably to theoretically and it will be correct. The change in length of day due to earthquakes is smaller than the noise in the signal, so much smaller that it's perhaps dubious that the change in length of day due to even the largest earthquake would ever be measurable. – David Hammen Mar 13 '16 at 20:44
  • @DavidHammen: You are correct. I will change that. – CuriousOne Mar 13 '16 at 20:51
  • I thought large earthquakes do show up in the graph of actual day lengths. E.g. reduce the period to where it would have been after several weeks of uniform change. – JDługosz Mar 13 '16 at 23:12
  • 2
    @JDługosz: David Hammen is correct. The changes by a large earthquake are $\mu s$ effects that can be simulated, but they are very hard to actually detect on top of the $ms$ background noise caused by the shifting of atmosphere and water. I am still looking for a paper that has one teased out of the data. If you have a citation, I would love to see it. – CuriousOne Mar 13 '16 at 23:22
  • 3
    Um, "both are pretty poor ways of measuring time" is false when the purpose of "time" is to coordinate human activity. The only (pardon the pun) time when SI time makes sense is when you're using time as a physical unit for physics or extremely-high-precision engineering; the rest of the time it's a nuisance (see: leap seconds). – R.. GitHub STOP HELPING ICE Mar 14 '16 at 01:27
  • @CuriousOne: Leap days (years) and leap seconds are completely different topics. The former are predictable/computable because they just reflect a fixed numeric relationship. The latter are produced by observation and correction to reconcile completely different physical bases of time to match, and require a source of authority and means of authoritative input. – R.. GitHub STOP HELPING ICE Mar 14 '16 at 03:07
  • @CuriousOne: Nature has nothing to do with it; it's a matter of human convention to decide that you want to use SI seconds rather than solar seconds (1/86400 days) to coordinate activity. The former is TAI (raw) or UTC (unpredictably adjusted to avoid desync with solar time); the latter is UT1. – R.. GitHub STOP HELPING ICE Mar 14 '16 at 03:50
  • 1
    @CuriousOne: I said that from the very beginning of my comments - see: "is false when the purpose of 'time' is to coordinate human activity" - and was clear that you need a physical unit of time for high-precision use in some scientific and technical applications. My only point was that a "physicist's viewpoint" that solar seconds and the like are "poor ways of measuring time" is biased towards a particular usage case for time that conflicts with the usage cases of everyone else. – R.. GitHub STOP HELPING ICE Mar 14 '16 at 04:08
  • 1
    FWIW, the backwards compatibility means that the SI second is 1/86400 of the mean solar day from around 1820, as I mentioned here. – PM 2Ring Nov 13 '21 at 19:16
24

Most physical units have to be defined in terms of something measurable, and a good definition of a physical unit is one in which the measurement of the unit is very precisely repeatable.

Since prehistory, a day was a very natural way of measuring time, and was highly repeatable, in that the procedure of measuring a day can be performed anywhere on Earth with essentially the same result. The day as a unit then got split into two, for (roughly) sunrise to sunset versus sunset to sunrise, and each half of a day got split into 12 hours. Splitting something into 12 parts was a natural choice, due to the widespread use of a duodecimal (base 12) numbering system in ancient Sumer and India at the time. The later splitting of an hour into 60 minutes, and still later of a minute into 60 seconds, were natural choices due to the use in other cultures of sexagesimal (base 60) numbering systems. That definition of a second as $\frac{1}{24 \times 60 \times 60}$ of a mean solar day was in use from the time it was defined by the Persian scholar al-Biruni 1,016 years ago until 1967.
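
A trivial check of the arithmetic above; nothing here goes beyond the numbers already in the answer:

```python
# Two 12-hour halves of the day, 60 minutes per hour, 60 seconds per minute.
halves, hours_per_half, minutes_per_hour, seconds_per_minute = 2, 12, 60, 60
seconds_per_day = halves * hours_per_half * minutes_per_hour * seconds_per_minute
print(seconds_per_day)  # 86400, so 1 second = 1/86400 of a mean solar day
```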

However, although measuring time based on the length of a mean solar day was as precisely repeatable a definition of time units as one could hope for over many centuries, astronomical observations in the 1800s and 1900s showed that the duration of the mean solar day wasn't precisely constant, but instead was very gradually getting longer, making the mean solar day a less desirable basis for defining time units. In 1967, the radiation from the transition between the two hyperfine levels of the ground state of cesium-133 provided about the most precisely repeatable measurement of time that was technologically possible, and certainly more precisely repeatable than measuring the duration of a mean solar day, so in 1967 the definition of a second was changed to be based on cesium-133.

However, redefining the second to be something like 10000000000 of those cesium periods, just because 10000000000 is a "nice" number using modern base 10 numbering, would be a hugely disruptive change for all those people (everybody) who had been using a second as it had been defined for the previous 967 years. To minimize that disruption, the new definition of a second was made to be as close as possible to the same amount of time as the ancient definition of a second.
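
As a rough illustration of how disruptive such a "round" choice would have been, here is a small sketch; the comparison itself is an addition for illustration, not a claim from the answer above:

```python
# Length of the traditional second, expressed in caesium cycles, versus a
# hypothetical "nice" base-10 redefinition.
traditional_second_cycles = 9_192_631_770
round_second_cycles = 10_000_000_000

stretch = round_second_cycles / traditional_second_cycles
print(f"a 'round' second would be {(stretch - 1) * 100:.1f}% longer")
# ~8.8% longer: every clock, timetable and calibrated instrument would be off
```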

It's helpful for performing precise calculations to define the second as an integer number of those cesium-133 periods, and 9192631770 cesium-133 periods was within the range of how long the old second was, i.e., within the experimental error of comparing those two durations as precisely as was technologically possible. So the second was defined to be precisely 9192631770 of those periods.

The above history of the definition of the second is slightly oversimplified; see Wikipedia's article on the second for a more detailed account.

Red Act
  • 7,736
  • Some day there will be a people who use base 37, and they will define a hutag as 37 decades, and a mullir as a 37th of a millihour. – Christopher King Mar 13 '16 at 22:26
  • 1
    Measuring a mean solar day is not that easy: there are only four times a year when an actual solar day is exactly 24 hours long and even those are usually not midnight to midnight, or midday to midday, or dawn to dawn, or dusk to dusk. So it takes time to measure a mean solar day, during which its length may actually change. – Henry Mar 14 '16 at 11:29
  • @PyRulez Why would somebody who counts base 37 think that decades and millihours are interesting? :-P – David Richerby Mar 14 '16 at 20:45
  • @DavidRicherby why would someone who counts base 10 think years and hours are interesting? – Christopher King Mar 14 '16 at 22:44
  • @PyRulez Years are interesting at non-equatorial latitudes because of the seasons. Hours, sure -- we could use any other division of the day (inherently interesting because of our own daily cycles of activity) into other periods of a similar sort of length. But my point is that DECades and MILLIhours are only related to "natural" timescales by the factors of ten, so a base 37-civilization would be unlikely to even consider them. – David Richerby Mar 14 '16 at 22:47
  • @DavidRicherby the hour, minute and second are related to natural time scales thorough base 12 and 60 (like it says in the post). – Christopher King Mar 14 '16 at 22:49
  • 1
    Love how you explain why the second is as long as it is. I would also like to add that those people using sexagesimal (yeah, copy-pasted that :) numbering were geniuses! With that system, you get nice round numbers if you divide by 2, 3, 4, 5, 6, 10 and 12. They should package cookies and M&Ms etc in packs of 12 or 60 so you don't end up fighting over who gets the last one! – Stijn de Witt Mar 15 '16 at 21:55