
By its official definition, a second is the

duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom.

Why specifically 9,192,631,770? Is there a scientific reason for that particular number?

And if there is, why is the cesium atomic clock the most accurate method for realizing it?

Qmechanic
HyperLuminal
  • Somewhat duplicate to https://physics.stackexchange.com/q/73766/ – Ignacio Vergara Kausel Apr 27 '15 at 13:28
  • Yes, it's arbitrary. The purpose of that number is to make the official, modern second very close to the historical second $\frac{1}{86400}$ of one solar day. For a system of units that is less arbitrary (but less practical for everyday measurements) see http://en.wikipedia.org/wiki/Planck_units – Solomon Slow Apr 27 '15 at 13:39
  • The second had a definition before the invention of cesium clocks (or the earlier spectroscopy-based definitions). The standards labs then translated the definition over to a new standard, and had to make it as accurate as it had been before. Just be glad it isn't 9,192,631,771 periods. Is it arbitrary? Well, yes, at some point somebody defined a second and we are stuck with it. – Jon Custer Apr 27 '15 at 13:39
  • @ACuriousMind - I disagree. Questions about metrology are very on-topic, and this is a question about metrology. – David Hammen Apr 27 '15 at 15:45
  • @DavidHammen: Metrology is an iffy subject for on-topicness, it depends on the context of the question (see the link ACM provided). This particular one seems to fall afoul of Qmechanic's #2. – Kyle Kanos Apr 27 '15 at 19:01

2 Answers


The practice of dividing the degree used to measure angle into sixty minutes of arc, and that into sixty seconds of arc is over 2000 years old. The corresponding practice of dividing the hour used to measure time into sixty minutes, and that into sixty seconds, is over 1000 years old. Why sixty? That's over 5000 years old. The Sumerians and Babylonians used base 60 arithmetic.

Old practices die hard. In the case of angle and time, they haven't died yet. The French metric system promulgated in the late 18th century worked fantastically for mass and length (and related concepts such as area and volume). One key factor in this success was that no well-entrenched, universal standards for mass, length, area, and volume existed beforehand.

The French also tried to metricize angle and time; there they failed. Old practices die hard, particularly when they are very well standardized. We still use degrees, minutes, and seconds to describe angle, and hours, minutes, and seconds to describe time. With 24 hours in a day, 60 minutes in an hour, and 60 seconds in a minute, there are 24·60·60 = 86400 seconds in a day.

The success of the meter and kilogram and the failure of decimal angle and decimal time taught early metrologists something: when no standard exists, make one up; when a standard does exist, it's best to follow it. Now that we have well-established standards for everything that is physically measurable, metrologists follow the second rule. Redefinitions of a standard are always consistent with previous definitions. For example, the definition of the meter has changed multiple times. The current definition appears to be completely arbitrary, but it isn't: it's consistent with the initial definition of the meter.

The same goes for time. While the definition of a second has been refined many times, it has always been done in a manner consistent with the thousand-year-old concept that a second is 1/86400th of a day.
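The arithmetic behind the two numbers in play here can be checked in a few lines (a quick sketch; the constant names are mine, not part of any standard):

```python
# How the traditional subdivisions of the day give 86,400 seconds,
# and how the cesium definition expresses one such second.

HOURS_PER_DAY = 24
MINUTES_PER_HOUR = 60
SECONDS_PER_MINUTE = 60

seconds_per_day = HOURS_PER_DAY * MINUTES_PER_HOUR * SECONDS_PER_MINUTE
print(seconds_per_day)  # 86400

# The SI second is 9,192,631,770 periods of the Cs-133 hyperfine transition,
# so a mean solar day of 86,400 such seconds spans this many periods:
CS133_HYPERFINE_HZ = 9_192_631_770
periods_per_day = CS133_HYPERFINE_HZ * seconds_per_day
print(periods_per_day)  # 794243384928000
```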

David Hammen
  • Great! So it is arbitrary. Come to think of it now, every SI unit I can think of is arbitrary, in the sense that though they use fundamental constants, they take an arbitrary amount of it (e.g. the meter is the distance light travels in 1/299,792,458 of a second). Except for hertz, which represents one cycle. So is there any standard that can be based off pure constants, instead of an arbitrary portion of a constant or constants? – HyperLuminal Apr 27 '15 at 16:55
  • @HyperLuminal - It's not arbitrary. It's based on the length of a day (something that was important before science became science) and the fact that 60 is a very important number (60 is arguably a much better base than base 10). – David Hammen Apr 27 '15 at 17:01
  • But, 1. A day is not stable in length and is not a fundamental constant, and 2. 60 is still arbitrary. – HyperLuminal Apr 27 '15 at 17:03
  • @HyperLuminal - What, exactly, do you mean by a "pure constant"? One? Two? Pi? Two times pi? Four times pi? Those are "pure constants". Anything else, not so pure. There are systems of units based on those "pure constants" -- the speed of light is one, the universal gravitational constant is one, the Coulomb constant is one, the Boltzmann constant is one, and the Planck constant is two times pi. That's pretty pure. Those are the natural units. They aren't very useful except in theoretical physics. – David Hammen Apr 27 '15 at 17:06
  • Basically, the reason the second seems arbitrary is that it is based off 1/(24·60·60) of a day, which is arbitrary; is 1/(24·60·60) of a day constant everywhere? If I were to define time, I would define it by a non-shifting constant of nature. – HyperLuminal Apr 27 '15 at 17:11
  • @HyperLuminal - Other than "one" (in particular, one Planck time), there is no such beast. Everything else is arbitrary. If you were king and chose to define time as something markedly different from a second, you would be dethroned rather quickly. The definition of the second is not arbitrary. It has deep historic roots. – David Hammen Apr 27 '15 at 17:53
  • @HyperLuminal There are several systems of "natural units" which are based only on fundamental physical constants, but (a) it's not possible to get rid of all the scale factors that way, and (b) the unit quantities of each dimension (length, time, mass, etc) tend to be either way too big or way too small to be convenient for much outside of theoretical physics. – zwol Apr 27 '15 at 18:45
  • @DavidHammen - the experience in the French Revolution should be a warning to anyone who wants to tinker with historical time keeping traditions. See decimal time - 10 hours of 100 minutes of 100 seconds, which lasted about 6 months... not to mention the 10 day week. People were really losing their heads! – Floris Apr 04 '16 at 22:33

It's historical. The second was originally defined so that $60\cdot 60\cdot 24\,\rm s$ added up to a solar day. But that's a little hairy to measure, because the length of the day varies through the year. The sunrise-to-sunrise time varies from winter to summer. The noon-to-noon time interval, which would be operationally defined as the interval between solar meridian crossings, also varies through the year due to the eccentricity of Earth's orbit around the sun; the pattern it makes is called the analemma. So practically the second was historically defined so that $60\cdot 60\cdot 24\cdot 365.25\,\rm s$ added up to a year. And just to be precise, the original SI definition of the second was that it was the appropriate fraction of the year 1900 — nice and specific, but not repeatable.
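For concreteness, the naive count $60\cdot 60\cdot 24\cdot 365.25$ and the 1960 ephemeris-second definition (1/31,556,925.9747 of the tropical year 1900) don't quite agree; a quick sketch:

```python
# Sketch: comparing the naive "365.25 days of 86,400 seconds" year with
# the tropical year 1900 used in the 1960 ephemeris-second definition.

naive_seconds_per_year = 60 * 60 * 24 * 365.25   # Julian year: 31,557,600 s
tropical_year_1900_s = 31_556_925.9747           # from the 1960 SI definition

print(naive_seconds_per_year)                         # 31557600.0
print(naive_seconds_per_year - tropical_year_1900_s)  # ~674 s difference
```

The two differ by about 674 seconds, which is why the official fraction is "appropriate" rather than the round number.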

With this definition of the second, several new facts about nature were discovered:

  • There is an easy-to-define atomic transition between electronic excited states in cesium atoms with a frequency close to 9.2 GHz.
  • This frequency depends only on basic facts of nature, like the strength of the electromagnetic force and the masses of the electron and nucleons.
  • The rotational period of the Earth actually isn't all that stable. For example, a big earthquake, which moves a lot of rock by a (geologically) small distance, changes the Earth's moment of inertia, and the length of the day changes to conserve angular momentum. I dimly remember that the Christmas earthquake in 2004 lengthened the day by 5–10 μs, a change in the tenth significant figure.
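The moment-of-inertia point in the last bullet can be made quantitative with conservation of angular momentum, $L = I\omega$: for a rigid Earth, $\Delta T/T = \Delta I/I$. A minimal sketch, with an illustrative fractional change chosen by me (not a measured value):

```python
# Conservation of angular momentum: L = I * omega is fixed, so a fractional
# change dI/I in Earth's moment of inertia produces the same fractional
# change dT/T in the length of the day T.

DAY_S = 86_400.0  # length of the day in seconds

def day_length_change(fractional_dI):
    """Change in day length (seconds) for a given fractional change in I."""
    return DAY_S * fractional_dI

# An illustrative fractional change of 1e-10 shifts the day by ~8.64e-06 s
# (a few microseconds), i.e. in the tenth significant figure, as described.
print(day_length_change(1e-10))
```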

With the invention of the cesium fountain clock, we had a better frequency standard than the rotation of the Earth. So the best value for the transition frequency of this clock, in units of the prevailing definition of the second, was arbitrarily taken to be "the" second.

There is a similar transition planned for 2018 to define the kilogram in terms of the current best values for several fundamental constants.

I have heard murmurs that there is a new atomic-clock technology based on a faster (THz) transition in a different atom, which may eventually supplant the cesium standard, but I can't find anything online with a brief search.

rob
  • "So practically the second was historically defined so that 60•60•24•365.25 adds up to a year." - I don't think that part is right. – b_jonas Apr 27 '15 at 14:15
  • The next generation clocks will probably be based on optical transitions (optical clocks) – Martin J.H. Apr 27 '15 at 14:46
  • It implies that the length of the year has something to do with it, when it does not. It's the mean solar day. – BowlOfRed Apr 27 '15 at 18:30
  • @BowlOfRed See my link; the 1960 definition for the ephemeris second was "the fraction 1/31,556,925.9747 of the tropical year for 1900 January 0 at 12 hours ephemeris time." – rob Apr 27 '15 at 19:00
  • To this day, astronomers use 86400 seconds as the definition of a day, and 365.25 days as the definition of a year. Leap seconds and leap days are a royal pain. From a scientific/mathematical point of view, a time standard must somehow be uniform to qualify as reasonable. A time standard that involves leap seconds (UTC) or leap days (the Gregorian or Julian calendar) is not reasonable per this point of view. – David Hammen Apr 04 '16 at 22:57