6

Optical clocks, based on optical transitions in either cold atoms held in optical lattices or trapped ions, have been shown to achieve up to one million times better accuracy/precision than the cesium microwave atomic clock ($10^{-21}$ frequency stability for optical versus $10^{-15}$ for cesium). Nonetheless, the SI second is still based on cesium fountains.

Currently, what are the main hurdles stopping the second being redefined based on optical clocks, rather than cesium microwave clocks?

It is clear that cesium clocks (and other microwave clocks) will always be useful because of their compact and rugged nature compared to bulky optical clocks. However, that advantage alone should not be a showstopper.

Qmechanic
  • 201,751
KF Gauss
  • 7,823
  • To me this really seems like splitting hairs, since the optical-frequency strontium-based clock you are likely referencing is still considered a type of atomic clock. I'd imagine a lot of thought always goes into when new standards are put forth for such things. Hopefully someone can dive deeper and deliver a well-researched answer, as I'm just as interested. – Triatticus Jul 06 '23 at 01:11
  • @Triatticus Microwave clocks work based on microwave transitions, while optical clocks work based on optical transitions. The precision and accuracy you get from the latter are very different from the former. This isn't "splitting hairs" unless you don't care about a factor of over one million in precision? – KF Gauss Jul 06 '23 at 01:21
  • According to this paper (just doing a Google search) https://iopscience.iop.org/article/10.1088/1681-7575/ab3a82, one issue is that "many atomic species are potential contenders to become the new primary frequency standard." – Andrew Jul 06 '23 at 02:31
  • Another article: https://www.lne.fr/en/we-talk-about-it/redefinition-second-unit "Today, there is no doubt that optical clocks will eventually become the new time standards. However, there is still a long way to go to move from the laboratory to a robust and durable global metrological infrastructure capable of supplanting cesium." – Andrew Jul 06 '23 at 02:33
  • Yup, you have to get the majority of primary standards labs to evaluate the options and agree. The process of reaching a decision is long and involved, and the decision itself has to be backed up by lots of data. – Jon Custer Jul 06 '23 at 02:33
  • @KFGauss What I mean to say is that both are nonetheless known as atomic clocks, so calling one an atomic clock and the other an optical clock is the splitting-hairs part. I also acknowledged the multibillion-year clock you are likely talking about; even those researchers called such a device an atomic clock. – Triatticus Jul 06 '23 at 02:44

1 Answer

5

You really shouldn't rush decisions like this. Timekeeping is arguably the oldest branch of science, and it usually takes a long time for any change to the way people keep time to be decided upon and implemented. When the process is rushed, the results may be... undesirable. Bear in mind that timekeeping has legal ramifications, so it requires political consensus as well as scientific and engineering considerations.

It can be difficult to reverse bad time-related decisions, or to make substantial changes to timekeeping systems. For example, we still use the base-60 system of subdividing time that we inherited from the Babylonians. (The French did briefly try a metric version of time, but it was a disaster.)

The recent history of precision chronology has been a litany of committees that fail to reach decisions, with poor communication and misunderstandings between the various parties involved, punctuated by hasty decision making, and with technical details hidden behind paywalls, particularly in regard to leap seconds. (This has led to the embarrassing situation that the POSIX standard is internally inconsistent regarding leap seconds.)

For a summary of the sordid details, please see A brief history of time scales, by Steve Allen of the Lick Observatory. He has more info about issues related to time keeping (especially leap seconds) here.


As for creating a new standard for the second, we first need to decide which transition to use. It would be unfortunate to make a choice based on some current cutting-edge clock if a better design is just around the corner. It's not just about having high precision: the design also needs to be reasonably robust, so that it's not too hard to reproduce. We need a global network of clocks; a single master timekeeper is not sufficient. This gets tricky with extremely precise clocks, because such clocks are highly sensitive to various relativistic effects. The current atomic clock network already has to take clock altitude and latitude into account. Optical clocks are much more sensitive to altitude, and a global optical clock network would likely need to take the variations due to the Moon's gravitational potential into account explicitly.
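To get a feel for those numbers: to first order, the gravitational redshift between two clocks separated in height by Δh near Earth's surface is Δf/f ≈ gΔh/c². Here's a minimal back-of-the-envelope sketch (plain Python; the function name is mine, not from any metrology library):

```python
# Gravitational redshift between two clocks at different heights,
# to first order near Earth's surface: df/f ~ g*dh / c^2.
g = 9.81         # m/s^2, local gravitational acceleration
c = 299_792_458  # m/s, speed of light

def fractional_shift(dh_m):
    """Fractional frequency shift for a height difference of dh_m meters."""
    return g * dh_m / c**2

print(fractional_shift(1.0))  # ~1.1e-16 per meter of altitude

# A clock with a systematic uncertainty of 1e-18 can therefore resolve
# height changes of roughly a centimeter:
print(1e-18 / fractional_shift(1.0))  # ~0.009 m
```

So at the 1e-18 level, even centimeter-scale differences in clock height matter, which is why comparisons between clocks at this precision double as geodesy measurements.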

There's some talk of using a nuclear isomer transition for timekeeping: a nuclear clock. This is still a very experimental technique, but in theory a nuclear clock would be more robust, since nuclear transitions are much less sensitive to temperature and to stray electromagnetic fields than electronic transitions are.

FWIW, here's a table of atomic clock precision that I compiled for an earlier answer, with data courtesy of Wikipedia.

Atom             Type       Uncertainty
Cs-133           Beam       1e-13
Rb-87            Beam       1e-12
H-1              Beam       1e-15
Cs-133           Fountain   1e-16
Sr-87            Lattice    1e-17
Al+ (Mg+ logic)  Ion        8.6e-18
Yb-171           Lattice    1.6e-18
Al+              Ion        9.4e-19
Sr-87            Fermi gas  2.5e-19

"Beam" refers to a standard off-the-shelf beam maser. "Fountain" is an atomic fountain, that value is for NIST-F2. "Lattice" is an optical lattice. "Fermi gas" is a 3D quantum gas optical lattice.

NIST-F1 (also an atomic fountain) has an uncertainty around 5e-16. Together, NIST-F1 and NIST-F2 form the primary time & frequency reference for the USA.
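To put those exponents in perspective, a fractional frequency uncertainty translates directly into an accumulated time error. A rough sketch (plain Python; the values are the table entries above, and the constant-offset drift is a worst-case assumption for illustration):

```python
# Worst-case accumulated time error per year for a clock running at a
# constant fractional frequency offset equal to its uncertainty.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.16e7 s

def drift_per_year(fractional_uncertainty):
    """Accumulated error in seconds per year, assuming a constant offset."""
    return fractional_uncertainty * SECONDS_PER_YEAR

for name, u in [("Cs beam", 1e-13), ("Cs fountain", 1e-16),
                ("Sr lattice", 1e-17), ("Sr Fermi gas", 2.5e-19)]:
    print(f"{name:12s} {drift_per_year(u):.2e} s/year")

# Cs beam      3.16e-06 s/year  (a few microseconds)
# Cs fountain  3.16e-09 s/year  (a few nanoseconds)
# Sr lattice   3.16e-10 s/year
# Sr Fermi gas 7.89e-12 s/year  (a few picoseconds)
```

In other words, the best optical clocks would stay within a second over timescales longer than the age of the universe.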

PM 2Ring
  • 11,873