27

Having read that atomic clocks are more accurate than mechanical clocks as they lose a second only in millions of years, I wonder why it is necessary for a reference clock to worry about this, if the definition of the second itself is a function of the number of ticks the clock makes.

Why don't we just use a single simple mechanical clock somewhere with a wound up spring that makes it tick, and whenever it makes a tick, treat it as a second having elapsed?

(Assuming this clock was broadcasting its time via internet NTP servers to everyone in the world)

Qmechanic
  • 201,751
  • 16
    The most long-term accurate clocks are mechanical - rotating neutron stars. – Peter Mortensen Feb 22 '22 at 15:19
  • 9
    @PeterMortensen ...only when they are in a proper mood. https://en.wikipedia.org/wiki/Glitch_(astronomy) – fraxinus Feb 22 '22 at 18:33
  • 3
    It is not only necessary for a clock to be accurate, but also to be precise. Otherwise, repeatability of short-term time-dependent experiments suffers. – Glen Yates Feb 22 '22 at 22:09
  • The title is not particularly accurate/descriptive of the question being asked. It could currently be accurately answered with "because it's useful to know when things will happen" but that wouldn't actually address what OP is wondering about in the question body. – TylerH Feb 23 '22 at 16:42
  • 1
    Also, the second is specifically defined in the International System of Units: "The second is equal to the duration of 9192631770 periods of the radiation corresponding to the transition between the hyperfine levels of the unperturbed ground state of the 133Cs atom." from https://en.wikipedia.org/wiki/Second – Ryan Donovan Feb 23 '22 at 21:21

19 Answers

76

why it is necessary for a reference clock to worry about this, if the definition of the second itself is a function of the number of ticks the clock makes.

The concern is that somebody else (say a scientist in France or China or Botswana) needs to be able to build a clock that measures seconds at the same rate mine does.

If we both have atomic clocks, we can keep our clocks synchronized to within microseconds per year. If we have mechanical clocks, they might differ from each other by a second (or at least by some milliseconds) by the end of a year. If we're doing very exact measurements (comparing the arrival times of gamma rays from astronomical events at different parts of the Earth, or just using a GPS navigation system), then a few milliseconds (or even microseconds) can make a difference in our results.
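To put rough numbers on this, here is a quick back-of-the-envelope sketch in Python (my own illustration, not part of the original answer): a time-of-flight system such as GPS converts any clock error directly into a distance error at the speed of light.

    # How a clock error maps to a ranging error for a time-of-flight
    # measurement: distance error = c * time error.
    c = 299_792_458  # speed of light, m/s

    for clock_error_s in (1e-3, 1e-6, 30e-9):  # 1 ms, 1 us, ~GPS-level 30 ns
        print(f"clock error {clock_error_s:.0e} s -> position error {c * clock_error_s:,.1f} m")

A millisecond of disagreement is already hundreds of kilometres of position error, which is why the comments below insist on much tighter numbers.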

The Photon
  • 27,393
  • 50
    This is a nice answer, but it's probably important to note that your examples of required accuracy are multiple orders of magnitude too lax. For GPS, a microsecond will make a difference in your results (a 300 metre / 1000 foot difference); a millisecond offset means your distance measurements are hundreds of km/miles off. The GPS time system expects to ensure an accuracy of <30 nanoseconds (https://www.gps.gov/systems/gps/performance/accuracy/). Also, you need much-better-than-millisecond accuracy for many other things, e.g. modern cellular comms may need you to hit a specific 0.5 ms transmit window. – Peteris Feb 22 '22 at 02:30
  • 5
    Another concern is that for many applications it is important that two measurements even with the same clock are as close as possible. For example during the Olympics, or when measuring the same physical phenomenon repeatedly with an error estimate, etc. – Peter - Reinstate Monica Feb 22 '22 at 11:50
  • 18
    @Peteris But why should 1000 ft matter, he asks, as he stares at the cargo ship that just ran aground, or the self-driving vehicle that self-drove-itself off a cliff. – CGCampbell Feb 22 '22 at 13:20
  • 4
    @CGCampbell In his defense, it will not matter much longer. – DKNguyen Feb 22 '22 at 15:34
  • 3
    @Peteris edited – The Photon Feb 22 '22 at 15:37
  • @Peteris NB that there are multiple flavors of GPS, and the one you usually think of (C/A) is just to bootstrap into the more accurate versions. As such, it provides MUCH more accurate time than it needs for the coarse positioning we all use for navigation. – fectin Feb 22 '22 at 15:50
  • @Peteris: Radio protocols provide ways for clocks to sync with each other, which is why they don't need atomic clocks to hit that 0.5 ms transmit window after running independently for a year or something. You need a clock with some stability, e.g. a quartz crystal, and some logic to figure out a ratio between it and the actual frequency / timing interval other devices are using (a PLL). But that's fine, you don't also need a GPS receiver. Sub-millisecond accuracy after seeing something to sync with only a few seconds ago just means everything is happening fast (high frequency). – Peter Cordes Feb 22 '22 at 23:29
  • 2
    Additionally, the internet (well, the fiber optic communications backbone using the SDH and SONET protocols) requires atomic clocks to synchronize the two endpoints of an optical fiber. So without atomic clocks there won't be NTP, or the internet as we currently know it. – JanKanis Feb 23 '22 at 14:56
  • @JanKanis That's the name! SONET. I had forgotten what it was called and was trying to Google random names like SONUS to talk about it here but never did find it. – DKNguyen Mar 03 '22 at 20:20

42

For most of human history, we had a single mechanical clock: the spinning Earth.

Well, actually two mechanical clocks. The Earth’s spin rate is a good constant, but it’s tricky to measure directly. The thing that’s easy to measure is the interval between sunrises, but that gets longer and shorter from time to time. When sunrises are getting farther apart, the weather tends to be getting warmer, so noticing this was useful for agriculture. That’s because of the second mechanical clock: Earth’s orbit around the Sun.

Synchronizing the daily clock and the annual clock was a terrifically hard problem: the mismatch between a tropical year and a 365-day year is small over the duration of a human life. The solution was the Gregorian calendar, which was invented in the 1500s but not adopted worldwide until the twentieth century.
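As a rough illustration of how slow that mismatch is (my own numbers, using the approximate tropical year of 365.2422 days, not figures from the answer):

    # Drift of various calendar rules away from the tropical year (~365.2422 days).
    tropical_year = 365.2422  # days, approximate

    for name, length in [("plain 365-day year", 365.0),
                         ("Julian (leap day every 4 years)", 365.25),
                         ("Gregorian", 365.2425)]:
        drift = (length - tropical_year) * 100  # days of drift per century
        print(f"{name}: {drift:+.2f} days per century")

A quarter of a day per year is nearly invisible within one lifetime, but it adds up to weeks over a few centuries, which is exactly what made the problem so hard to notice and so slow to fix.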

You don’t care about accuracy if you only have one clock. It’s impossible to care: you can’t test the accuracy of a single clock. But if you have ten clocks, you can ask whether they all keep the same time, or whether they all diverge from each other, or whether eight of them stay together but two of them run slow.

rob
  • 89,569

35

Let's say we live in Plato's ideal city-state and I am the Philosopher King. I proclaim my sleeping cycle as the clock: one unit of time is the interval between two consecutive instances of me waking up. Equipped with this clock, you go on a quest to observe nature and understand its patterns. The world would look incredibly confusing, and you would not be able to notice any discernible patterns in its behavior. Sometimes two sunrises would happen in one unit of time-interval and sometimes none; sometimes you'd feel hungry six times in one unit of time-interval and sometimes only once; sometimes you'd be able to finish a given amount of work in one unit of time-interval, whereas sometimes the same work, done in the same manner, would take you multiple units; and so on.

You'd soon realize that if you instead use the Sun as your clock and use its two consecutive rises to define the unit interval of time, a lot of things would start looking more robust, more predictable. You'd see that you almost always feel hungry three times during one unit of time-interval, you always finish roughly the same amount of work in every unit of time-interval, etc.


The point is that a clock needs to be a mechanism that is reliably periodic, ideally perfectly periodic. As you can see, this is circular, but circularity is the wrong thing to focus on. The validity of the scheme comes from the fact that, as I illustrated in my example above, there are wrong answers to the question of what to proclaim as periodic, in that they won't be useful in finding discernible patterns in the universe. Furthermore, to a certain extent, you can argue that you have good reason to proclaim a system periodic even before specifying how to measure time, by appealing to symmetry. For example, you can say that the amount of time that passes during a simple pendulum's left-to-right swing must be equal to the amount of time that passes during the same pendulum's right-to-left swing. Of course, this proclamation of periodicity won't be good enough over a large number of oscillations of the pendulum. You'd notice this in the same way that you noticed that my sleeping pattern wasn't reliably periodic, and you'd go on to find an even more reliably periodic system.

So, the quest of finding more and more accurate clocks is the quest of going closer and closer to an ideal periodic system and this is important because the usefulness of the concept of time is not a penny more than the robustness of the periodicity of the clock that is used to define a unit of time.
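Here is a toy numerical version of that argument (my own construction, with invented numbers): a phenomenon that genuinely repeats every 8 "hours" looks erratic when counted against an irregular clock, and perfectly regular when counted against a steady one.

    import random

    random.seed(0)
    # A phenomenon that truly repeats every 8 "hours".
    event_times = [n * 8.0 for n in range(30)]

    def counts_per_tick(tick_lengths):
        """Count how many events fall inside each consecutive clock tick."""
        counts, t = [], 0.0
        for dt in tick_lengths:
            counts.append(sum(1 for e in event_times if t <= e < t + dt))
            t += dt
        return counts

    steady_ticks = [24.0] * 10                                       # a reliably periodic "day"
    jittery_ticks = [random.uniform(10.0, 40.0) for _ in range(10)]  # the king's sleep cycle

    print("steady clock :", counts_per_tick(steady_ticks))   # 3 events in every tick
    print("jittery clock:", counts_per_tick(jittery_ticks))  # the count jumps around

With the steady clock the "three per day" regularity is obvious; with the irregular one there is no pattern to find, even though the underlying phenomenon is exactly the same.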

  • 2
    Of course, perfect periodicity isn't always the most desirable trait in a time scheme, which is why Coordinated Universal Time (UTC) contains leap seconds - consistency with Earth's rotation was considered more important than perfectly linear time for everyday use – Arcanist Lupus Feb 22 '22 at 14:42
  • 8
    @ArcanistLupus Sure, that's not a physical concern though. That's just a practical/technological decision. As far as science/physics is concerned, periodicity is the point. –  Feb 22 '22 at 15:27
  • 1
    @ArcanistLupus There is a difference between clocks that measure relative time intervals and clocks ("calendars") that track absolute moments of time for civil purposes. – Rufflewind Feb 23 '22 at 03:32
  • 4
    @ArcanistLupus Leap seconds aren't a part of the clock. They are merely a part of the display. – Aron Feb 23 '22 at 04:04
  • @Aron: It depends on how the clock is implemented. Traditionally, Unix time "ignored" leap seconds, which implied a discontinuity at the moment the leap second was applied. But nowadays, it's becoming increasingly popular to "smear" the leap second (i.e. temporarily adjust the rate of the clock so it gains or loses a second over a 24-hour period), because a great deal of software doesn't like it when the clock abruptly changes (particularly when it goes backwards). – Kevin Feb 23 '22 at 21:15
  • 2
    @Kevin, Re, "...popular to "smear" the leap second..." I don't know about other operating systems, but when you ask for the time in Linux, you specify which one of a number of different "clocks" to interrogate. CLOCK_REALTIME gives you civil time, CLOCK_MONOTONIC_RAW gives you the as-close-as-we-can-get-to-perfectly periodic timing that Dvij D.C.'s answer talks about. Other CLOCK_XXXXX clocks give other guarantees. – Solomon Slow Feb 24 '22 at 00:35
  • 1
    @SolomonSlow: The OS has no idea that a leap second has happened. You basically have the NTP server lie about what time it is. So all clocks other than CLOCK_MONOTONIC will reflect the smearing (that clock doesn't, because it ignores NTP adjustments). The reason this is necessary is because a lot of software incorrectly uses CLOCK_REALTIME when it should have used CLOCK_MONOTONIC, and it is pathologically hard to find and fix all of those bugs, and prove that you have actually done so. – Kevin Feb 24 '22 at 00:37
  • Unix time (aka POSIX time) does not include leap seconds. Also see History of IEEE P1003.1 POSIX time & Epoch time vs. time of day. – PM 2Ring Feb 24 '22 at 04:16
  • @Kevin, Good point. I wasn't thinking about how, in the present scheme of things, the OS can't know any more about wall-clock time than what the NTP service tells it. OTOH, I still think it worth mentioning that Linux at least tries to accommodate the fact that there is more than one kind of "time" that a program might want to know about. – Solomon Slow Feb 24 '22 at 16:21

23

if the definition of the second itself is a function of the number of ticks the clock makes.

Why don't we just use a single simple mechanical clock somewhere with a wound up spring that makes it tick, and whenever it makes a tick, treat it as a second having elapsed?

There is a misconception here. Seconds do not define time, nor do clocks define the passage of time. The universe defines the passage of time. Remember that the clock is being used to measure time; the clock doesn't define time. In other words, we are trying to track the "universe's clock", which does define the passage of time. Our definition of the second is just a way to quantify it.

Implicit in this accuracy/precision is that each cycle of the clock is as identical as possible from moment to moment (or rather, tracks the universe's passage of time, which is where time dilation gets gnarly). That's what really matters, not so much losing one second every million years; that part is secondary.

Really, the precision of the clock comes before the accuracy. What matters most is how repeatable each interval is. The accuracy only comes into play when you have a standard you're trying to live up to, such as the theoretical definition of a second, another time standard, or other clocks.
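A minimal numerical sketch of that distinction (the tick data below are invented for illustration, not real measurements): precision is the spread of the tick lengths, accuracy is how far their average is from the defined second.

    from statistics import mean, stdev

    # Hypothetical measured tick lengths, in seconds, for two imaginary clocks.
    spring_clock = [1.003, 0.998, 1.010, 0.991, 1.005]       # decent on average, poor repeatability
    quartz_clock = [1.0004, 1.0004, 1.0005, 1.0004, 1.0004]  # very repeatable, slightly off on average

    for name, ticks in [("spring clock", spring_clock), ("quartz clock", quartz_clock)]:
        print(f"{name}: accuracy error {mean(ticks) - 1.0:+.5f} s per tick, "
              f"precision (spread) {stdev(ticks):.5f} s")

The second clock is the better timekeeper for measuring intervals even though its average tick is not exactly one second: a repeatable error can be calibrated out, an unpredictable one cannot.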

Why don't we just use a single simple mechanical clock somewhere with a wound up spring that makes it tick, and whenever it makes a tick, treat it as a second having elapsed?

(Assuming this clock was broadcasting its time via internet NTP servers to everyone in the world)

Because it's not just about making sure the bus schedule lines up properly for everyone.

Others have mentioned that it helps when you have multiple clocks, but it matters even when you only have one clock, because if you are using that clock to measure physical phenomena, those phenomena are already running on the universe's clock. When you are using a timing device, there are always "two clocks" present, in a sense.

And the more precise a clock is, the more finely you can subdivide the intervals and still have them be meaningful in order to measure very fast events, or very small time differences between two events.

DKNguyen
  • 9,309
  • The real answer! The point of a clock (or any measuring device, really) is to lack volatility in its units. – Jivan Pal Feb 23 '22 at 22:07

15

A central clock system has a lot of drawbacks:

  • broadcasting means the signal takes time to reach me, so my local copy of the time will always be somewhat behind. This cannot be corrected to the precision necessary for many modern applications (see the rough numbers after this list)
  • if the clock fails, then what? Do I replace it with a new one? But will that be precise enough? If the new clock ticks differently, all my timings will be off. Maybe my GPS location depends on me being able to measure with millisecond precision, and changing the effective length of a tick changes the mathematics. So you need to synchronize clocks and make sure they don't change the definition of "a second", and that's exactly what atomic clocks are good at. That's what it means to have two atomic clocks that differ only by a tiny amount of time after x years: they are extremely precise!
  • what if I can't get the signal (ever used a cellphone?)? Then I need my own clock for the time I don't have a signal, and if it's not accurate enough, maybe my application fails
  • there really are applications that need to measure extremely small fractions of a second, so just getting each second to be approximately the same length doesn't cut it. But you can't broadcast at an arbitrarily high frequency, so how do I subdivide my second finely enough? I can do it with a local clock; I just can't do it with a radio clock...
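To put a rough number on the first point (my own estimate, assuming a simple straight-line signal path): even at the speed of light, a broadcast from one central clock arrives noticeably late, and far later than tight timing applications can tolerate.

    # One-way delay of a time signal travelling a quarter of the way around
    # the Earth, compared with a GPS-level timing requirement (~30 ns).
    c = 299_792_458          # speed of light, m/s
    distance_m = 10_000_000  # ~10,000 km, roughly a quarter of Earth's circumference

    delay_s = distance_m / c
    print(f"propagation delay: {delay_s * 1e3:.1f} ms")   # about 33 ms
    print(f"GPS timing budget: {30e-9 * 1e3:.6f} ms")     # 30 ns

In practice the path length is neither constant nor precisely known, so this lag cannot simply be subtracted out to the nanosecond level.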

And now for the biggest physical reason:

  • the way the clock ticks depends on the reference frame of the clock. If I put the clock on a moving train, it will run slower according to all those watching the train (special relativity). If I move it to the top of a mountain, it will run a tiny bit faster (general relativity). This means that the definition of time itself depends on the frame of reference and an "absolute time" does not exist. Therefore, we need to pick a definition which is independent of the frame of reference and use it to measure time within each frame of reference, so we NEED to have several clocks.
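For a sense of scale (a rough estimate using the standard weak-field formulas and my own example numbers, not figures from the answer): the rate shifts are tiny, but they are exactly the kind of thing atomic clocks resolve.

    # Approximate fractional rate shifts for the two effects mentioned above:
    #   special relativity: a clock moving at speed v runs slow by ~v^2 / (2 c^2)
    #   general relativity: a clock raised by height h runs fast by ~g h / c^2
    c = 299_792_458      # speed of light, m/s
    g = 9.81             # gravitational acceleration, m/s^2
    seconds_per_year = 365.25 * 24 * 3600

    v_train = 300 / 3.6  # a 300 km/h train, in m/s
    h_mountain = 3000.0  # a 3000 m mountain, in m

    slow = v_train**2 / (2 * c**2)  # fractional slowdown of the moving clock
    fast = g * h_mountain / c**2    # fractional speed-up of the elevated clock

    print(f"train at 300 km/h: loses about {slow * seconds_per_year * 1e9:,.0f} ns per year")
    print(f"clock 3000 m up:   gains about {fast * seconds_per_year * 1e9:,.0f} ns per year")

Microseconds per year are hopeless to resolve with a spring-driven clock, but they are routine for atomic clocks, which is why the disagreement between clocks in different frames is something we actually have to deal with.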
Martin
  • 15,550