I am having trouble understanding something: the concept of time measurement.
So I want to simplify this as much as possible to get an intuitive understanding:
We define the second as the time it takes cesium to oscillate 9.192631770 x 10^9 times. So in my head I am picturing someone watching a cesium atom, counting 1, 2, 3, 4, 5, ... all the way up to 9,192,631,770, and then: BOOM, a second. That right there is a second; mark it down.
My questions:
(1) Why is this a second, i.e. why this specific number? In essence, isn't the second defined by timing ONE oscillation of cesium and multiplying that duration by 9.192631770 x 10^9? And since we can count these oscillations, doesn't this give a timing precision of about (1 s / 9.192631770 x 10^9) ≈ 10^-10 seconds?
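To sanity-check the arithmetic in (1), a quick sketch (the frequency value is the exact one from the SI definition of the second):

```python
# If one "tick" is a single cesium oscillation, the smallest time step
# we can resolve is one period of that oscillation.
CS_FREQ_HZ = 9_192_631_770  # cesium-133 hyperfine transition frequency (exact, by definition)

period_s = 1 / CS_FREQ_HZ   # duration of one oscillation
print(f"one cesium period = {period_s:.3e} s")  # ~1.088e-10 s, i.e. about 10^-10 s
```

So counting whole cesium cycles does indeed slice a second into pieces of roughly 10^-10 s each.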
(2) I am reading about optical clocks and how they can improve accuracy in timekeeping by reducing the uncertainty, which can be done using frequency combs. I understand how we can improve our precision: we just need to be able to count the oscillations of a light wave at a much higher frequency. But everywhere I read, there is talk of a "bridge" between the microwave radiation of cesium and optical radiation, and I don't fully understand the connection. Say I am sitting there counting oscillations of the optical light: 1, 2, 3, 4, 5 in some amount of time X. My precision will then be (X/5), a very small number, much smaller than the ~10^-10 s we had for the cesium clock. And say, for argument's sake, the second were defined as the time it takes cesium to oscillate once. Is an optical clock then essentially counting the number of times the optical mode oscillates between consecutive cesium oscillations, e.g. from cesium oscillation = 0 to cesium oscillation = 1?
Whereas currently the second is the time it takes to go from cesium oscillation = 0 to cesium oscillation = 9.192631770 x 10^9?
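The "finer ruler" picture in (2) can be put into numbers. This is only an order-of-magnitude sketch: 4.3 x 10^14 Hz is an assumed, representative visible-light frequency, not any particular clock transition.

```python
# An optical transition oscillates tens of thousands of times per cesium
# microwave cycle, so counting whole optical cycles slices time much more finely.
CS_FREQ_HZ = 9_192_631_770    # microwave cesium standard (exact, by definition)
OPTICAL_FREQ_HZ = 4.3e14      # assumed visible-light frequency (~430 THz, order of magnitude only)

print(f"cesium tick:  {1 / CS_FREQ_HZ:.1e} s")       # ~1.1e-10 s
print(f"optical tick: {1 / OPTICAL_FREQ_HZ:.1e} s")  # ~2.3e-15 s
print(f"optical cycles per cesium cycle: {OPTICAL_FREQ_HZ / CS_FREQ_HZ:.1f}")
```

So under this assumption there are on the order of 10^4 to 10^5 optical oscillations per single cesium oscillation, which is why a direct electronic counter (fine for ~10 GHz) cannot keep up with optical frequencies and some kind of bridge is needed.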
(3) How do frequency combs aid in this matter?