3

This question is motivated by me trying to finish my answer to this question.

In the 2019 SI redefinition the kg was redefined in terms of Planck's constant (and the second and meter):

$$ 1 \text{ kg} = \frac{h \times (1 \text{ s}) \times (1 \text{ m})^{-2}}{6.62607015\times 10^{-34}} $$
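
As a sanity check on the numbers (a quick numeric sketch, not part of any official *mise en pratique*), the defining constant can be inverted directly:

```python
# Exact defining constant of the 2019 SI: h = 6.62607015e-34 J s = kg m^2 s^-1
H_DEFINED = 6.62607015e-34

# Rearranging h = 6.62607015e-34 kg m^2 s^-1 gives
# 1 kg = h * (1 s) * (1 m)^-2 / 6.62607015e-34
coefficient = 1.0 / H_DEFINED

print(f"1 kg = {coefficient:.6e} * h s m^-2")  # ~1.509190e+33
```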

This means that to realize the kilogram you must have a measurement of mass that is directly related, in some way, to Planck's constant. This is currently done at precision metrology labs using the Kibble balance. The crude idea of the Kibble balance is that you use electromagnets to generate a force that cancels the weight of a mass (you independently measure the local acceleration due to gravity so you can convert weight to mass). You do some tricks and measure some voltages and currents related to the magnetic force you are leveraging. You actually measure those electrical parameters using calibrated electrical standards based on superconducting Josephson junctions (for voltage) and quantum Hall devices (for resistance). Because of the Josephson and quantum Hall effects, the voltages and resistances of these devices are quantized in units built from Planck's constant $h$ (and the elementary charge $e$). When you express the mass of the object in terms of those voltages and currents you can proceed to express it in terms of $h$ as needed for a primary mass standard.

A major downside of this method, even though it establishes a non-artifact-based SI standard, is that it requires monetarily and technically expensive superconducting Josephson junctions.

My question is: is there any way (even if it is inaccurate and imprecise) to realize a primary mass standard based on the current SI definition of the kg that does not rely on superconducting Josephson junctions? Or, more generally, that does not rely on the quantum Hall effect or any device that requires advanced fabrication techniques?

As a starting point, this youtube video explains how a voltage can pretty easily be calibrated to Planck's constant and the electron charge. Basically, an LED with wavelength $\lambda = c/\nu$ emits photons with energy $h\nu$. This means it won't turn on until the driving voltage satisfies $eV > h\nu$, i.e. $V > h\nu/e = hc/(e\lambda)$. If we assume the meter has been realized, then you can create a diffraction grating setup and use that to measure $\lambda$. The problems are, even though this voltage is calibrated to microscopic quantities: (1) the voltage depends on $e$ as well as $h$, and (2) the Kibble balance requires a precisely measured voltage AND current, and this method does not produce a calibrated current. (One could validly argue that LEDs are "devices that require advanced fabrication techniques", but I'd rebut that there is a qualitative difference between them and quantum Hall devices: I can go to the store and get an LED for \$0.05 [or pull one out of some electronic box in my house], and they don't require cryogenic cooling to operate.)
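
To get a feel for the numbers (a quick sketch using the exact SI values of $h$, $c$, and $e$; real LEDs turn on somewhat below this ideal voltage because of thermal excitation), the ideal turn-on voltage $V = hc/(e\lambda)$ for common LED colors works out to:

```python
# Exact SI defining constants (2019 redefinition)
h = 6.62607015e-34   # Planck constant, J s
c = 299_792_458      # speed of light, m/s
e = 1.602176634e-19  # elementary charge, C

def turn_on_voltage(wavelength_nm: float) -> float:
    """Ideal LED turn-on voltage V = h*c/(e*lambda), in volts."""
    return h * c / (e * wavelength_nm * 1e-9)

# Representative (illustrative) LED wavelengths
for color, wl in [("red", 625), ("green", 525), ("blue", 470)]:
    print(f"{color:5s} ({wl} nm): {turn_on_voltage(wl):.3f} V")
```

These land in the familiar 2–3 V range of real LED forward voltages, which is what makes the tabletop experiment plausible in the first place.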

A shorter way to ask this question might be: Is there a way, without the quantum Hall effect, to refer a current measurement to Planck's constant $h$?


update: I recalled an ultracold atomic physics experiment I had heard about, which involves kicking a single atom with a photon of known wavelength and watching how the atom recoils, as a way to measure mass that is related to Planck's constant. See Lan et al., Science Vol. 339, Issue 6119 (2013). Apologies for the paywall. While this approach gives an affirmative answer to the titular question, it still requires a very complex apparatus including many lasers, ultracold atoms, and an ultra-high-vacuum chamber. The question of a "simple" primary-standard realization of the kg remains.

Jagerber48

2 Answers

2

As a partial answer, I had a student several years ago try the LED experiment, and they got terrifically confusing results:

I was motivated to ask this question by a similar question about forward voltage drops in LEDs. I was expecting to answer that question with some data from a student using LED turn-on voltage and the wavelength of light to measure the Planck constant. However, those data are a lot more complicated than I expected: in fact, most of my LEDs apparently emit multiple wavelength components with comparable strength, and there doesn't seem to be much correlation between the turn-on voltage and the most prominent color in the LED spectrum. I don't seem to be able to say much more than "LEDs have turn-on voltages between two and three volts."

If you did have a set of LEDs where the relationship between turn-on voltage and output light frequency were linear, you might think you would be measuring $h$ directly. More correctly, you’d be using the defined value of $h$ to check the calibration on your voltmeter. But even then, a photon wavelength/frequency/energy measurement is a measurement of energy, most conveniently expressed in electron-volts. If your translation to energy comes from reading some voltmeter, you still need a way to back out the fundamental charge $e$.

The BIPM’s recipe calls for a measurement (see comment below) of both the Josephson constant $K_J = 2e/h$, with units $\rm Hz\,V^{-1}$, and the Hall effect’s von Klitzing constant $R_K = h/e^2$, which is a resistance, unit $\Omega = \rm V\,A^{-1}$. The way to think of these is that the Josephson constant turns your frequency reference (your cesium clock) into a voltage reference. The Hall resistance then allows you to measure currents, charges, and electrical energies. The Kibble balance is fundamentally a way to transfer this quantum-electrical calibration to a mechanical energy.
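
With the post-2019 exact values of $h$ and $e$, both of these constants are fixed numbers rather than measured quantities (a quick sketch of the arithmetic):

```python
# Exact SI defining constants (2019 redefinition)
h = 6.62607015e-34   # Planck constant, J s
e = 1.602176634e-19  # elementary charge, C

K_J = 2 * e / h      # Josephson constant, Hz/V
R_K = h / e**2       # von Klitzing constant, ohms

print(f"K_J = {K_J:.6e} Hz/V")  # ~4.835978e+14 Hz/V, i.e. ~483597.8 GHz/V
print(f"R_K = {R_K:.3f} ohm")   # ~25812.807 ohm
```

One Josephson junction driven at 10 GHz thus steps in increments of roughly 20 µV, which is why practical voltage standards chain together thousands of junctions.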

The whole scheme only works because the Josephson and Hall effects are quantum-mechanical. The question of whether you can make an electrical energy measurement without advanced fabrication or superconductors, therefore, is a technological question. It used to be the case that a cesium atomic clock was also a technology which required a whole laboratory to run. But economic pressure for independent references has made cesium-clock technology much more accessible: a comment on your linked question claims you can get a clock-on-a-chip for a few thousand dollars.

A homebrew implementation of the kilogram “in the spirit of” the SI might start with an independent voltage reference and an independent current reference. I have read that, in the pre-solid-state era, the photometers in high-end cameras used low-current mercury batteries because, unlike alkaline batteries, the mercury cells have very little voltage sag until the battery is completely exhausted. A mercury voltage reference is in the same spirit as “all cesium atoms oscillate at the same frequency,” though more chemistry is involved. It also compares reasonably well to the Josephson voltage standard, though a mercury battery can’t be connected to the definition of the second.

A charge reference that might be accessible at the level of an undergraduate senior thesis might be a photoelectric effect device. Bright light of a known wavelength shines on some photocathode in vacuum. Between the cathode and the anode, you apply a voltage which you compare to your voltage reference. One sign of the voltage attracts all of the photoelectrons to the anode; the other sign of the voltage repels all but the most energetic electrons, allowing you in principle to measure the energy endpoint via curve-fitting. Varying the color of your incident light (measured by diffraction grating) moves the energy endpoint in a linear way, allowing you to back out the “work function” of your cathode.
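
The endpoint analysis described above amounts to a straight-line fit of stopping voltage against photon frequency, with slope $h/e$ and intercept $-W/e$. Here is a sketch with synthetic data (the 2.3 eV work function is illustrative, not a measured value):

```python
h = 6.62607015e-34   # Planck constant, J s
e = 1.602176634e-19  # elementary charge, C
W = 2.3 * e          # hypothetical photocathode work function, 2.3 eV

# Synthetic stopping voltages for a few light frequencies (Hz):
# e*V_stop = h*nu - W  =>  V_stop = (h/e)*nu - W/e
freqs = [5.5e14, 6.0e14, 6.5e14, 7.0e14]
v_stop = [(h * nu - W) / e for nu in freqs]

# Ordinary least-squares fit by hand (no external libraries)
n = len(freqs)
mean_x = sum(freqs) / n
mean_y = sum(v_stop) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(freqs, v_stop))
         / sum((x - mean_x) ** 2 for x in freqs))
intercept = mean_y - slope * mean_x

print(f"h/e from fit:  {slope:.6e} V s")   # recovers h/e ~ 4.1357e-15 V s
print(f"work function: {-intercept:.3f} eV")  # recovers 2.3 eV
```

With real data the frequencies would come from your diffraction-grating wavelength measurements, and the scatter of the fit would tell you how well the photocathode behaves.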

The photoelectric endpoint approach is fundamentally the same as your LED measurement. But the LED measurement, as I said at the start of this answer, only works if LEDs produced for different wavelengths, possibly by different manufacturers, have a consistent relationship between band gap, turn-on voltage, and probably temperature — my experience was different from this. A classic photoelectric experiment still has this chemical/manufacturing uncertainty (which the Hall effect does not), but the photoelectric uncertainty is confined to a single photocathode which you can characterize completely.

rob
  • "The BIPM’s recipe calls for a measurement of both the Josephson constant [...] and the Hall effect’s von Klitzing constant": Indeed no, because in the SI the Josephson constant and the von Klitzing constant are exactly defined. One uses those two effects and the two associated constants to realize accurate voltage and resistance standards. – Massimo Ortolano Apr 03 '22 at 18:00
  • And I know of no Cs clock so cheap. The linked clock in this answer is a Rubidium one, not a Cs one. – Massimo Ortolano Apr 03 '22 at 19:19
  • @MassimoOrtolano Correct on both counts! I am still adjusting to the language of the post-2019 SI. – rob Apr 03 '22 at 20:55
  • According to this Wikipedia page, you can get a cesium clock for a few thousand dollars. It's "the size of a deck of cards," rather than fitting on a chip, but that's probably okay for this answer. – rob Apr 03 '22 at 22:13
  • 1
    Note that that is not a primary clock and actually needs to be calibrated. If you look at the user manual, they specify that it has an initial calibration before shipment, and then it drifts. So, it cannot be considered suitable for the realization of units in the spirit of the question. – Massimo Ortolano Apr 03 '22 at 22:23
2

The answer is yes, you can do it without superconducting Josephson junctions. If you successfully pull off the LED experiment, then you need a calibrated resistance to get a calibrated current to run the Kibble balance. (Alternatively, you could use something like radiation pressure, but for that you need a power standard, and the way that's done these days is by comparing optical power to electrical power dissipated through, you guessed it, a calibrated resistor.)

The current way to calibrate a resistor uses the quantum Hall effect, a non-superconducting effect. So just use the QHE to calibrate your resistor and the LED experiment to calibrate your voltage, and now you have everything needed to run a watt balance. HOWEVER, this is probably rather unsatisfying, because the QHE experiment is just as involved as the Josephson junction experiment (it involves fabrication of devices and sensitive electrical measurements at cryogenic temperatures).
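
Once you do have calibrated electrical units, the balance equation itself is simple: in the weighing phase $mg = BLI$, and in the moving phase $U = BLv$, so the hard-to-characterize $BL$ factor cancels and $m = UI/(gv)$. A numerical sketch with illustrative (made-up) values:

```python
# Illustrative (made-up) Kibble balance numbers
g = 9.80665     # local gravitational acceleration, m/s^2
v = 2.0e-3      # coil velocity in the moving phase, m/s
U = 1.0         # voltage induced across the coil while moving, V  (U = B*L*v)
I = 0.0196133   # current that balances the weight, A              (m*g = B*L*I)

# B*L cancels between the two phases:  m = U*I / (g*v)
m = U * I / (g * v)
print(f"inferred mass: {m:.4f} kg")  # ~1 kg for these numbers
```

The whole metrological difficulty is hidden in knowing $U$, $I$, $g$, and $v$ to parts in $10^8$, not in the algebra.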

What you’re probably looking for is a low-tech primary resistor calibration, and I’m unaware of how to do that.

Gilbert