
It has always annoyed me that the Ampere is an SI base unit, rather than the Coulomb. Why is this the case? Was current discovered first historically? I believe the standards were published in the 1960s; couldn't we measure charge accurately by then?

To me at least, a Coulomb is more 'fundamental' than an Ampere.

What's more, (in SI units) describing charge as the amount of electricity carried in 1 second by 1 Ampere of current does not seem very intuitive. I can't think of any other quantity that (in SI units) is a quantity accumulated over seconds; in most cases it is a quantity per second. If the Coulomb were an SI base unit then an Ampere would be 'the charge, in Coulombs, that passes per second', which is more logical to me, and would seem more measurable experimentally (like with a gold-leaf electroscope or Coulomb's electrometer).
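
To make that concrete, the two routes I have in mind would look like this (my own paraphrase of the unit relations, not any official wording):

$$1\,\mathrm{C} = 1\,\mathrm{A} \cdot 1\,\mathrm{s} \qquad \text{(the current SI route: charge defined from current)}$$

$$1\,\mathrm{A} = \frac{1\,\mathrm{C}}{1\,\mathrm{s}} \qquad \text{(the route I would find more natural: current as charge per second)}$$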

As I understand it, SI units were developed out of a necessity to standardise measurements internationally, and were chosen such that each SI unit is a 'base' measurement. Obviously, if the aim was simply to standardise, then it does not matter which units were chosen as long as consistency was achieved; but since 'base' units were chosen, why was charge not picked ahead of current?

If current was deemed more 'base' (forgive the poor phrasing), then why current over resistance, or voltage for that matter? It seems odd to define voltage and resistance in terms of current, rather than defining all three in terms of charge.
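
For comparison, here is how the Volt and the Ohm decompose with the Ampere as a base unit versus a hypothetical system with the Coulomb as a base unit (my own working, assuming only $V = E/Q$, energy per charge, and $R = V/I$):

$$1\,\mathrm{V} = 1\,\mathrm{kg\,m^2\,s^{-3}\,A^{-1}} = 1\,\mathrm{kg\,m^2\,s^{-2}\,C^{-1}}$$

$$1\,\Omega = 1\,\mathrm{kg\,m^2\,s^{-3}\,A^{-2}} = 1\,\mathrm{kg\,m^2\,s^{-1}\,C^{-2}}$$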

I also have a vague recollection of being told that current was chosen after it was deemed very implausible (through dimensional analysis) to express electrical quantities solely in terms of mass, time and distance ($kg, s, m$). Is this true?
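
My guess at what that dimensional argument looks like (my own reconstruction, assuming Coulomb's law in its Gaussian form with the constant set to 1, which is not something stated above):

$$F = \frac{q_1 q_2}{r^2} \;\Rightarrow\; [q]^2 = [F]\,[r]^2 = \left(\mathrm{kg\,m\,s^{-2}}\right)\mathrm{m^2} \;\Rightarrow\; [q] = \mathrm{kg^{1/2}\,m^{3/2}\,s^{-1}}$$

The fractional exponents would presumably be what made a purely mechanical set of base units look implausible.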
