
According to the definition of electric current, it appears to be a derived quantity. Charge, on the other hand, seems more fundamental than electric current. Then why is current taken as a fundamental quantity instead of charge?

Is it an arbitrary choice? Is it because we can measure current more easily than charge, or is there some other reason?

Yashbhatt
  • Can you give a link suggesting that charge is a derived quantity? Normally we would take charge as fundamental and current is then given by $I = dQ/dt$. – John Rennie Jun 20 '14 at 14:46
  • The OP probably means why Ampere, the unit of current, is considered the basic unit in the SI system. – Luboš Motl Jun 20 '14 at 14:52
  • @JohnRennie I mean to say that what I have read is that the flow of charges produces current. That gives an impression that charge is more fundamental. – Yashbhatt Jun 20 '14 at 15:32
  • Yes, you're correct. However have you read Luboš' answer explaining why we use the ampere as the elementary SI unit? Is that what you were asking? – John Rennie Jun 20 '14 at 15:38
  • Yes. He mentions what I was asking. But what I get from his answer is just that we went with the convention. – Yashbhatt Jun 20 '14 at 17:05
  • I think the point is that it's easy to measure the force between two wires but hard to measure charge. So as a practical unit the ampere is more convenient, and we get the charge from $Q = \int I dt$. – John Rennie Jun 20 '14 at 17:42

1 Answer


I think that the question is why the SI system of units considers one ampere, the unit of current, to be the elementary one, rather than the unit of the electric charge.

Recall that one ampere is defined in SI as

"the constant current that will produce an attractive force of $2\times 10^{–7}$ newton per metre of length between two straight, parallel conductors of infinite length and negligible circular cross section placed one metre apart in a vacuum"

Note that this definition relies on magnetic forces; it is equivalent to fixing the vacuum permeability at $$\mu_0=4\pi\times 10^{-7}\ {\text{V s/(A m)}}.$$ It's the magnetic force that has a "simple numerical value" in the SI system of units, and magnetic forces don't exist between static electric charges, only between currents.
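One can check this numerically. The standard magnetostatic result for the force per unit length between two long parallel wires is $F/L = \mu_0 I_1 I_2/(2\pi d)$; a minimal Python sketch (the function name is my own) recovers the $2\times 10^{-7}$ N/m figure that appears in the ampere's definition:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, V s/(A m), pre-2019 exact SI value

def force_per_metre(i1, i2, d):
    """Force per unit length (N/m) between two long parallel wires
    carrying currents i1, i2 (A), separated by distance d (m)."""
    return MU0 * i1 * i2 / (2 * math.pi * d)

# Two 1 A currents, 1 m apart -- the configuration in the SI definition:
print(force_per_metre(1.0, 1.0, 1.0))  # ≈ 2e-07 N/m
```

The factors of $\pi$ cancel, which is exactly why this definition gives the ampere (and hence $\mu_0$) such a round value.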

If we tried to give a similar definition for the electric charge, using the electrostatic force, the numerical values would be very different.

Now, one may ask why the magnetic forces were chosen to have "simple values" in the SI system. It is a complete historical coincidence. The SI system was designed, up to the rationalized additions of $4\pi$ and different powers of ten, as the successor of CGSM, the magnetic variation of Gauss' centimeter-gram-second (CGS) system of units.

These days, both methods would be equally valid because we use units in which the speed of light in the vacuum is fixed to be a known constant, $299,792,458\,{\rm m/s}$, so both $\mu_0$ and $\epsilon_0=1/(\mu_0 c^2)$, the vacuum permittivity, are equal to known numerical constants, anyway.
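As a quick numerical check (a sketch; the variable names are mine): with $c$ fixed and $\mu_0$ at its pre-2019 SI value, $\epsilon_0 = 1/(\mu_0 c^2)$ indeed comes out to the familiar constant:

```python
import math

C = 299_792_458.0         # speed of light in vacuum, m/s (exact by definition)
MU0 = 4 * math.pi * 1e-7  # vacuum permeability, V s/(A m), pre-2019 exact SI value

eps0 = 1 / (MU0 * C**2)   # vacuum permittivity, F/m
print(eps0)               # ≈ 8.854e-12 F/m
```

So fixing $c$ and either one of $\mu_0$ or $\epsilon_0$ determines the other; neither choice is more fundamental.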

At any rate, the unit of electric charge is simply the coulomb, which is an ampere times a second, so it is defined just as accurately as the ampere.

Luboš Motl