Early investigators of electricity tried to measure electric phenomena, including the strength of an electric current, in terms of length, mass and time, the fundamental units of earlier physics. In 1832, Gauss succeeded in measuring the strength of the Earth's magnetic field in terms of length, mass and time.¹ The first to succeed in measuring an induced electric current in such terms was W.E. Weber, in 1851, in which year he also recommended a complete system of electric units.²
In 1872 a committee on standards of electrical resistance of the British Association for the Advancement of Science made an influential recommendation advocating the use of the cgs system of units (reversing an earlier recommendation for mgs). The unit of resistance they defined, however, was made 10⁹ times larger than the cgs absolute unit of resistance. A unit of this size was needed because the units that fall out of the electrical equations in the cgs system are much too small for everyday use. The committee named their unit the “ohm.”
1. C. F. Gauss, Intensitas vis magneticae terrestris ad mensuram absolutam revocata [the intensity of the Earth's magnetic force reduced to absolute measure], read before the Royal Society of Sciences at Göttingen on 15 December 1832. Commentationes Societatis Regiae Scientiarum Gottingensis Recentiores, vol. VIII (1832-1837). Gottingae: Sumptibus Dieterichianis, 1841. Pages 1-44.
2. See Weber's On the Measurement of Electro-dynamic Forces (1848), in an English translation from Weber's time.
The First International Conference of Electricians (Paris, 1881) adopted the British Association definition of the ohm and added definitions for the volt, ampere, coulomb, and farad. Thus was born the “absolute practical system of electrical units”:
The ampere was a derived unit, defined as the current produced in a conductor with a 1-ohm resistance when there was a potential difference of 1 volt between its ends.
The Ampère is the unit of current, and is equal to the quantity of electricity which will pass during 1 second through a circuit having a resistance of 1 ohm when the electromotive force is 1 volt. This quantity of electricity was formerly spoken of as 1 farad per second. It more recently came to be called by British and American electricians a weber, and sometimes an œrsted, but the designation of ampère, having been established by the Paris Congress, may now be considered authoritative.
The actual value of the ampère has been carefully determined by the electrolytic method; that is, by ascertaining the weight of a given metal which it is capable of depositing from solution in a given time, the quantity of metal thus deposited being known to be proportional to the whole quantity of electricity which passes during such time. The results obtained by different investigators are now in close accordance, the conclusion being as follows:
The ampère is represented by that quantity of current which is capable of depositing 4.025 grammes (62.10 grains) of silver per hour or 0.001118 grammes per second.
“The Elementary Principles of Electrical Measurement,” The Electrician and Electrical Engineer, vol. 3, page 211 (October 1884).
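The per-hour and per-grain figures in the quotation above are simply unit conversions of the per-second figure. A quick check (assuming the modern grain of 0.064 798 91 grams; the small discrepancies against the quoted values are rounding):

```python
# Check the silver-deposition figures quoted above.
grams_per_second = 0.001118   # silver deposited per second by 1 ampere
grams_per_grain = 0.06479891  # modern definition of the grain (assumption)

grams_per_hour = grams_per_second * 3600
grains_per_hour = grams_per_hour / grams_per_grain

print(f"{grams_per_hour:.4f} g/hour")       # ~4.0248, quoted rounded to 4.025
print(f"{grains_per_hour:.2f} grains/hour") # ~62.11, quoted as 62.10
```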
The 1881 definitions had a serious deficiency as far as most workers in the field were concerned: they were not easily reproducible outside highly specialized laboratories. The Fourth International Conference of Electricians (Chicago, 1893) addressed the problem of producing definitions that were more “workable” with a new set of “international” units, corresponding as closely as possible to the absolute practical units, but more suited to replication in ordinary laboratories. For electric current, the international ampere was defined as that unvarying current that would deposit 0.001 118 000 grams of silver per second from a solution of silver nitrate in water. The amount of silver was chosen to make the international ampere equal to the absolute practical ampere within the limits of precision of the day.
The conference also defined an international volt and international ohm in similarly practical ways. Defining all three units experimentally, however, was a mistake. As standards laboratories in Germany, Britain and America made ever more precise measurements, they found that the definitions were inconsistent with one another; volts did not equal amperes × ohms.
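A toy calculation makes the inconsistency concrete. The deviation factors below are hypothetical, invented only to illustrate the effect: if each unit is realized by its own independent experiment, each realization is off from the ideal absolute value by its own small factor, and Ohm's law then fails to close exactly.

```python
# Hypothetical realization factors: how large 1 realized unit is,
# measured in the corresponding ideal absolute unit.
ohm_realized = 1.00020   # hypothetical
amp_realized = 0.99990   # hypothetical
volt_realized = 1.00030  # hypothetical

# Ohm's law demands volt = ampere * ohm. With independent realizations,
# the product of the ampere and ohm factors implies a *different* volt:
implied_volt = amp_realized * ohm_realized
discrepancy = volt_realized / implied_volt - 1

print(f"implied volt factor:     {implied_volt:.5f}")
print(f"relative inconsistency:  {discrepancy:.2e}")
```

With three units fixed independently, the three realization factors are overdetermined by the single constraint V = I × R, so some inconsistency was inevitable.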
An international conference in London in 1908 decided to leave the international ampere and the international ohm as they were, and make the international volt a derived unit, its value set by its relation to the other two.
With the development of more sophisticated laboratory techniques, measuring electrical quantities in terms of length, mass and time became much easier, and the need to define electrical units by such means as the amount of silver deposited decreased.
In 1948 the CGPM abandoned the international ampere and reverted to an absolute definition. The 1948 definition is identical to the one used today, except that they used “MKS unit of force” for what would today be called a newton. Investigation showed the new definition led to a new value for the ampere that was 1.000 15 times the mean international ampere as previously realized in the various national standards laboratories. For the ampere of the U. S. National Bureau of Standards, the factor was 1.000 165.
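Using the 1.000 15 factor from the text, a current recorded in mean international amperes converts to absolute amperes as sketched below (the function name is ours, not historical; the NBS factor 1.000 165 applies to U.S. measurements):

```python
# 1 absolute ampere = 1.00015 mean international amperes (1948 finding),
# so 1 international ampere = 1/1.00015 absolute amperes.
MEAN_INTL_FACTOR = 1.00015
NBS_FACTOR = 1.000165  # factor for the U.S. National Bureau of Standards ampere

def intl_to_absolute(i_intl, factor=MEAN_INTL_FACTOR):
    """Convert a current in international amperes to absolute amperes."""
    return i_intl / factor

print(intl_to_absolute(10.0))              # ~9.99850 absolute amperes
print(intl_to_absolute(10.0, NBS_FACTOR))  # ~9.99835 absolute amperes
```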
For an excellent history of the development of the electric and magnetic units up to 1913, see U. S. Bureau of Standards Circular 60.
Copyright © 2000-2014 Sizes, Inc. All rights reserved.
Last revised: 1 April 2014.