The Americans Who Defined The Meter

By The Metric Maven

On April 13, 1668 the metric system was born. It had been devised by the English scholar John Wilkins (1614-1672). The system was elegantly defined: take a length of string with a mass attached, and adjust that length until the pendulum has a period of two seconds (one second in each direction). With that standard length in hand, divide it by ten and use the result as the edge of a cube, which became the liter. Fill the cube with rain water, and its mass is the kilogram.
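Wilkins's chain of definitions can be sketched with modern constants. This is an illustration rather than his procedure: the value of g below is a modern standard gravity he did not have, and the small-angle pendulum formula is assumed.

```python
import math

# A sketch of Wilkins's chain: seconds pendulum -> length -> cube -> mass.
# Assumption: standard gravity g = 9.80665 m/s^2, a modern convention.

g = 9.80665                      # m/s^2
T = 2.0                          # seconds pendulum: full period of 2 s

# Small-angle pendulum: T = 2*pi*sqrt(L/g)  =>  L = g * (T / (2*pi))**2
length_m = g * (T / (2 * math.pi)) ** 2          # ~0.994 m

side_m = length_m / 10                           # divide the length by ten
volume_liters = side_m ** 3 * 1000               # the cube, ~0.98 L
mass_kg = side_m ** 3 * 1000                     # rain water at ~1000 kg/m^3

print(f"pendulum length: {length_m:.4f} m")
print(f"cube volume:     {volume_liters:.4f} L")
print(f"water mass:      {mass_kg:.4f} kg")
```

It is a pleasing coincidence of physics that the seconds pendulum comes out within a centimeter of the modern meter.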

This system was to be a “Universal Measure” that everyone could use. There was just one perceived problem: the period of a pendulum depends on its latitude, so a seconds pendulum would not be universal. The alternative, measuring a distance on the surface of the earth, was, however, also very questionable. James Clerk Maxwell (1831-1879), in his A Treatise on Electricity and Magnetism of 1873, sums up the situation with the meter:
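The latitude objection is easy to make concrete. Using approximate modern values of g at the equator and the poles (assumptions for illustration), the seconds-pendulum length shifts by about five millimeters:

```python
import math

# Why a seconds pendulum is not universal: g varies with latitude.
# Assumed approximate values: g ~ 9.780 m/s^2 at the equator,
# g ~ 9.832 m/s^2 at the poles.

def seconds_pendulum_length(g):
    # With T = 2 s:  L = g * (T / (2*pi))**2 = g / pi**2
    return g / math.pi ** 2

L_equator = seconds_pendulum_length(9.780)   # ~0.9909 m
L_pole = seconds_pendulum_length(9.832)      # ~0.9962 m

print(f"spread between equator and pole: {(L_pole - L_equator) * 1000:.1f} mm")
```

A five-millimeter disagreement between national standards was far too large for a system billed as universal.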

In…countries which have adopted the metric system, …[the base unit] is the metre. The metre is theoretically the ten millionth part of the length of a meridian of the earth measured from the pole to the equator; but practically it is the length of a standard preserved in Paris, which was constructed by Borda to correspond, when at the temperature of melting ice, with the value of the preceding length as measured by Delambre. The metre has not been altered to correspond with new and more accurate measurements of the earth, but the arc of the meridian is estimated in terms of the original metre.

One can sense that Maxwell is satirizing the idea of a measurement unit based on the earth, exposing the “earth based” meter as essentially a defined artifact which is not exactly “universal.”

Maxwell had his own viewpoint of how a universal standard of length might be created:

In the present state of science the most universal standard of length which we could assume would be the wavelength in vacuum of a particular kind of light, emitted by some widely diffused substance such as sodium, which has well-defined lines in its spectrum. Such a standard would be independent of any changes in the dimensions of the earth, and should be adopted by those who expect their writings to be more permanent than that body.

Yes, it’s clear: Maxwell was not particularly keen on the 1873 definition of the meter. Indeed, light did seem to be the best option for a standard.

Charles Peirce

This is when the eccentric and abrasive American-born Charles Sanders Peirce (1839-1914) enters the story. Robert P. Crease, author of World In The Balance states:

“He was also one of America’s most important metrologists. He made precision measurements, and improved techniques for making them. His work helped remove American metrology from under the British shadow and put American metrology on its feet.”

Charles was introduced to a spectroscope, a device that separates light into its constituent frequencies, by Joseph Winlock of the Harvard Observatory. Spectroscopy was allowing scientists to identify the chemical elements that make up stars. Helium was first identified as a yellow spectral line seen during a solar eclipse in 1868, prior to its identification on earth. I suspect they thought it would probably be a metal, given the -ium suffix. With the help of his father, Charles became head of the Office of Weights and Measures in 1872. Peirce traveled to Paris in 1876 and brought back brass meter standard number 49, which would be used for the calibration of American standards.

The idea of using light for a standard had been contemplated for some time, but there was a potential problem. Light is a wave, and waves travel through a medium (water for water waves, air for sound waves). It was thought that light likewise traveled through a medium, which they called the aether. It was believed that the wavelength of light would be altered by the earth’s rotation in the aether and by its solar orbit. This would be like the problem of a seconds pendulum having a different period depending on its latitude. Peirce was aware of this and is quoted by Crease on page 195:

[T]here may be a variation in wave-lengths if the aether of space, through which the solar system is traveling, has different degrees of density. But as yet we are not informed of such variation.

In 1887 the Michelson–Morley experiment failed to detect the aether. This caused a considerable scientific brouhaha, but the aether was not dead yet. It was too powerful an idea. In the end, after repeated experiments failed to detect the aether, it was decided it must not exist, and light could be relied upon to be a universal standard for the definition of a meter.

One can create light from a known element by placing its gas inside an evacuated tube. The tube can then be excited with electricity. We all know that when the gas is neon we call it a neon light, or neon tube. Peirce chose sodium for his tube. Using the sodium light, Peirce attempted to calibrate the distance between the machined lines on a diffraction grating back to his number 49 meter standard. Unfortunately, imperfections made the grating’s lines a bit fuzzy, which limited the resolution. The distance between the lines would also change with temperature, further decreasing the accuracy, and the thermometer he used to monitor the temperature introduced error of its own. Peirce published his results in 1879. He had tied the meter to a wavelength of light by way of the lines on the diffraction grating. He was the first to do this, but it was still not the method described by Maxwell, which involved counting wavelengths of light.
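The relationship a grating calibration rests on is the grating equation, d·sin θ = m·λ: measuring the diffraction angle of light of a known wavelength determines the line spacing d. The numbers below are illustrative assumptions, not Peirce’s data:

```python
import math

# The grating equation d*sin(theta) = m*lambda ties a grating's ruled
# line spacing d to a wavelength of light. Illustrative assumptions:
# sodium D light near 589.3 nm and a 600 lines/mm grating.

wavelength = 589.3e-9            # m, sodium D (average of the doublet)
d_true = 1e-3 / 600              # m, spacing of a 600 lines/mm grating
m = 1                            # first diffraction order

theta = math.asin(m * wavelength / d_true)       # angle one would measure
d_inferred = m * wavelength / math.sin(theta)    # spacing recovered from it

print(f"diffraction angle: {math.degrees(theta):.2f} degrees")
print(f"inferred spacing:  {d_inferred * 1e6:.4f} micrometers")
```

Any fuzziness in the ruled lines, or thermal drift in d, feeds directly into the inferred spacing, which is exactly the accuracy limit Peirce ran into.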

Illustration of Interferometer from Michelson and Morley's Scientific American Paper

Albert Michelson read Peirce’s publication and realized that the interferometer he and Morley had developed to detect the aether could be used for the precise measurement of wavelengths which Peirce was pursuing with diffraction gratings. An interferometer splits a single beam of light in two and later recombines the beams; as the path difference between them changes, a series of light and dark interference fringes passes by. A screw attached to one mirror moves it, allowing the light and dark oscillations to be counted. Michelson and Morley published this work in 1888. The first sentence of the paper is: “The first actual attempt to make the wave length of sodium light a standard of length was made by Peirce.” The inaccuracies of his method are described and the advantages of an interferometer are discussed.
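The counting principle can be sketched simply: moving one mirror by half a wavelength shifts the pattern by one full fringe, so the mirror’s travel equals the fringe count times λ/2. The wavelength and travel distance below are assumptions for illustration, not figures from the 1888 paper:

```python
# In a Michelson interferometer, moving one mirror by half a wavelength
# shifts the interference pattern by one full fringe. Illustrative
# sketch with sodium light and an assumed 1 mm of mirror travel.

wavelength = 589.3e-9                 # m, sodium light
mirror_travel = 1e-3                  # m, move the mirror 1 mm

fringes = mirror_travel / (wavelength / 2)
print(f"fringes counted over 1 mm of travel: {fringes:.1f}")
```

About 3,400 fringes per millimeter gives a sense of why counting all the way to a decimeter by brute force was unappealing.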

They determined that it would take counting 400,000 wavelengths to obtain a decimeter (100 mm). Michelson and Morley suggest in their paper:

Probably there would be considerable difficulty in actually counting 400,000 wave lengths, but this can be avoided by first counting the wave lengths and fractions in a length of one millimeter and using this to step off a centimeter. This will give the nearest whole number of wave-lengths, and the fractions may be observed directly. The centimeter is then used in the same way to step off a decimeter, which again determines the nearest whole number, the fraction being observed directly as before.
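The quoted step-off procedure can be sketched in code. At each decade only the fractional fringe is observed directly; the whole number is fixed by ten times the previous stage, since that estimate is accurate to well under half a fringe. All the counts below are invented for illustration:

```python
# Sketch of the step-off logic Michelson and Morley describe. At each
# stage only the fractional fringe is observed directly; the whole
# number is the nearest integer consistent with ten times the previous
# stage's count. The numbers are hypothetical, for illustration only.

def step_off(previous_count, observed_fraction):
    """Snap 10x the previous count to the nearest whole number that
    agrees with the directly observed fractional fringe."""
    estimate = 10 * previous_count
    whole = round(estimate - observed_fraction)
    return whole + observed_fraction

count_1mm = 3393.9                    # counted directly (hypothetical)
count_1cm = step_off(count_1mm, 0.2)  # only the 0.2 is observed directly
count_1dm = step_off(count_1cm, 0.7)  # likewise at the decimeter

print(count_1cm, count_1dm)
```

The trick is that one only ever has to count fringes directly over a millimeter; every larger length inherits its whole number from the stage before.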

In 1892 Michelson went to Paris to present the interferometer work he and Morley had done. Unfortunately, Michelson discovered that his sodium light did not produce a single frequency line but was actually a composite of two lines. This caused enough fuzziness to prevent measurements as precise as he needed. Michelson tried both mercury and cadmium and settled on the latter.

In the 1906 book Outlines of The Evolution of Weights and Measures and The Metric System, the authors, William Hallock and Herbert Wade, state (pg 265) that Michelson used “three different kinds of light, viz. the red, green, and blue of the cadmium spectrum, he determined the wave-length of each or the number of times this wave-length was contained in the standard meter. The wave-lengths for each color were as follows:”
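The table of wavelengths itself does not appear above. As one well-documented anchor, the 7th CGPM in 1927 fixed the red cadmium line at 1 meter = 1,553,164.13 wavelengths (in dry air under standard conditions), from which the wavelength follows:

```python
# The red cadmium line per the 1927 CGPM specification:
# 1 meter = 1,553,164.13 wavelengths in dry air under standard
# conditions. (This later fixed value is an anchor for the reader;
# it is not the table Hallock and Wade printed.)

waves_per_meter = 1_553_164.13
wavelength_nm = 1e9 / waves_per_meter
print(f"red cadmium line: {wavelength_nm:.3f} nm")
```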

Hallock and Wade can hardly control their enthusiasm and excitement at this technical breakthrough:

The accuracy of this work is almost incredible, as the variation in measurements was only about one part in ten million. …here is an absolute measurement which gives the length of a standard in terms of a natural unit, under conditions reproducible at any time. This, of course, gives a permanent check on the integrity of the meter, as in the event of the international prototype being damaged or destroyed…

The participants decided that a method of tying the meter to the natural phenomenon of light should be pursued. Charles Fabry and Alfred Perot made improvements to Michelson and Morley’s interferometer and were able to obtain a precision near that of the artifact standard. Improvements to the interferometer continued.

A survey of candidate elements was undertaken to find the best one to use for a new standard for the meter. This uncovered the fact that the various isotopes of an element emit light at slightly different wavelengths, which caused blurred lines. The search was on for elements that were heavy and had few isotopes. This work continued throughout the 1920s and 1930s. World War II delayed progress, but by the 1950s enough improvements had been made to schedule a redefinition of the meter for 1960. By international agreement the meter was defined in terms of the wavelength of light emitted by the krypton-86 isotope. The meter was now a length available to all countries without respect to an artifact or geography.
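The 1960 definition fixed the meter at 1,650,763.73 wavelengths in vacuum of the orange-red line of krypton-86, which in turn pins down that line’s wavelength:

```python
# The 1960 definition: 1 meter = 1,650,763.73 vacuum wavelengths of
# the orange-red emission line of krypton-86.

waves_per_meter = 1_650_763.73
wavelength_nm = 1e9 / waves_per_meter
print(f"Kr-86 orange-red line: {wavelength_nm:.2f} nm")
```

Any laboratory with a krypton-86 lamp and an interferometer could now realize the meter without a trip to Paris.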

Despite the fact that Peirce, Michelson, and Morley—all American scientists—were instrumental in achieving the dream of a universal meter available to all, America did not convert to the metric system or metric lengths. Even though the lengths used in the US (the inch, foot, yard, and mile) are all defined in terms of the meter, America rejects a system of length first defined by an Englishman, and then made universal by Americans. I find it greatly ironic that most Americans believe German Chocolate Cake is of German origin. This is not the case. It was created by Sam German, an American, in the 19th century. It is almost as ironic as Americans refusing to adopt The French Meter.
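The claim that US customary lengths are defined by the meter can be made exact: since the 1959 international yard and pound agreement, the inch is exactly 25.4 mm, and the larger units follow from it:

```python
# US customary lengths, defined exactly in terms of the meter since
# the 1959 international yard and pound agreement.

INCH_M = 0.0254                 # exact by definition
FOOT_M = 12 * INCH_M            # 0.3048 m exactly
YARD_M = 3 * FOOT_M             # 0.9144 m exactly
MILE_M = 5280 * FOOT_M          # 1609.344 m exactly

print(f"1 yard = {YARD_M} m")
print(f"1 mile = {MILE_M} m")
```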

3 thoughts on “The Americans Who Defined The Meter”

  1. The origins of the metric system actually predate John Wilkins. They go back to pre-Norman England, where a decimal system actually existed and the base unit was the wand (the word continues to this day in folklore as the magic wand). The wand was approximately 1 m.

    The Normans and French brought Babylonian/Roman units to England. Possibly John Wilkins was aware of his country’s pre-history, and his system was an attempt to resurrect the pre-Norman units.

    http://genforum.genealogy.com/wand/messages/52.html
    http://www.experiencefestival.com/a/Wand_-_Metrology/id/601470


    The wand is also a pre-Norman unit of length used in the British Isles equal to approximately the modern metre, apparently dating from an early use as a yardstick (originally as a generic term). The ‘wand’ survived for a time under the Normans. Then when the yard was established, the wand came to be known as the ‘yard and the hand’, and then disappeared, either slowly or by being banned by law.

    The old English unit of 1007 millimetres was called a ‘wand’, and although the ‘yard’ was created to replace the wand the wand was still used for some centuries because of its convenience as part of an old English decimal system that included:

    1 digit (base of long finger) about 20 millimetres
    10 digits = 1 small span (span of thumb and forefinger) 200 millimetres
    10 small spans = 1 armstretch (1 fathom from finger tip to finger tip) about 2 metres
    10 fathoms = 1 chain about 20 metres
    10 chains = 1 furlong about 200 metres
    10 furlongs = 1 thus-hund of about 2000 metres

    The wand that has survived today as part of folklore may in fact be a rendition of the ancient British length unit. Thus a true wand would be a metre in length and not 30 cm.

    http://answers.yahoo.com/question/index?qid=20060927125649AAFoFsY

    In Wicca and Ceremonial magic, practitioners use wands for the channeling of energy—they serve a similar purpose to the athame although the two have their distinct uses. While an athame is generally used to command, a wand is seen as more gentle and is used to invite or encourage. Though traditionally made of wood, they can also consist of metal or crystal. Practitioners usually prune a branch from an Oak, Hazel, or other tree, or may even buy wood from a hardware store, and then carve it and add decorations to personalize it; however, one can also purchase ready-made wands. In Wicca the wand usually represents the element fire, or sometimes air. Ceremonial magicians may have several wands for different purposes, such as the Fire Wand and the Lotus Wand in the Hermetic Order of the Golden Dawn. In Zoroastrianism, there is a similar ritual implement called a barsom.

    There is some scholarly opinion that the magic wand may have its roots in the drumstick of a shaman, especially in Central Asia and Siberia, used to bang on his drum or to point while performing religious, healing, and magical ceremonies.
    Source(s):
    http://en.wikipedia.org/wiki/Magic_wand

    • You seem to have missed the larger issue rather magnificently. Whether John Wilkins was aware of the measures you list, it is clear he was unconcerned with them: his goal was to relate length, volume, mass, and time without reference to any artifact or archaic unit, at least as far as the science of his century permitted. There was a problem with his approach, but it’s astounding that in the 17th century, a guy was working toward a system of lab-reproducible standards independent of artifacts. This is still the goal today: the sole remaining artifact in SI is the kilogram, and the kilogram has become a serious problem.

      Wilkins was a Church of England clergyman, and a scientific dilettante in an age when nearly all scientists were, to some extent, dilettantes. Why would he try to honor pre-Norman (ultimately pre-Christian) measures? That some archaic measure might correspond roughly to the length of a seconds pendulum is not an impressive coincidence. Wilkins (1) tried to define a measurement system independent of artifacts, and (2) the system he came up with is the first identifiable ancestor of metric today. Not a bad résumé.

      One last thing apparently needs to be said: it is really bad “netiquette” to post an essay in a comment thread.

      • I didn’t miss any point. I was pointing out the similarity between Wilkins’s system and the system of the pre-Norman British. The two systems, as far as length measures go, are identical except for the names. Neither used prefixes, but that doesn’t matter.

        The information that I found on the pre-Norman units doesn’t include mass and volume units, so one cannot assume either way whether volume and mass were derived in a manner similar to the present metric system.

        The fact that the ten-millionth part of the quarter circumference of the earth, the seconds pendulum, and the pre-Norman wand all pretty much equal the modern metre is more than just coincidence.

        My whole point is that the metric system as we know it goes back even further in history. It can also be used to show that metric is just as natural as Luddites claim imperial is.
