On April 13, 1668, the metric system was born. It had been devised by the English scholar John Wilkins (1614-1672), and the system was elegantly defined. One would take a length of string with a mass attached, and adjust that length until the pendulum had a period of two seconds (one second in each direction). With that standard length in hand, you would divide it by ten and use the result to make a cube, which became the liter. Fill the cube with rain water and the mass is a kilogram.
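The seconds-pendulum definition can be checked with a quick calculation. For a simple pendulum the period is T = 2π√(L/g), so the defining length is L = g(T/2π)². A minimal sketch, using approximate values of g (the exact figures vary slightly by source):

```python
import math

def seconds_pendulum_length(g):
    """Length of a pendulum with a two-second period: L = g * (T / 2*pi)**2."""
    T = 2.0  # seconds: one second in each direction
    return g * (T / (2 * math.pi)) ** 2

# Standard gravity gives a length just under a meter.
print(seconds_pendulum_length(9.80665))  # approximately 0.9936 m

# But g varies with latitude, so the "universal" length does too:
print(seconds_pendulum_length(9.780))  # near the equator, about 0.991 m
print(seconds_pendulum_length(9.832))  # near the poles,   about 0.996 m
```

The few millimeters of difference between equator and poles is exactly the non-universality problem described below.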
This system was to be a “Universal Measure” that everyone could use. There was just one perceived problem: the period of a pendulum depends on its latitude, so a seconds pendulum would not be universal. The alternative, measuring a distance on the surface of the earth, was itself very questionable. James Clerk Maxwell (1831-1879), in his A Treatise on Electricity and Magnetism of 1873, summed up the situation with the meter:
In…countries which have adopted the metric system, …[the base unit] is the metre. The metre is theoretically the ten millionth part of the length of a meridian of the earth measured from the pole to the equator; but practically it is the length of a standard preserved in Paris, which was constructed by Borda to correspond, when at the temperature of melting ice, with the value of the preceding length as measured by Delambre. The metre has not been altered to correspond with new and more accurate measurements of the earth, but the arc of the meridian is estimated in terms of the original metre.
One can sense that Maxwell is satirizing the idea of a measurement unit based on the earth, exposing the “earth based” meter as essentially a defined artifact, which is not exactly “universal.”
Maxwell had his own viewpoint of how a universal standard of length might be created:
In the present state of science the most universal standard of length which we could assume would be the wavelength in vacuum of a particular kind of light, emitted by some widely diffused substance such as sodium, which has well-defined lines in its spectrum. Such a standard would be independent of any changes in the dimensions of the earth, and should be adopted by those who expect their writings to be more permanent than that body.
Yes, it’s clear: Maxwell was not particularly keen on the 1873 definition of the meter. Indeed, light did seem to be the best option for a standard.
“He was also one of America’s most important metrologists. He made precision measurements, and improved techniques for making them. His work helped remove American metrology from under the British shadow and put American metrology on its feet.”
Charles Sanders Peirce was introduced to the spectroscope, a device that separates light into its constituent frequencies, by Joseph Winlock of the Harvard Observatory. Spectroscopy was allowing scientists to identify the chemical elements that make up stars. The element helium was first identified as a yellow spectral line seen during a solar eclipse in 1868, prior to its identification on earth. I suspect they thought it would probably be a metal, given the -ium suffix. With the help of his father, Charles became head of the Office of Weights and Measures in 1872. Peirce traveled to Paris in 1876 and brought back brass meter standard number 49, which would be used for the calibration of American standards.
The idea of using light for a standard had been contemplated for some time, but there was a potential problem. Light is a wave, and waves travel through a medium (water for water waves, air for sound waves). It was thought that light traveled through a medium as well, which they called the aether. It was believed that the wavelength of light would be altered by the earth’s motion through the aether, both its rotation and its solar orbit. This would be like the problem of a seconds pendulum having a different period depending on its latitude. Peirce was aware of this and is quoted by Crease on page 195:
[T]here may be a variation in wave-lengths if the aether of space, through which the solar system is traveling, has different degrees of density. But as yet we are not informed of such variation.
In 1887 the Michelson–Morley experiment failed to detect the aether. This caused a considerable scientific brouhaha, but the aether was not dead yet; it was too powerful an idea. In the end, after repeated experiments failed to detect it, the aether was abandoned, and light could be relied upon to be a universal standard for the definition of the meter.
One can create light from a known element by placing its gas inside an evacuated tube and exciting the gas with electricity. We all know that when the gas is neon we call it a neon light, or neon tube. Peirce chose sodium for his tube. He attempted to calibrate the distance between the machined lines on a diffraction grating back to his number 49 meter standard using the sodium light. Unfortunately, the lines on the diffraction grating had imperfections that made them a bit fuzzy, which limited the resolution. The spacing of the lines also changed with temperature, further decreasing the accuracy, and the thermometer he used to monitor the temperature introduced error of its own. Peirce published his results in 1879. He had tied the meter to a wavelength of light by way of the lines on a diffraction grating. He was the first to do this, but it was still not the method described by Maxwell, which involved counting wavelengths of light.
Albert Michelson read Peirce’s publication and realized that the interferometer he and Morley had developed to detect the aether could be used for the precise measurement of wavelengths that Peirce was pursuing with diffraction gratings. An interferometer splits a single beam of light in two and later recombines the beams; any difference in their paths produces a pattern of light and dark interference fringes. A screw attached to one of the mirrors can be used to move it and count the light and dark oscillations as they pass. Michelson and Morley published this work in 1888. The first sentence of the paper is: “The first actual attempt to make the wave length of sodium light a standard of length was made by Peirce.” The inaccuracies of his method are described and the advantages of an interferometer are discussed.
They determined that it would take the counting of 400,000 wavelengths to obtain a decimeter (100 mm). Michelson and Morley suggest in their paper:
Probably there would be considerable difficulty in actually counting 400,000 wave lengths, but this can be avoided by first counting the wave lengths and fractions in a length of one millimeter and using this to step off a centimeter. This will give the nearest whole number of wave-lengths, and the fractions may be observed directly. The centimeter is then used in the same way to step off a decimeter, which again determines the nearest whole number, the fraction being observed directly as before.
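The stepping scheme above is just repeated multiplication by ten, with a fresh fringe count fixing the whole number at each stage. A rough sketch of the arithmetic, assuming sodium light at roughly 589.3 nm (in a Michelson interferometer the optical path changes by twice the mirror's displacement; the exact counts depend on the wavelength and on how the fringes are reckoned, so they differ somewhat from the paper's round 400,000 figure):

```python
SODIUM_WAVELENGTH = 589.3e-9  # meters; approximate mean of the sodium doublet

def fringe_count(mirror_travel_m, wavelength_m=SODIUM_WAVELENGTH):
    """Fringes seen as the mirror moves: the optical path changes by 2*d."""
    return 2 * mirror_travel_m / wavelength_m

# Step off a millimeter, then a centimeter, then a decimeter:
for length in (1e-3, 1e-2, 1e-1):
    print(f"{length * 1e3:g} mm -> about {fringe_count(length):,.0f} fringes")
```

The point of the stepping is that only the first count, a few thousand fringes in a millimeter, must be made directly; each subsequent factor of ten needs only the nearest whole number plus a directly observed fraction.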
In 1892 Michelson went to Paris to relate his and Morley’s interferometer work. Unfortunately, Michelson discovered that his sodium light did not produce a single spectral line but was actually a composite of two closely spaced lines. This caused enough fuzziness to preclude measurements as precise as he needed. Michelson tried both mercury and cadmium and settled on the latter.
In the 1906 book Outlines of The Evolution of Weights and Measures and The Metric System, the authors, William Hallock and Herbert Wade, state (pg 265) that Michelson used “three different kinds of light, viz. the red, green, and blue of the cadmium spectrum,” and “determined the wave-length of each or the number of times this wave-length was contained in the standard meter,” tabulating the wavelengths for each color. They continue:
The accuracy of this work is almost incredible, as the variation in measurements was only about one part in ten million. …[H]ere is an absolute measurement which gives the length of a standard in terms of a natural unit, under conditions reproducible at any time. This, of course, gives a permanent check on the integrity of the meter, as in the event of the international prototype being damaged or destroyed…
It was decided that a method of tying the natural phenomenon of light to the meter should be pursued. Charles Fabry and Alfred Perot made improvements to Michelson and Morley’s interferometer and were able to obtain a precision near that of the artifact standard. Improvements to the interferometer continued.
A survey of candidate elements was undertaken to find the best one to use for a new standard for the meter. This uncovered the fact that the various isotopes of an element emit light at slightly different wavelengths, which caused blurred lines. The search was on for elements that were heavy and had few isotopes. This work continued throughout the 1920s and 1930s. World War II delayed progress, but by the 1950s enough improvements had been made to schedule a redefinition of the meter for 1960. By international agreement the meter was defined as 1,650,763.73 wavelengths of the orange-red spectral line emitted by the krypton-86 isotope. The meter was now a length available to all countries, without respect to an artifact or geography.
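The 1960 definition makes the wavelength arithmetic explicit: fixing the meter at exactly 1,650,763.73 krypton-86 wavelengths also fixes the wavelength itself, at a little under 606 nm:

```python
# The 1960 definition: one meter is exactly this many krypton-86 wavelengths.
KR86_WAVELENGTHS_PER_METER = 1_650_763.73

# Inverting the definition gives the wavelength of the orange-red line.
wavelength_m = 1 / KR86_WAVELENGTHS_PER_METER
print(f"{wavelength_m * 1e9:.4f} nm")  # about 605.7802 nm
```

Any laboratory with a krypton-86 lamp and an interferometer could now realize the meter, which is exactly the universality Maxwell asked for.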
Despite the fact that Peirce, Michelson, and Morley—all American scientists—were instrumental in achieving the dream of a universal meter available to all, America did not convert to the metric system or metric lengths. Even though the lengths used in the US, the inch, foot, yard, and mile, are all defined in terms of the meter, America rejects a system of length first defined by an Englishman and then made universal by Americans. I find great irony in the fact that most Americans believe German Chocolate Cake is of Germanic origin. This is not the case; it was created by Sam German—an American—in the 19th century. It is almost as ironic as Americans refusing to adopt The French Meter.
If you liked this essay and wish to support the work of The Metric Maven, please visit his Patreon Page.