Who Says!?


Isaac Asimov

By The Metric Maven

Isaac Asimov Edition

One Sunday, I was attending my weekly coffee klatch, when one of the participants asked: “Who besides you thinks millimeters should be used instead of centimeters?” I was rather surprised at the question, even though I’m the resident metric advocate. I blurted out “Well, Isaac Asimov does for one, so does the U.S. metric building code, the late Pat Naughtin did, and so did Herbert Arthur Klein in his book The Science of Measurement.” The person who asked the question has a solid scientific background, and what surprised me about the question was the appeal to authority. I have found it very puzzling that when I explain the situation, people do not seem to absorb its meaning, or don’t really think about the simplified symbolic expression.

First, let's start with authority. In his 1983 book The Measure of The Universe, Isaac Asimov has this to say about the centi prefix:

The prefix “centi” (SEN-tih), symbolized as “c,” represents a hundredth of a basic unit, from the Latin “centum” meaning “hundred.” A “centimetre,” therefore, is a hundredth of a metre. The prefix is not commonly used, except in “centimetre,” and its use is falling off even there.

Isaac Asimov has this to say about the milli prefix:

The prefix “milli” (MIL-ih), from the Latin “mille,” meaning “thousand,” is symbolized as “m,” just as “metre” is. A millimetre is therefore symbolized as “mm.” Increasingly “milli-” is replacing “centi-” and “deci-” in use. We are approaching the point where 1 centimetre will routinely be referred to as 10 millimetres, and 1 decimetre as 100 millimetres. This is even more true where these prefixes are used for any basic measure other than “metre.”

There it is, documented with sans serif typeface, Isaac Asimov asserting the utility of the milli prefix over the centi prefix. Asimov also had this to say in his essay “Read Out Your Good Book In Verse” in his 1984 book X Stands for Unknown:

Light wavelengths have traditionally been given in “Angstrom Units,” named in 1905 for the Swedish physicist Anders Jonas Ångström (1814-74), who first used them in 1868. An Angstrom unit is one ten-billionth of a meter, or 1 × 10⁻¹⁰ meters.

Nowadays, however, it is considered bad form to use Angstrom units because they disrupt the regularity of the metric system. It is considered preferable now to use different prefixes for every three orders of magnitude, with “nano” the accepted prefix for a billionth (10⁻⁹) of a unit.

In other words, a “nanometer” is 10⁻⁹ meters, so that one nanometer equals 10 Angstrom units. If a particular light wave has a wavelength of 5,000 Angstrom units, it also has a wavelength of 500 nanometers, and it is the latter that should be used.

Again Asimov asserts the importance of using metric prefixes with separations of three orders of magnitude.
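Asimov's arithmetic can be sketched in a few lines of Python (a minimal illustration of the quoted conversion; the function name is my own):

```python
# 1 angstrom = 1e-10 m and 1 nanometer = 1e-9 m,
# so 10 angstroms make one nanometer.
def angstroms_to_nanometers(angstroms):
    """Convert a wavelength in angstroms to nanometers."""
    return angstroms / 10.0

print(angstroms_to_nanometers(5000))  # Asimov's example: 5,000 A -> 500.0 nm
```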

Nine years earlier in 1974, Herbert Arthur Klein, in his book The Science of Measurement, wrote this about the metric prefixes:

Herbert Klein sees the atavistic magnifying prefixes deka, hecto, and myria as unnecessary and implies that separations by 1000 are best. He sees only milli as a reducing prefix in good standing, and again argues for reduction by factors of 1000.

Pat Naughtin spent considerable time exploring why millimeters worked so much better than centimeters when implemented in industry. When millimeters were used, the metric transition was quick and almost painless. The introduction of centimeters would delay metric adoption almost indefinitely. He wrote a long discussion of this in 2008 and pleaded for people to use millimeters.

Long-time readers know that once the issue was explained to me, and I had used millimeters and millimeter instruments in my own engineering work (sans centimeters), I became convinced that the centi prefix and centimeters are considerable intellectual barriers to metric adoption in the U.S.

After I understood the problem with centimeters, it seemed obvious to use millimeters, but as Isaac Asimov states in his 1971 book The Stars in Their Courses:

One of the pitfalls to communication lies in that little phrase “It’s obvious!” What is obvious to A, alas, is by no means obvious to B and is downright ridiculous to C.

I’m going to do my best to return to my unexamined world view and try to explain the epiphany that struck me at Mach III+. Below is an image from a newspaper film box, probably from the 1970s. The film size is in inches, and it is converted to metric in centimeters.

The film size is 45.7 x 58.4 centimeters. The number of symbols used is four for each linear dimension. In the everyday world, a measurement with only the precision of a centimeter is generally too coarse to be of any practical use. The odds that one will measure to an even centimeter are rather low, and so almost all common measures in our world require an unnecessary decimal point and a value for a tenth of a centimeter.

But a tenth of a centimeter is a millimeter. This implies that everyday measurement is generally useful only to a millimeter value. When 45.7 cm and 58.4 cm are written in millimeters, only three symbols are required to express the very same value of length: 457 mm x 584 mm. The mind does not need to stop and perceive the location of a decimal point and parse the decimal number. The number of symbols used is reduced from four to three.
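The symbol-count comparison above can be demonstrated with a short Python sketch (my own illustration; the helper names are hypothetical):

```python
# Compare how many characters it takes to write the same length
# in centimeters (which needs a decimal point) versus millimeters.
def cm_string(mm_value):
    """Render a length given in whole millimeters as a centimeter string."""
    return f"{mm_value / 10:.1f}"   # e.g. 457 mm -> "45.7"

def mm_string(mm_value):
    """Render the same length as a plain millimeter string."""
    return str(mm_value)            # e.g. 457 mm -> "457"

for mm in (457, 584):
    cm_s, mm_s = cm_string(mm), mm_string(mm)
    print(f"{cm_s} cm ({len(cm_s)} symbols) vs {mm_s} mm ({len(mm_s)} symbols)")
```

Every whole-millimeter value below one meter is a three-digit integer, while the equivalent centimeter value drags along a decimal point and a fourth symbol.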

The objection often offered is that one only has to move the decimal point to change from millimeters to centimeters! Pat Naughtin pointed out that people who work in construction are often less familiar with manipulating numbers than scientifically trained professionals. Asking them to slither a decimal point along in any calculations they might do will only introduce an opportunity for error. In the case of centimeters, the error can be very large because of the unit size chosen.

But, indeed, the proof of the pudding is in the eating, and so it is with the millimeter and the metric system. Pat Naughtin has an extensive discussion (50 pages) about millimeters versus centimeters. His original observation was an empirical one: industries that used the millimeter had quick and smooth metric transitions; those that chose the centimeter are still in turmoil to this day. Why is this the case? It was analysis after-the-fact that offered clues.

Naughtin makes this observation:

Talking or arguing with people who have not done any measuring with the metric system is quite pointless. But as soon as they experience the simplicity of the metric system for themselves they will then convince themselves that it is the better …

Sven had lobbied for the use of millimeters, but it was only when I had all-millimeter rulers and instruments that I realized their utility and adopted millimeters exclusively.

I continue to encounter people from “metric countries” who, with an air of sanctimony, say “I’ve never had a problem with centimeters. I use them all the time.” They don’t seem to realize I could just as easily say I’ve used inches (feet, yards, rods, miles) here in the U.S. and never had a problem. Or I could state “I can use Roman numerals, and have for years,” with the implication that your mind is obviously too small and dim to handle them, not that they are in fact awkward. It took about 1000 years for people to realize there was a problem with Roman numerals; they never saw a problem because they were immersed in them. These denizens of “metric countries” have an antique metric system usage, contemporary with Þe Olde English, and they are fine with the retention of familiarity over simplicity. There is no examination or self-reflection, just a thoughtless assertion. References are offered, reasons explained, and the response appears to be reactionary truthiness, rather than thoughtful introspection.

Certainly Isaac Asimov has demonstrated that he is trustworthy, but I’m sure he would also indicate that a person should never take his word alone. It is always best to understand an idea directly. The question should not be who says?! but why.

Related essays:

Building a Metric Shed

Metamorphosis and Millimeters

10 thoughts on “Who Says!?”

  1. Can you or Mr. Klein quote (or better, give sources for) those “recent decisions by international bodies supervising the metric and SI systems” regarding deka-, hecto-, and myria-? I agree the BIPM dropped myria- in 1960, but the others (along with centi- and deci-) remain in the BIPM’s SI Brochure, based on decisions by its governing committees, CIPM, and CGPM. I am not aware of other international bodies specifically tasked with supervising the SI.

    I am NOT advocating the centimeter generally replace the millimeter in all applications, but, frankly, it has its uses and is explicitly allowed and even is used within the text of the SI Brochure (2008 Edition). Frankly the Metric Maven blog has turned into the “Rant Against the Centimeter” blog.

    Someone properly educated in the metric system should be capable of understanding BOTH centimeters and millimeters. If “Metric for Dummies” bans the millimeter, Americans will be lost dealing with a metric world that uses the centimeter daily (for some applications, millimeters in others).

  2. I think the case for preferring millimeters over centimeters is a strong one, for the reasons discussed here and elsewhere. Intelligent people can disagree. While centimeters will still be commonly used for the foreseeable future, please continue to advocate for millimeters as a “best practice”.

  3. Although I for the most part concur with the Maven, there is still the issue of false precision that could occur with the millimeter. (For example, when measuring a person’s height in millimeters.)

    Also, sometimes using 1 instead of 100 is a matter of convenience. For example, for many years, blood-glucose levels were given as “mg /100 mL” but such is practically nonexistent now, replaced by the more concise “mg / dL”.

    [Now back to the Alabama-Clemson game…]

    • BTW, as I sit here watching this Alabama-Clemson game, Alabama just scored off a remarkable 95-yard kickoff return, with the player just barely making it into the endzone for the touchdown as a Clemson player just missed knocking him out of bounds.
      One of the announcers followed by describing it by saying something like “he made it in [to the endzone] by a centimeter”. Would have “by a millimeter” been better for him to say??

      • On belated second thought, the announcer could have said “he made it in by millimeters!”
        [BTW, Maven, shouldn’t the title be “Who Says?!”?]

        • Don’t know if Wikipedia is an “authority” but it accepts either order in fonts that do not offer a glyph for the interrobang, U+203d.

My thinking is the question [?] comes “first” and then the “excitement” [!]; in other words, it should be thought of as inside out. (The Maven may be thinking differently, but I just don’t see it his way…)

            • Just a little while ago, in an incredible last-two-play ending of a game, namely Green Bay-Arizona, Arizona needed a first down on a third-down play late in the game but missed it, as the announcer put it, “by millimeters”.
Good to hear the Maven’s millimeter may be the measurement of choice in football, and not the ACWM’s beloved “fraction of an inch”…

              • A millimeter is smaller than an inch. I’d probably say that myself.

Comments are closed.