Pardon The Decimal Dust

by The Metric Maven

Early in my time as an undergraduate at a Midwestern university, I discovered that having “too many decimal places” on my homework was a terrible and ignorant trespass on rationality and technical competence. I was told that the decimal values in my calculation that were only a few places past the decimal marker were meaningless. My TA, as a graduate student, had the “experience” to understand this problem and the importance of properly rounding numbers. What I’ve generally settled on these days is to use three places past the decimal point, so that I can easily change a metric number to an integer expression should I want to change the chosen metric prefix. I also have enough years behind me to realize that whatever number of places my TA chose was not the product of an error analysis; it was the product of personal preference.

I’ve heard critics of “too many decimal places” call values they believe are insignificant “decimal dust.” There is no clear definition of this rubric. It has been defined as:

WORD SPY

DECIMAL DUST: An inconsequential numerical amount.

What I’ve come to realize is that a sort of meaningless, bifurcated “intellectual” tug-of-war occurs between those who are concerned about “the excessive use of decimals” and those who see “too much precision.” Historians have noticed that Newton would compute answers out to an unnecessarily large number of decimal places. Their conclusion is that “he just liked doing the calculation.” This is very probably true. Quite possibly the best engineer I’ve ever shared an office with was Michel. He was able to take the abstract equations of electromagnetism and turn them into useful computer code. We were, of course, banned from presenting any results that interfaced with the Aerospace world in metric, but inside every computer were, to my knowledge, metric computations hidden away in ones and zeros, so as not to offend delicate Olde English sensibilities.

I was fascinated with understanding the details of how Michel implemented his computer code and verified it. What I noticed immediately was that Michel would hand compute each line of code and compare it with the computer code’s output at each line. I was surprised that he was carrying the hand calculation out to perhaps ten decimal places! I had always harbored a secret desire to compute out to that many places. When I did, it gave me some strange reassurance that my code was right, despite the admonishment I’d received at my university against creating and propagating meaningless “garbage numbers.”

One day I could clearly see that something was bothering Michel, and I just had to know what it was, as what he worked on was usually very interesting. He had some computer code, which he had not written, that had been used in-house for some time. He was told to use it to predict the outcome of a measurement. Michel had derived a formula that should have been equivalent to what was implemented in the computer code, but about four or five decimal places out, the values were different. Michel showed me the hand calculation (he had checked it three times) and the computer output. In his French accent he said, “They should be the same, should they not?” I agreed. We checked the values of physical constants, such as the speed of light. They were all the same. Finally Michel saw the problem, and it was in the code. At certain extremes it would introduce a considerable error into the computation. That was the day I began to always check my hand computations against my computer code to at least five or six decimal places. I would learn that one man’s decimal dust is another man’s gold dust.
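I don’t have Michel’s code to show, but a toy sketch in Python can illustrate the kind of trap he found: two algebraically equivalent formulas that agree to many decimal places over most of their range, yet diverge badly “at certain extremes.” The functions and values below are hypothetical, chosen only to demonstrate the checking habit, not to reproduce the original problem.

```python
import math

def naive(x):
    # Algebraically this is 1 - cos(x), computed the obvious way.
    # For tiny x it subtracts two nearly equal numbers and loses digits.
    return 1.0 - math.cos(x)

def careful(x):
    # The equivalent half-angle form 2*sin(x/2)**2 avoids the
    # cancellation, so it stays accurate for tiny x.
    return 2.0 * math.sin(x / 2.0) ** 2

# Compare the two "equivalent" formulas to ten decimal places,
# the way a hand calculation might be checked against code output.
for x in (1.0, 0.5, 1e-8):
    a, b = naive(x), careful(x)
    ok = math.isclose(a, b, rel_tol=1e-10)
    print(f"x = {x:<8g}  naive = {a:.10e}  careful = {b:.10e}  match: {ok}")
```

The habit, rather than these particular formulas, is the point: carry the comparison well past the places you are told are dust, and the dust occasionally turns out to be gold.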

Indeed, decimal dust can be a source of new scientific knowledge. In the 1960s, Edward Lorenz (1917-2008) noticed a very interesting output from a non-linear mathematical computer model he was using. He wanted to repeat the computation, and as a short-cut he input the initial conditions by hand. Lorenz rounded the original input value of 0.506127 to 0.506, discarding a number of decimal places expected to be insignificant. When he ran it again, after a short period the output was nothing like the previous computation. Changing the input value at the level of “decimal dust” was expected to have no effect on the computation; clearly, for the mostly non-linear world we live in, this is not the case. It was Lorenz who coined the now ubiquitous term “the butterfly effect” for sensitivity to initial conditions, and helped usher the science of chaos theory into what it is today. The tiny pressure changes caused by a butterfly flapping its wings in Africa have the potential to be the seed for a hurricane in the Atlantic Ocean. There are cases where non-linear deterministic equations need an infinite number of decimal places for a computation to repeat over all time.
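Lorenz’s 1961 run used a twelve-variable weather model, but his later three-variable system shows the same sensitivity and fits in a few lines. The sketch below (a simple fixed-step Runge-Kutta integration, with step size and run length chosen only for illustration) starts two trajectories whose initial conditions differ in the sixth decimal place, mirroring the 0.506127 versus 0.506 rounding, and watches them drift apart.

```python
# Minimal sketch of sensitivity to initial conditions using the
# three-variable Lorenz system (sigma=10, rho=28, beta=8/3).
def lorenz(x, y, z, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return sigma * (y - x), x * (rho - z) - y, x * y - beta * z

def rk4_step(state, dt):
    # Classic fourth-order Runge-Kutta step for the Lorenz equations.
    def shift(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = lorenz(*state)
    k2 = lorenz(*shift(state, k1, dt / 2))
    k3 = lorenz(*shift(state, k2, dt / 2))
    k4 = lorenz(*shift(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

dt, steps = 0.01, 5000
a = (0.506127, 1.0, 1.0)   # "full precision" initial condition
b = (0.506, 1.0, 1.0)      # the same value rounded to three decimal places

for n in range(steps + 1):
    if n % 1000 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {n * dt:5.1f}  separation = {gap:.6f}")
    a, b = rk4_step(a, dt), rk4_step(b, dt)
```

The separation starts at roughly 0.000127 and should grow, within a few tens of model time units, until the two trajectories bear no resemblance to each other; the rounded digits were anything but dust.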

In the early 1980s, British scientists using ground-based measurements reported that a large ozone hole had appeared above Antarctica. This was quite surprising, as satellite data had not noted the same problem. The computer code for the satellite had “data quality control algorithms” that rejected such values as “unreasonable.” Assumptions about which values are important, and which are not, are assumptions, and should be understood as such. Another example is “filtered viruses.” It was assumed viruses had to be smaller than a certain dimension, so all microbes above that size were removed with filter paper. It took decades for researchers to realize that monster-sized viruses exist. I’ve written about this in my essay Don’t Assume What You Don’t Know.

The a priori assumption of what is important is used as a rhetorical cudgel to suppress “excessive” information. When I’ve argued that human height should be represented in meters or millimeters (preferably millimeters), there is a vast outcry that only the traditional atavistic pseudo-inch known as the centimeter should be used. To use millimeters is, harrumph!, “too much precision.” It is also a possible lost opportunity for researchers, as information has been suppressed by the introduction of a capricious assumption. One can always round the offending values down, but obtaining better precision after the fact is not an option. In my view, those who use the term decimal dust in a manner other than as a metaphor for tiny are lazy in their criticisms, and assume they know how many decimal places matter without any familiarity with the subject and the values involved.

When long and thoughtful effort is expended, one can introduce the astonishing simplification of using integers, which eschew decimal points entirely. As has been pointed out ad nauseam in this blog, housing construction done in millimeters is a measurement environment partitioned in a way that allows for this incredible simplification. Pat Naughtin noted that integer values in grams should be used for weighing babies. This produces an intuitive understanding of the amount of weight that a baby gains or loses, compared with kilograms. Grams, millimeters and milliliters are efficient for everyday use. Integer values are the most instinctual numerals for comparison tables. The metric system is beautiful in its ability to provide the most intuitive ways of expressing the values of nature. It is up to us to use it wisely and thoughtfully, instead of dogmatically. In my view, this measurement introspection is sorely lacking in our modern society, and definitely in the community of science writers and researchers.
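To make the prefix-shifting habit concrete, here is a minimal Python sketch; the helper name, prefix table, and example values are my own invention, not anything standardized. Keeping three places past the decimal point means a value can always be rewritten as an integer one prefix step (a factor of 1000) down.

```python
# Hypothetical helper: rewrite a metric value as an integer in the
# next smaller unit, assuming at most three decimal places were kept.
SMALLER_UNIT = {"kg": "g", "m": "mm", "L": "mL"}

def as_integer(value, unit):
    """Example: 3.256 kg -> (3256, 'g'); 1.753 m -> (1753, 'mm')."""
    return round(value * 1000), SMALLER_UNIT[unit]

print(as_integer(3.256, "kg"))  # a baby's mass: (3256, 'g')
print(as_integer(1.753, "m"))   # a person's height: (1753, 'mm')
```

A gain from 3256 g to 3310 g reads at a glance; 3.256 kg to 3.310 kg asks the reader to do the decimal bookkeeping.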


If you liked this essay and wish to support the work of The Metric Maven, please visit his Patreon Page and contribute. Also purchase his books about the metric system:

The first book is titled Our Crumbling Invisible Infrastructure. It is a succinct set of essays that explain why the absence of the metric system in the US is detrimental to our personal health and our economy. These essays are separately available for free on my website, but the book has them all in one place in print. The book may be purchased from Amazon here.


The second book is titled The Dimensions of the Cosmos. It takes the metric prefixes from yotta to yocto and uses each metric prefix to describe a metric world. The book has a considerable number of color images to complement the prose. It has been receiving good reviews. I think it would be a great reference for US science teachers. It also has many scientific factoids and anecdotes that I believe would be of considerable educational use. It is available from Amazon here.


The third book is called Death By A Thousand Cuts, A Secret History of the Metric System in The United States. This monograph explains how we have been unable to legally deal with weights and measures in the United States from George Washington to the current day. This book is also available on Amazon here.