By The Metric Maven
Unless you are very nerdy, and patient, you may be excused from reading this essay. It is also highly speculative.
One day I was reading a long article about measurement units and the metric system. The author mentioned in passing that John von Neumann (1903-1957) wanted to add a unit for information to the metric system. I found the idea interesting, and theoretically very important, but moved on and can no longer find the reference. I suspect that von Neumann wanted to make the binary digit, a one or a zero, a basic unit of information and include it as part of SI. Information theory, in my early experience, was something developed by the engineer Claude Shannon (1916-2001) for transmitting writing, voices, and other such messages over a communication channel. It was a theory that was mostly of interest to engineers and had little to do with fundamental science, or so I thought. Shannon’s work was originally called communication theory rather than information theory.
Rudy Rucker in his exceptional 1988 book Mind Tools states:
Relative to information we are in a condition something like the condition of seventeenth-century scientists regarding energy. We know there is an important concept here, a concept with many manifestations, but we do not yet know how to talk about it in exactly the right way. (pages 26-27)
Rucker presents information theory as being like a game of twenty questions. If I ask a series of yes-or-no questions, you will answer yes (1) or no (0) twenty times. Each answer carries one piece of information whose value is a zero or a one. After twenty questions, the person being questioned has provided me with twenty bits of information. The bit is the measure of information in this context. If one person is thinking of a famous person, and the questioner wants to discover the name of that person, they start with questions like: Is the person alive? Is the person male? Does their name start with a letter between A and M? If yes, halve the range again: between A and G?
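The arithmetic behind the game is simple: each yes/no answer is one bit, so n questions can distinguish between 2^n equally likely possibilities, and pinning down one item among n candidates needs about log2(n) questions. A minimal sketch (the function names are mine, purely for illustration):

```python
import math

# Each yes/no answer is one bit, so n questions distinguish 2**n possibilities.
def distinguishable(n_questions: int) -> int:
    return 2 ** n_questions

# Conversely, the number of bits needed to identify one item among
# n equally likely candidates is ceil(log2(n)).
def bits_needed(n_candidates: int) -> int:
    return math.ceil(math.log2(n_candidates))

print(distinguishable(20))  # 1048576 -- twenty questions cover a million names
print(bits_needed(26))      # 5 -- five questions suffice for one letter A-Z
```

This is why the halving strategy (A-M? then A-G?) is the right one: each question that splits the remaining candidates in half extracts a full bit of information.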
There is a lot of information in language. Shannon estimated that common written English has an information content of about 9.14 bits per word. Messages built from less common words have higher information content; messages of common words have less. There are problems with this analogy: perhaps the person thinking of a famous person speaks German and I only speak English, or is a creature from an alien world with no pool of famous humans to draw upon.
How does one even define what a zero or a one is in an absolute sense? Existence or non-existence? The concept of one or zero — yes or no — seems built into our universe, part of its fabric. The device that you are reading this essay on uses nothing but groups of ones and zeros to produce the letters I’m typing. It uses them to guess search phrases in a search engine, often with an almost supernaturally accurate suggestion for your desired phrase before you have it completely typed out.
The first scientific battle over information theory that I recall became known as the Black Hole Information Paradox. The argument is over whether information is destroyed when matter is sucked into a black hole. Stephen Hawking argued that the information is destroyed. This led to a “Black Hole War” with other physicists who did not think information would be destroyed. This discussion seemed to be very abstract, confined to one corner of physics, and of little immediate scientific or engineering importance, so I didn’t think it was that important at the time.
A contemporary scientific controversy and mystery is that current physics cannot explain why galaxies rotate much faster than predicted. This knowledge vacuum led to the postulation of dark matter as an explanation. About 80% of the universe’s mass would have to be dark matter to explain the observations. Dark matter has proven elusive, and so far no direct measurement has confirmed its existence.
In 2009 Erik Verlinde (1962- ) produced a theory called entropic gravity, and he has since (2016) argued that gravity exists because of a difference in the concentration of information in the empty space between two masses and their surroundings. Verlinde claims his theory eliminates the need for dark matter as an explanation for the galaxy-rotation discrepancy. To do this he turned to information theory. I mention this theory only as an example of how prevalent the ideas of information theory have become in physics. Still, at that point information theory had no visceral scientific importance to me.
This changed after my friend Dr. Sunshine suggested, very strongly, that I read What Darwin Got Wrong by Jerry Fodor and Massimo Piattelli-Palmarini. For someone like myself it was a fascinating read. It is very difficult to summarize this work, but to give one an idea of where they are going in their critique, the authors point to the Rock-pigeon example Darwin used in his work to illustrate “artificial selection.” Pigeon fanciers can trace all the different versions of pigeons they have bred to a single original pair. The variety of forms they take is very surprising. Here is a graphic from the book The Death of Adam by John C. Greene:
The differences are remarkable, and all were determined by the desires of the pigeon breeders. The philosophical problem is that an intelligence (human) was used to “select for” desired traits, and then that intelligence was removed from the explanation, with unintelligent and random “nature” substituted by Darwin as the guiding force. This would seem to indicate that nature must have some “intelligence” which decides what path the changes in pigeons will take in the natural world. One cannot simply decant intelligence from an example that relies on human intelligence as a guiding influence and then substitute the random effects of nature in its place.
The other, much more measurable problem confronting modern evolutionary theory is its speed. What is being observed in the field is that evolution occurs at a fantastic rate when compared with Darwinian expectations. The excellent book The Beak of the Finch shows that natural evolution occurs so fast one can measure it with a set of calipers over a few decades. A unit, called the darwin, was introduced to define an evolutionary rate of change. The belief at the time was that evolutionary change is painfully slow, and so the darwin was meant to be a very large base unit:
Rates in the living world would have to be measured in millidarwins. In artificial selection, he said, you could get rates of thousands of darwins, but that is not something you would see in the wild…
Now it is possible to translate the evolution of Darwin’s finches in the drought and the flood into Haldane’s whimsically named unit, the darwin. In the drought the change was 25,000 darwins. After the flood the change was about 6,000 darwins. (page 110)
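The darwin has a simple definition: one darwin is a change in a measured trait by a factor of e per million years. A sketch of the conversion, using hypothetical beak-depth numbers of my own invention (not the actual finch data from the book):

```python
import math

def darwins(x_start: float, x_end: float, years: float) -> float:
    """Evolutionary rate in darwins: a change in a trait by a factor
    of e per million years counts as one darwin (Haldane's definition)."""
    return math.log(x_end / x_start) / (years / 1_000_000)

# A hypothetical beak-depth change of about 5% over two years
# already lands in the tens of thousands of darwins:
print(round(darwins(9.5, 10.0, 2)))
```

The formula makes clear why field measurements shocked everyone: because the denominator is in millions of years, even a modest change observed over a couple of seasons converts to an enormous rate.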
The book What Darwin Got Wrong does not offer an explanation for these two philosophical problems. The arguments in the critique seem sound, so what is the explanation? At this point I must ask my readers to entertain my speculative musing on these problems and my thoughts on a possible solution.
My thoughts on how to deal with this philosophical problem start with James Clerk Maxwell. In the 19th century he posed a thought experiment with a rectangular box containing ordinary air (or any gas). The box and the gas inside begin at a uniform temperature. Maxwell then introduced a partition down the center of the box with a small “trap door” that could be opened or closed at will by a small demon. The gases on both sides start at the same temperature, but the demon watches the molecules and, as fast-moving ones arrive at the door, lets them through to the side designated to become warmer. It opens the door for slow-moving molecules headed toward the side chosen to become cooler. This activity slowly increases the temperature of the gas on one side of the box (where the fast molecules are sorted) and decreases it on the other (where the slow ones are sorted). The demon does this without introducing any energy into the system. This is a strange paradox, and, if realizable, it would violate the second law of thermodynamics, which forbids the entropy of an isolated system from decreasing without a compensating increase in disorder elsewhere. I recall this paradox from my introductory physics courses, and thought about it occasionally. Then I ran across a novel explanation that could resolve it.
The explanation goes like this: while the demon itself introduces no work (energy) into the system, it does make a decision for each molecule. It must decide which particles are moving fast and which slowly, and open the trap door appropriately. Each decision is a yes-or-no decision, and a finite amount of energy is required for it to take place. Changing the demon’s decision bit from one to zero (yes to no), or vice versa, requires a minimum amount of energy. The idea that a bit might have an equivalent, and possibly fundamental, amount of energy associated with it gobsmacked me. It seemed as surprising as Einstein’s equivalence of energy and matter, E = mc². Could there be a mathematical relationship between information and energy? If there is, it would tie the joule to the bit, and one might wonder if the bit should be a part of the metric system. Indeed, there is a relationship. This notion, known as Landauer’s principle, determines the lower limit of energy needed to erase a bit. This minimum energy, often called the Landauer limit, is about 2.75 zeptojoules (2.75 zJ) near room temperature. An equivalent Landauer mass would be about 30.5 × 10^-36 grams, or 0.0000000000305 yoctograms. That is seriously On Beyond Yotta.
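The Landauer limit is temperature dependent: the minimum energy to erase one bit is E = k_B T ln 2, where k_B is the Boltzmann constant and T is the absolute temperature. A small sketch reproducing the numbers above (at 300 K the limit comes out slightly higher, about 2.9 zJ; the essay's 2.75 zJ corresponds to a somewhat cooler room, around 287 K):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
C = 299_792_458.0    # speed of light, m/s (exact)

def landauer_energy(temp_kelvin: float) -> float:
    """Minimum energy to erase one bit: E = k_B * T * ln 2."""
    return K_B * temp_kelvin * math.log(2)

energy = landauer_energy(300.0)   # near room temperature
mass = energy / C**2              # mass equivalent via E = mc^2

print(f"{energy:.3e} J")   # about 2.9e-21 J, i.e. ~2.9 zJ at 300 K
print(f"{mass:.3e} kg")    # about 3.2e-38 kg
```

Dividing the energy by c² gives the tiny “Landauer mass,” which is how one arrives at a figure of tens of trillionths of a yoctogram.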
How does this help resolve the philosophical problems raised with evolutionary theory? First, it has been known for some time that the coding of DNA conforms with Shannon’s information theory. It literally stores information, and I suspect it interacts with the reservoir of information that exists within the fabric of nature. My guess is that an information transfer and interaction takes place between the information environment that surrounds a life-form and the information it stores within its DNA. Interacting life-forms are involved in an “information struggle” in which the stored information from previous generations of each organism reacts to the others’ attempts at appropriating its stored energy.
The interaction between the energy stored in an organism, and its appropriation by another biological entity interested in using that stored energy for its own needs, would be metaphorically similar to the ideas put forward by Arthur Tansley (1871-1955). Tansley argued that an ecosystem consisted of interacting “energy tubes” which form a large networked “communication” system of some type involving the human mind. Tansley’s work is described by historian Peder Anker in the second episode of Adam Curtis’ three-part documentary series All Watched Over By Machines of Loving Grace:
Peder Anker: Tansley’s idea of the mind was that of a network. Energy going through tubes into a new sort of explosion, and then a new explosion. What would create this explosion would be sense perception. So these energy tubes would go out of the modern mechanism of the mind, creating a network, a system within the mind. Now this he would transfer, one-to-one almost, into his description of the natural environment, in which energy between species, and among the species, would constitute a system, an ecosystem, of energy flowing between these different species. So the grasshopper eating the grass would then be energy transforming through the tube into the tube where the beetle would do his or her job.
Adam Curtis: It’s a very mechanical idea.
Peder Anker: It’s very mechanical indeed.
Rather than operate to produce a non-existent “balance of nature,” the information-energy environment would operate to change itself as rapidly as possible, and produce a chaotic mathematical situation, rather than a stable one. Chaotic systems can react faster than those with negative feedback.
The other key is the postulate that the universe has a set of limitations that were “baked into the cake” at the moment of the Big Bang. What I mean is that the universe is like a set of building blocks with only a limited number of combinations, not an infinite number. “Anything goes” is limited by a Platonic set of restrictions fixed at the time of the Big Bang. Not all combinations are possible or meaningful; the possibilities are finite, very, very large, but not infinite.
This set of baked-in restrictions came into high relief when I was reading the essay “Hemoglobin and the Universe” by Isaac Asimov in his collection Only a Trillion. In the 1950s the chemicals that make up hemoglobin were known; the problem was their arrangement. The number of possible combinations for hemoglobin that Asimov presented in his essay is about 4 × 10^619. Asimov writes the number out in full as a metaphorical way of realizing how astonishingly large this number is:
Asimov points out that, as of his writing, the estimated number of human hemoglobin molecules on Earth was only about 10^31. Even when adding the hemoglobin contribution from all other life on Earth, we only arrive at 10^41. With some assumptions, it turns out that using all the material in the universe could only produce an estimated 10^77 hemoglobin molecules. Asimov then discusses the limitations of trying to construct a computer the size of the known universe to try all of the combinations. His estimate is that the best computer he could come up with, running for 300 billion years, would only test about 10^179 combinations. This number is still essentially zero when compared with 4 × 10^619. This large-number problem is a general one that exists for all proteins. How was hemoglobin developed by random selection unguided by information, let alone its alternative hemocyanin, which is copper-based and used by horseshoe crabs? The numbers involved are staggering.
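The scale of the shortfall is easier to grasp with a little arithmetic on the exponents themselves (ordinary floating-point numbers overflow long before 10^619, so we work with the orders of magnitude directly). This sketch also previews the twenty-questions point made below: a question-asking strategy needs only a few thousand yes/no decisions to span the same space that blind search cannot:

```python
import math

search_space_exp = 619   # Asimov's ~4 x 10^619 possible hemoglobin arrangements
trials_exp = 179         # his universe-sized computer: ~10^179 trials

# Blind trial-and-error covers 10^179 out of ~10^619 cases,
# leaving a shortfall of ~10^440, effectively zero coverage.
shortfall = search_space_exp - trials_exp
print(f"Blind search falls short by a factor of ~10^{shortfall}")

# A bisecting, twenty-questions-style strategy would need only about
# log2(10^619) = 619 * log2(10) yes/no decisions to single out one case.
questions = math.ceil(search_space_exp * math.log2(10))
print(questions)  # about two thousand decisions, not 10^619 trials
```

The contrast is the whole point: exponential search spaces are hopeless for enumeration but tame for strategies that halve the candidates with each decision.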
How could we ever figure out the combination that makes up an important molecule like insulin, which is much less complex but still has about 10^100 different arrangements? As Asimov states: “The fact is that straight trial-and-error technique would have been an unbearable trial and a colossal error.” His next essay, “Victory on Paper,” discusses the development of paper chromatography. This technique allowed scientists to solve a problem that seemed insurmountable: determining the make-up of the insulin molecule. How was this done?
A very simple explanation is that information was used, in a game like twenty questions. Determining which combinations can be ignored and which are promising reduces the candidates and offers a solution in a finite amount of time. The combinations that could make up insulin number about 3 × 10^100. When the arrangement for insulin was finally found, it was discovered that even minor changes to the molecule would destroy its effectiveness.
This restricted structure of the universe constrains what type of life is possible, and information theory, via DNA interaction, sorts the possibilities out in response to its information environment. That is my cracker barrel assessment of the direction in which evolutionary theory might be heading.
The final “bit of information” is that one definition of life is that it reverses entropy. How does it do this? Well, if it uses information in a manner like that of Maxwell’s demon, it can locally reverse entropy, but as a trade-off this requires an input of energy.
My guess is that information is stored using a currently unknown mechanism which allows generations of organisms to produce counterfactual information that is useful in changing an organism’s phenotype (i.e. the physical or biochemical makeup of an organism) in as optimal a manner as possible in response to its external information environment. Rather than slow Darwinian change, this local biological information pool, driven by the requirement that it maximally obtain energy from its external environment, uses information-based phenotype-selection algorithms that are much faster and produce more variation than Darwin expected, and that change multiple aspects of an organism concurrently. Of course, competing biological information pools will also react to the reaction.
I have no idea if any of these thoughts have any true scientific import, or whether they are just “sound and fury, signifying nothing”; but one way or another, the fundamental units of the metric system may someday find themselves directly coupled to the bit, and information theory will contribute a fundamental unit that, in concert with the other SI units, will fundamentally describe our universe.
If you liked this essay and wish to support the work of The Metric Maven, please visit his Patreon Page