The Expanding Universe

Telescope

By The Metric Maven

Ever since I learned it, I’ve made good use of the Whole Number rule in my technical work. But over the last eight years, I’ve found that now and then I need to present data with a large dynamic range, meaning the numbers involved run from a very small value to a very large value and span many metric prefix ranges. I’ve spent a lot of time trying to find a way to represent such values as intuitively as possible, and until recently every option struck me as unsatisfactory.

Not long ago I was reading an essay by Isaac Asimov entitled The Figure of the Farthest. It discusses the increase in the estimated size of the known universe from antiquity to 1973, when the essay was written.

Asimov starts with Hecataeus of Miletus (c. 550 BC – c. 476 BC), who wrote a survey of the known world. The world he had traveled extended about 5000 miles across and was thought to be a flat disk.

Hecataeus of Miletus Reconstructed World Map — Wikimedia Commons

Greek philosophers came to the realization around 350 BC that the Earth is probably a sphere. Eratosthenes of Cyrene (c. 276 BC – c. 195/194 BC) was the first to devise a method to measure the dimensions of the spherical Earth. He realized that sunlight strikes separate locations on a curved Earth at different angles. Using the distance between two points and the measured difference in angle, he estimated the Earth to have a diameter of 8000 miles.
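
For readers who want to see the arithmetic, here is a small sketch using the traditional classroom figures for Eratosthenes’ method (a shadow-angle difference of about 7.2 degrees between Syene and Alexandria, which lie roughly 500 miles apart). These particular numbers are the standard textbook illustration, not values taken from Asimov’s essay.

```python
import math

# Traditional classroom figures (illustrative assumptions, not from Asimov's essay):
# the noon shadow angles at Syene and Alexandria differ by about 7.2 degrees,
# and the two cities are roughly 500 miles apart along a north-south line.
angle_difference_deg = 7.2
separation_miles = 500

# The angle difference is the same fraction of a full circle (360 degrees)
# that the separation is of the Earth's circumference.
circumference_miles = separation_miles * (360.0 / angle_difference_deg)
diameter_miles = circumference_miles / math.pi

print(f"Estimated circumference: {circumference_miles:,.0f} miles")  # 25,000 miles
print(f"Estimated diameter:      {diameter_miles:,.0f} miles")       # about 8,000 miles
```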

The Greek astronomer Hipparchus of Nicaea (c. 190 BC – c. 120 BC) used trigonometric methods to compute the separation between the Earth and the Moon. His estimate was equal to 30 times the diameter of the Earth. Using Eratosthenes’ estimate for the Earth’s diameter, this distance works out to 240,000 miles.

Eighteen centuries passed before more refined astronomical estimates of distances were computed. The invention of the telescope in 1608 allowed astronomers to measure celestial values with much greater precision and accuracy. In 1671 French astronomer Giovanni Domenico Cassini (1625–1712) was the first to measure the parallax of Mars with reasonable accuracy. The model of the solar system provided by Johannes Kepler (1571–1630) was used to determine the distance to numerous astronomical bodies. The farthest distance measured was to Saturn, at 1,800,000,000 miles.

Edmund Halley (1656–1742) would compute the orbit of his eponymous comet to have a maximum distance from the Sun of 6,000,000,000 miles.

Measuring the distances to the stars proved much more difficult. A race was on to measure stellar parallax. A succession of star distances was finally measured. The distance to the star Vega was determined in 1840 by Friedrich Wilhelm von Struve (1783–1864). It proved to be the farthest of these, at 54 light-years.

Countless stars remained without measurable parallax values.  Clearly the Universe was much larger than the distance to Vega. William Herschel (1738–1822) surveyed the number of stars in different directions and realized that it varied. He suggested they formed a flattened lens-shaped distribution that we now call our Galaxy.  It would not be until 1906 that Dutch astronomer Jacobus Cornelis Kapteyn (1851–1922) was able to use photographic techniques to estimate the dimensions of this stellar distribution. The largest dimension of the Galaxy was computed to be 55,000 light-years.

Harlow Shapley (1885–1972) would use a variable star called a Cepheid to determine the extent of our galaxy with increased accuracy. It was now thought to be about 100,000 light-years across.  Shapley further demonstrated that the Magellanic Clouds are outside of our Galaxy. The farthest extent of our Universe increased to 330,000 light-years by 1920.

The Andromeda Nebula became a source of scientific controversy. Was it inside of our Galaxy or outside?  In 1923 Edwin Powell Hubble (1889–1953) showed the Andromeda Nebula is indeed a Galaxy outside of our own, and the furthest extent of the known Universe expanded to 5,400,000 light-years.

Other fuzzy glowing patches could be seen, which suggested the Universe was much, much larger than the distance to Andromeda. By 1940, the maximum distance measured for the size of the Universe was around 400,000,000 light-years.

This value seemed to be a measurement limit and no further progress would be made. Then objects much brighter than galaxies were discovered. They were named quasars, and are now thought to be powered by black holes swallowing nearby matter. The information they provided ballooned the maximum extent of the known Universe to 2,000,000,000 light-years. In the year Asimov wrote his essay, 1973, the farthest known quasar increased the size of the Universe to about 24,000,000,000 light-years.

In his essay, Asimov does not provide a table of these values. This is unusual, as Dr. Asimov had no reluctance to present numerous tables in his other essays and full-length books. Each succeeding estimate of the Universe’s size is separated by the explanations I’ve summarized above. I decided to create a table of Dr. Asimov’s data to use for illustration:

Table-1

The values for the known Universe in miles become larger and larger, and then Dr. Asimov shifts them to light-years. In my view this creates a perceptual discontinuity and is not an acceptable way to present the data.

So what to try? Often in other tables distances are kept in provincial Kilometers, but always starting with Kilometers would clearly restrict the choice of an optimum starting value. I decided to try categorizing the data with a set of metric prefixes:

Table-2

Clearly the situation is worse: the numerous sets of prefixes produce at least four perceptual discontinuities in place of the single one in Asimov’s data. I gave it another try, attempting to minimize the number of prefixes and separating each set with a line skip:

Table-3
Table-4

Again, this is just a perceptual mess, with three discontinuities rather than one. I then thought about representing the lengths using a logarithm, and the knee-jerk way to do this is to use decibels. (Yes, I know it is not usual to represent lengths as decibels; please humor me for a moment.) This yielded:
While this is continuous, it is also laughably decanted of all perceptual interpretation for most people. It simply seems to hide the magnitudes involved.
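
To make the decibel experiment concrete, here is a minimal sketch of one natural way to do the conversion: express each distance in meters and take ten times its base-ten logarithm relative to a one-meter reference. The one-meter reference is my assumption, and the sample values below are rounded versions of the estimates discussed above, not the exact entries from the table image.

```python
import math

MILE_M = 1609.344           # meters in an international mile
LIGHT_YEAR_M = 9.4607e15    # meters in a light-year

# Rounded versions of the estimates discussed above (illustrative only).
distances_m = {
    "Hecataeus (flat disk)":          5_000 * MILE_M,
    "Eratosthenes (Earth)":           8_000 * MILE_M,
    "Hipparchus (Moon)":            240_000 * MILE_M,
    "Cassini/Kepler (Saturn)":  1_800_000_000 * MILE_M,
    "Struve (Vega)":                     54 * LIGHT_YEAR_M,
    "Farthest quasar (1973)":  24_000_000_000 * LIGHT_YEAR_M,
}

# Express each length in decibels relative to a one-meter reference.
for label, meters in distances_m.items():
    db_re_1m = 10.0 * math.log10(meters)
    print(f"{label:26s} {db_re_1m:6.1f} dB (re 1 m)")
```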

I started from scratch and decided that it might be best to choose the metric prefix which produces the smallest whole number possible for the smallest value in the data set; in this case that is eight Megameters. I then used only Megameters, with the standard three-digit separations. This seems to be a useful way to present large dynamic range data. One can see a large magnitude jump from 150 BC to 1671. The size of the universe was refined from 1671 to 1840, and then another large magnitude jump occurred in 1906. The values increased without another large magnitude jump until 1940, and from 1940 onward the increases again came without a quick jump in magnitude.

Table-5
Table-6

Still, the table seemed to be missing something that might increase numerical clarity. I showed the above table to Sven for some brainstorming, and he immediately had a suggestion: one could place the appropriate metric prefix at about a 30 degree angle above each of the three-digit groups. I thought that using the metric prefix-base unit abbreviation might be best. Sven also thought that some light separation lines might be a good idea. My sense, from what I’ve learned from the book The Visual Display of Quantitative Information, was that these would distract from the data. When I implemented my thoughts, I ended up with:
My eye seems to be drawn to the metric prefixes at the top, which then act as a distracting interpretive boundary while I’m looking at the data. It struck me that a better alternative might be to put the metric prefixes at the bottom:

Table-7

While this is not perfect, it seems to let one concentrate on the numbers with less distraction. My best suggestion for large dynamic range data is to:

  1. Choose the metric prefix that expresses the smallest value in the data set as the smallest possible whole number.
  2. Tabulate the data in that single prefix, separating each three-order-of-magnitude group of digits with a space, and spell out the unit at the right.
  3. Place the metric prefix-base unit abbreviations below each appropriate three-digit column (a rough code sketch of these rules follows this list).
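
Here is a rough sketch of rules 1 and 2 in Python. The prefix table and the space-separated triads are standard metric practice, but the function names and the illustrative distance values are my own; placing the prefix-unit abbreviations under each column (rule 3) is a typesetting matter rather than a computation.

```python
# A rough sketch of rules 1 and 2: pick one metric prefix for the whole data
# set based on its smallest value, then print every value in that prefix with
# spaces (never commas) separating each group of three digits.

PREFIXES = [  # (power of ten, symbol), largest first
    (24, "Ym"), (21, "Zm"), (18, "Em"), (15, "Pm"), (12, "Tm"),
    (9, "Gm"), (6, "Mm"), (3, "km"), (0, "m"),
]

def pick_prefix(smallest_value_m):
    """Return the largest prefix that still gives a whole number (>= 1) for the smallest value."""
    for power, symbol in PREFIXES:
        if smallest_value_m / 10**power >= 1:
            return power, symbol
    return 0, "m"

def triad_format(value):
    """Round to a whole number and separate three-digit groups with spaces."""
    return f"{round(value):,}".replace(",", " ")

# Illustrative values in meters (rounded conversions of the distances above).
data_m = {
    "Hecataeus":       8.0e6,    # about 5 000 miles
    "Eratosthenes":   12.9e6,    # about 8 000 miles
    "Hipparchus":     386.0e6,   # about 240 000 miles
    "Cassini (1671)": 2.9e12,    # about 1 800 000 000 miles
}

power, symbol = pick_prefix(min(data_m.values()))
for label, meters in data_m.items():
    print(f"{label:15s} {triad_format(meters / 10**power):>15s} {symbol}")
```

With these illustrative values the selected prefix is the Megameter, and the smallest entry prints as 8 Mm, matching the choice described above.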

I’ve pondered this problem for a long time, and this is the first time I’ve obtained a satisfactory form for large dynamic range data. This format may very well have been used before, but I don’t have an example (I’m sure my readers will let me know). I’m going to implement this format going forward and continue to evaluate it. The use of spaces between the metric magnitude triads allows this format to work aesthetically; the column separation is immediately apparent. If commas are inserted as triad separators, the columns merge and become very difficult to cognitively distinguish. Whether or not this is an optimum choice for large dynamic range data, it is simply not possible to create a table of this form using Ye Olde English units. It illustrates once again the superior nature of the modern version of the metric system’s units and methods.

Related essay:

Lies, Damned Lies and Scientific Notation


If you liked this essay and wish to support the work of The Metric Maven, please visit his Patreon Page and contribute. Also purchase his books about the metric system:

The first book is titled Our Crumbling Invisible Infrastructure. It is a succinct set of essays that explain why the absence of the metric system in the US is detrimental to our personal health and our economy. These essays are separately available for free on my website, but the book has them all in one place in print. The book may be purchased from Amazon here.


The second book is titled The Dimensions of the Cosmos. It takes the metric prefixes from Yotta to yocto and uses each metric prefix to describe a metric world. The book has a considerable number of color images to complement the prose. It has been receiving good reviews. I think it would be a great reference for US science teachers. It has numerous scientific factoids and anecdotes that I believe would be of considerable educational use. It is available from Amazon here.


The third book is called Death By A Thousand Cuts, A Secret History of the Metric System in The United States. This monograph explains how we have been unable to legally deal with weights and measures in the United States from George Washington’s time to the current day. This book is also available on Amazon here.

Worlds Apart or Worlds Converged?

bizarro-world

By The Metric Maven

Bulldog Edition

I once taught an internal class on electromagnetism at a large consumer electronics company. After one class, a technician with whom I worked sauntered up to me holding a small advertisement. It was for a sort of “healing crystal.” The ad claimed it could protect a person from EMF radiation. The advertising copy explained its powers, stating that it resonates at the same frequency as the Schumann Resonance, which it further claimed is sometimes called the “brainwave of the planet.” I was taken aback, as I’d never heard of the Schumann Resonance. It was still the era when one would locate the company librarian, who in turn had books describing it as a real phenomenon. It is not, however, the “brainwave of the planet.”

Takionic-Pendant

Our outer atmosphere is conductive, and so is the Earth. Between the two is an electromagnetic cavity resonator that is driven by the 50 to 100 lightning strikes that take place each second. It is possible to mathematically predict the resonance frequencies of this cavity. German physicist Winfried Otto Schumann first did this in 1952. The resonances were originally measured in 1954 by Schumann and König. The first Schumann resonance is about 7.8 Hertz. In other words, the Schumann electric field that surrounds all of us oscillates through 7.8 complete cycles each second. The strength of this field, about 300 microvolts/meter, is so minute it is a very difficult measurement for an amateur to make. The static electric field on a nice day is about 500 000 times larger than the field of the fundamental Schumann Resonance.
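
For the mathematically inclined: Schumann’s idealized, lossless cavity predicts resonance frequencies of f_n = (c / 2πa)·√(n(n+1)), where a is the Earth’s radius. The real, lossy cavity resonates lower, which is why the first mode is observed near 7.8 Hz rather than the ideal value of about 10.6 Hz. A quick sketch of the ideal values (the constants are standard figures, not measurement data):

```python
import math

C = 2.998e8              # speed of light, m/s
EARTH_RADIUS = 6.371e6   # mean radius of the Earth, m

# Idealized (lossless) Schumann resonance frequencies of the Earth-ionosphere
# cavity: f_n = (c / (2 * pi * a)) * sqrt(n * (n + 1)).  The real, lossy cavity
# resonates lower, near 7.8, 14, and 20 Hz for the first three modes.
for n in range(1, 4):
    f_ideal = (C / (2 * math.pi * EARTH_RADIUS)) * math.sqrt(n * (n + 1))
    print(f"mode {n}: {f_ideal:4.1f} Hz (ideal, lossless)")
```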

Despite the difficulties involved, I wound an iron bar with 50 000 turns of copper wire, had my fellow engineers Lapin and JV help me with the amplifier design, wrote software to process the output of a DC voltmeter, and one morning around 02:00 I averaged about 120 time measurements and found a peak at 7.8 Hz. There was a massive number of human-made spikes in the frequency data. Below is a professionally measured signal from Germany, shown in both the time and frequency domains.

Schumann_Measurement

The time signal shown on the left is choppy and squiggly. It is very hard to “see” the fundamental Schumann resonance without signal processing and viewing the resulting information in terms of frequency. The top and bottom time signals on the left are from detectors facing east-west and north-south, respectively. The spike between f2 and f3 is from an electric train almost 30 km distant. The 50 Hz spike is from Germany’s electric power grid, which operates at 50 Hz rather than the 60 Hz used in the US.
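
To give a feel for why frequency-domain averaging is needed (and roughly the kind of processing my software did with those 120 records), here is a toy numpy sketch: a weak 7.8 Hz tone buried in much stronger noise is invisible in any single record, but emerges after averaging many power spectra. The signal levels and sample rate are invented for illustration; they are not my measurement data.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100.0                       # sample rate in Hz (invented for illustration)
n = int(fs * 20.0)               # 20 second records
t = np.arange(n) / fs
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

def one_record():
    """A weak 7.8 Hz tone buried in much stronger broadband noise."""
    tone = 0.05 * np.sin(2 * np.pi * 7.8 * t)
    noise = rng.normal(scale=1.0, size=n)
    return tone + noise

# Average the power spectra of many records: the noise fluctuations smooth
# toward a flat floor, while the 7.8 Hz line stays put and rises above it.
records = 120
avg_power = np.zeros(len(freqs))
for _ in range(records):
    spectrum = np.fft.rfft(one_record())
    avg_power += np.abs(spectrum) ** 2 / records

mask = freqs > 1.0               # ignore the DC and very-low-frequency bins
peak_hz = freqs[mask][np.argmax(avg_power[mask])]
print(f"Strongest spectral line above 1 Hz: {peak_hz:.2f} Hz")   # 7.80 Hz
```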

One reference indicated the coil I was using to detect the first Schumann Resonance should not jiggle more than the diameter of an atom, or the vibration would contaminate the signal. I was amazed I had managed to measure it in my apartment.

One evening I was discussing with Lapin the possibility of science measuring gravitational waves. He asked me what I thought, and my experience with measuring Schumann Resonances resurfaced. My thought was that a device capable of measuring such a small change seemed extremely unlikely, but I wanted the engineers and scientists to try. It was worth the effort, and something interesting might be learned just devising and realizing such a measurement device.

The possibility of using gravity waves to communicate in the same way we do with electromagnetic waves was discussed in 1991 by John Kraus.[1]  The difficulty of communicating by gravitational wave appeared to be essentially insurmountable. A rotating bar will radiate gravitational waves. Kraus used a 20-meter-long, 500 Megagram rotating steel beam as an example. The maximum rate of rotation is about 270 rpm. This is just below the spin rate that would cause the steel to fly apart. It would take about 20 hours to spin the beam up to speed with a 100 kW power input (about 7.2 Gigajoules). It also takes about 20 hours to spin it down. The device would transmit no more than a single bit per day. The radiated power of the propagating gravitational wave would be only 10⁻²⁷ watts. This would be 0.001 yoctowatts, using the smallest available metric prefix. While not zero, it almost might as well be.

Gravity-Wave-Communication-Link
– From [1]
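
A quick back-of-the-envelope check of the figures quoted above (my own arithmetic, not reproduced from Kraus’s paper):

```python
# Back-of-the-envelope check of the figures quoted from Kraus's example.
spin_up_power_w = 100e3                     # 100 kW drive power
spin_up_seconds = 20 * 3600                 # about 20 hours
energy_joules = spin_up_power_w * spin_up_seconds
print(f"Spin-up energy: {energy_joules / 1e9:.1f} GJ")             # about 7.2 GJ

radiated_watts = 1e-27                      # quoted radiated gravitational-wave power
print(f"Radiated power: {radiated_watts / 1e-24:.3f} yoctowatts")  # 0.001 yW

rpm = 270
rotation_hz = rpm / 60.0                    # 4.5 rotations per second
gw_hz = 2 * rotation_hz                     # a spinning bar radiates at twice its rotation rate
print(f"Gravitational-wave frequency: {gw_hz:.0f} Hz")             # about 9 Hz
```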

The radiated gravitational waves would have a frequency of about 9 Hz and detecting this tiny amount of power impinging upon a gravitational wave antenna seems almost impossible. Some type of material that would directly convert gravity waves into an electrical signal with perfect efficiency would probably be needed.

When the measurement of a gravitational wave was announced, I was cautious. Then I saw the signal:


I had stared at enough Schumann Resonance time signals that my immediate reaction was: “Wow, that is clearly a signal, and it’s replicated on a second detector.” I was very sure they had measured a gravitational wave, and when the signal was interpreted as two massive black holes merging, it made consistent sense. I knew I had witnessed one of the most important discoveries in 21st century science. During a press conference, LIGO Co-Founder Rai Weiss stated:

I’ve been on many committees for NASA. When an engineer hears ten-to-the-minus-twenty-one they think you’re out of your mind. That’s the very first response that most people have. You’re going to measure something at ten-to-the-minus-twenty-one of anything—I don’t care what it is you’re not going to be able to do it.

This engineer would have felt the same way. When discussing a phenomenon with Zeptoworld lengths*, gravitational waves are really the only game in town. The next and last reducing prefix, yocto, describes the extent of subatomic particles such as the neutrino. Thankfully the researchers were stubborn, dedicated, and persuasive. Two LIGO sites were constructed. What is interesting is that the noise levels of the current version of LIGO were not low enough for a high expectation of signal detection. But two massive black holes rapidly rotating about one another (radiating gravitational waves like the bar example) and then merging to form a single singularity was probably not foremost in the researchers’ minds. The detection of less massive binary stars that rotate about one another was probably thought to be the first realistic candidate for gravity wave detection. The Zeptoworld of length is on the order of 1 × 10⁻²¹ meters. At the opposite magnitude extreme, 1 × 10²¹ meters, is Zettaworld. This world describes the size of galaxies, and little else that I know of exists at this dimension.

LIGO’s two detectors allowed the researchers involved to estimate the distance to the merging black holes at about 12.3 Zettameters from us. That is one incredible length for this signal to have traveled and then be detected. It really struck me as astonishing to think that we measured a length change of about a zeptometer that was induced by gravitational radiation originating a dozen Zettameters from us. This single measurement connects a Zeptoworld length to a Zettaworld length, spanning 14 metric prefixes (triads), or a factor of 1 000 000 000 000 000 000 000 000 000 000 000 000 000 000. The measurement of gravitational waves is a mind bending achievement when viewed within the metric system**.
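
The prefix counting is easy to verify; here is a trivial check using the Zeptoworld and Zettaworld scales of 10⁻²¹ meters and 10²¹ meters (the scales themselves, not the specific measured values):

```python
import math

zepto_m = 1e-21    # Zeptoworld length scale in meters
zetta_m = 1e21     # Zettaworld length scale in meters

orders = math.log10(zetta_m / zepto_m)   # 42 orders of magnitude
triads = orders / 3                      # one metric prefix step per factor of 1000
print(f"Span: a factor of 10^{orders:.0f}, i.e. {triads:.0f} metric prefixes (triads)")
```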

[1] Kraus, J. “Will Gravity-Wave Communication Be Possible?” IEEE Antennas and Propagation Magazine, Vol. 33, No. 4, August 1991

*  My book The Dimensions of the Cosmos breaks the universe into 16 metric worlds based on 16 metric prefixes. There is Kiloworld, Megaworld, Milliworld and so on. I will be using this metaphor in current and future blogs.

** LIGO has since detected a second black hole merger, on December 25th of 2015 (2015-12-26 UTC). See Grossman, L., “LIGO sees second black hole merger,” New Scientist, 2016-06-18, pp. 8-9. The new detection was from two less massive black holes. The black holes of the first detection were so massive that they revolved around one another only about 10 times before merging; in the second detection the pair completed 55 orbits before merging.

