My Grandfather’s Science

[Image: Velociraptor by Ben Townsend]

The pace of technology is breathtaking. For that reason we’re tempted to believe our own time to be the best of times, the worst, the most wise and most foolish, most hopeful and most desperate, etc. And so we insist that our own science and technology be received, for better or worse, in the superlative degree of comparison only. For technology this may be valid. For science, technology’s foundation, perhaps not. Some perspective is humbling.

This may not be your grandfather’s Buick – or his science. This post contemplates my grandfather’s science – the mind-blowing range of scientific progress during his life, a range that may well dwarf the scientific progress of the next century. In terms of altering the way we view ourselves and our relationship to the world, the first half of the 20th century dramatically outpaced the second half.

My grandfather was born in 1898 and lived through nine decades of the 20th century. That is, he saw the first manned airplane flight and the first man on the moon. He also witnessed scientific discoveries that literally changed worldviews.

My grandfather was fascinated by the Mount Wilson observatory. The reason was the role it had played in one of several scientific discoveries of his youth that rocked not only scientists’ view of nature but everyone’s view of themselves and of reality. These were cosmological blockbusters with metaphysical side effects.

When my grandfather was a teen, the universe was the Milky Way. The Milky Way was all the stars we could see, and it included some cloudy patches called nebulae. Edwin Hubble studied these nebulae when he arrived at Mount Wilson in 1919. Using the brand-new Hooker Telescope at Mt. Wilson, Hubble located Cepheid variables in several nebulae. Cepheids are “standard candle” stars whose known intrinsic brightness lets astronomers measure their distance from earth. Hubble studied the Andromeda Nebula, as it was then known. He concluded that this nebula was not glowing gas within the Milky Way but a separate galaxy far away. Really far.
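The standard-candle logic is simple enough to sketch in a few lines of Python. The calibration constants below are rough modern values, not Hubble’s 1920s figures, so treat this as an illustration of the method rather than a reproduction of his measurement:

```python
import math

def cepheid_distance_ly(period_days, apparent_mag):
    """Estimate distance to a Cepheid from its pulsation period and
    apparent magnitude, using an approximate period-luminosity law."""
    # Leavitt law (one modern V-band calibration): longer period
    # means intrinsically brighter star (more negative magnitude).
    abs_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    # Distance modulus: m - M = 5*log10(d_parsecs) - 5
    d_parsecs = 10 ** ((apparent_mag - abs_mag + 5) / 5)
    return d_parsecs * 3.26156  # parsecs -> light years

# A Cepheid with a 30-day period seen at a faint magnitude 18.5 sits
# well outside the Milky Way -- millions of light years away:
print(f"{cepheid_distance_ly(30, 18.5):,.0f} light years")
```

Comparing how bright the star looks with how bright its period says it must really be is the whole trick; the fainter it appears, the farther it is.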

In one leap, the universe grew from our little galaxy to about 100,000,000 light years across. That huge number had been argued before but was ruled out in the “Great Debate” between Shapley and Curtis in April 1920. Against earlier arguments that Andromeda was a galaxy, Harvard University’s Harlow Shapley had convinced most scientists that Andromeda was just some glowing gas. Assuming all galaxies to be roughly the same size, Shapley noted that Andromeda would have to be 100 million light years away to occupy the angular size we observe. Most scientists simply could not fathom a universe that big. By 1925 Hubble and his telescope on Mt. Wilson had fixed all that.

Over the next few decades Hubble’s observations showed galaxies far more distant than Andromeda – millions of them. Stranger yet, they showed that the universe was expanding, something that even Albert Einstein did not want to accept.
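Hubble’s expansion law fits in a couple of lines, and inverting it gives the rough age of the universe that so unsettled steady-state believers. The Hubble constant below is an approximate modern value (Hubble’s own 1929 estimate was several times larger), so the numbers are illustrative:

```python
MPC_IN_KM = 3.0857e19   # kilometres in a megaparsec
H0 = 70.0               # Hubble constant, km/s per megaparsec (modern approx.)

def recession_velocity_km_s(distance_mpc):
    """Hubble's law: recession speed grows linearly with distance."""
    return H0 * distance_mpc

def hubble_time_years():
    """Rough age of the universe: run the expansion backwards at its
    current rate until everything converges on a single point."""
    seconds = MPC_IN_KM / H0
    return seconds / (365.25 * 24 * 3600)

print(recession_velocity_km_s(100))   # a galaxy 100 Mpc away: 7000.0 km/s
print(f"{hubble_time_years():.1e}")   # roughly 1.4e10 years
```

That second number – about fourteen billion years – is the “beginning of time” that a Milky-Way-sized, steady-state worldview had no room for.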

The big expanding universe so impressed my grandfather that he put Mt. Wilson on his bucket list. His first trip to California in 1981 included a visit there. Nothing known to us today comes close to the cosmological, philosophical and psychological weight of learning, as a steady-state Milky Way believer, that there was a beginning of time and that space is stretching. Well, nothing except the chaotic inflation theory also proposed during my grandfather’s life. The Hubble-era universe grew by three orders of magnitude. Inflation theory asks us to accept hundreds of orders of magnitude more. Popular media doesn’t push chaotic inflation, despite its mind-blowing implications. This could stem from our lacking the high school math necessary to grasp inflation theory’s staggering numbers. The Big Bang and Cosmic Inflation will be tough acts for the 21st century to follow.

Another conceptual hurdle for the early 20th century was evolution. Yes, everyone knows that Darwin wrote in the mid-1800s; but many are unaware of the low status the theory of evolution had in biology at the turn of the century. Biologists accepted that life derived from a common origin, but the mechanism Darwin proposed seemed impossible. In the late 1800s the thermodynamic calculations of Lord Kelvin (William Thomson, an old-earth creationist) conflicted with Darwin’s model of the emergence of biological diversity. Thomson’s 50-million-year-old earth couldn’t begin to accommodate prokaryotes, velociraptors and hominids. Additionally, Darwin didn’t have a discrete (Mendelian) theory of inheritance to allow retention of advantageous traits. The “blending theory of inheritance” then in vogue let such features regress toward the previous mean.

Darwinian evolution was rescued in the early 1900s by the discovery of radioactive decay. In 1913 Arthur Holmes, using radioactive decay as a marker, showed that certain rocks on earth were two billion years old. Evolution now had time to work. At about the same time, Mendel’s 1865 paper was rediscovered. Following Mendel, William Bateson proposed the term genetics around 1905, and Wilhelm Johannsen the word gene in 1909, to describe the mechanism of inheritance. By 1920, Darwinian evolution and the genetic theory were two sides of the same coin. In just over a decade, 20th century thinkers let scientific knowledge change their self-image and their relationship to the world. The universe was big, the earth was old, and apes were our cousins.
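Holmes’s method reduces to the decay law: given a half-life and the ratio of daughter atoms to remaining parent atoms locked in a mineral, the age follows directly. A minimal sketch, with an illustrative uranium-lead ratio chosen to land near Holmes’s two-billion-year rocks:

```python
import math

def radiometric_age_years(half_life_years, daughter_to_parent_ratio):
    """Age from the decay law: t = ln(1 + D/P) / lambda,
    where lambda = ln(2) / half-life."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_to_parent_ratio) / decay_constant

# Uranium-238 decays (ultimately) to lead-206 with a half-life of
# about 4.47 billion years. A rock with a lead/uranium ratio of 0.36
# comes out roughly two billion years old:
print(f"{radiometric_age_years(4.47e9, 0.36):.2e} years")
```

A sanity check on the formula: when the daughter/parent ratio reaches exactly 1, the computed age equals one half-life, as it should.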

Another “quantum leap” our recent ancestors had to make was quantum physics. It’s odd that we say “quantum leap” to mean a big jump. Quanta are extremely small, as are the quantum jumps of electrons. Max Planck kicked off the concept of quanta in 1900. It got a big boost in 1905 from Einstein. Everyone knows that Einstein revolutionized science with the idea of relativity in 1905. But that same year – in his spare time – he also published papers on Brownian motion and the photoelectric effect (illuminated metals give off electrons). In explaining Brownian motion, Einstein argued that atoms are real, not just a convenient model for chemistry calculations, as was commonly held. In some ways the last topic, the photoelectric effect, was the most profound. Like many had done with atoms, Planck considered quanta a convenient fiction. Einstein’s work on the photoelectric effect, for which he later won the Nobel Prize, made quanta real. This was the start of quantum physics.
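Einstein’s photoelectric relation is plain arithmetic: each photon carries energy E = hf, and an electron escapes the metal only if that energy exceeds the metal’s work function – no matter how bright the light. A sketch with textbook constants (the work function below is an approximate value for zinc):

```python
PLANCK_H = 6.626e-34   # Planck's constant, J*s
LIGHT_C = 2.998e8      # speed of light, m/s
EV = 1.602e-19         # joules per electron-volt

def photoelectron_energy_ev(wavelength_m, work_function_ev):
    """Max kinetic energy of an ejected electron, in eV. A negative
    result means the light cannot eject electrons at all."""
    photon_ev = PLANCK_H * LIGHT_C / wavelength_m / EV
    return photon_ev - work_function_ev

# Ultraviolet light (200 nm) on zinc (work function ~4.3 eV) ejects
# electrons; red light (700 nm) cannot, however intense the beam:
print(photoelectron_energy_ev(200e-9, 4.3) > 0)   # True
print(photoelectron_energy_ev(700e-9, 4.3) > 0)   # False
```

That all-or-nothing threshold, depending on frequency rather than intensity, is what classical wave theory couldn’t explain and what made quanta real.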

Relativity told us that light bends and that matter warps space. This was weird stuff, but at least it spared most of the previous century’s theories – things like the atomic theory of matter and electromagnetism. Quantum physics uprooted everything. It overturned the conceptual framework of previous science and even took a bite out of basic rationality. It told us that reality at small scales is nothing like what we perceive. It said that everything – light, perhaps even time and space – is ultimately discrete, not continuous; nature is digital. Future events can affect the past, and the ball can pass through the wall. Beyond the weird stuff, quantum physics makes accurate and practical predictions. It also makes your iPhone work. My grandfather didn’t have one, but his transistor radio was quantum-powered.

Technology’s current heyday is built on the science breakthroughs of a century earlier. If that seems like a stretch, consider the following. Planck invented the quantum in 1900, Einstein the photon in 1905, and von Lieben the vacuum tube in 1906. Schwarzschild predicted black holes in 1916, a few years before Hubble found foreign galaxies. Georges Lemaître proposed a Big Bang in 1927, Dirac antimatter in 1928, and Chadwick the neutron in 1932. Ruska invented the electron microscope the following year, two years before nylon was invented. In 1942 Fermi tested controlled nuclear reactions. Avery identified DNA as the carrier of genes in 1944; Crick and Watson found the double helix in 1953. In 1958 Kilby invented the integrated circuit. Two years later Maiman had a working laser, just before the Soviets put a man in orbit. Gell-Mann invented quarks in 1964. Recombinant DNA, neutron stars, and interplanetary probes soon followed. My grandfather, born in the 1800s, lived to see all of this, along with personal computers, cell phones and GPS. He liked science and so should you, your kids and your school board.

While recent decades have seen marvelous inventions and cool gadgets, conceptual breakthroughs like those my grandfather witnessed are increasingly rare. It’s time to pay the fiddler. Science education is in crisis. Fewer than half of New York City’s high schools offer a class in physics, and only a third of US high school students take one. Women, African Americans and Latinos are grossly underrepresented in the hard sciences.

Political and social science don’t count. Learn physics, kids. Then teach it to your parents.

  1. #1 by Anonymous on May 31, 2016 - 9:25 pm

    Great insight, as always, Bill. Thanks for sharing.

  2. #2 by cathyc on May 31, 2016 - 9:52 pm

    I was with you until the last three sentences. Speaking as one in the areas you don’t think count, there are lots of reasons to explain the decline and also reasons to be wary about telling your children that this is what they should do.

    Not least of these is that so much work is done in huge groups in which the individual young scientist is nothing but a cog, with no possibility of doing anything interesting. When you get your name on a paper it is with a thousand others. Of the many disillusioned young scientists I’ve seen leaving Cern, one of them actually said and I am quoting ‘What they do there isn’t science, it’s anti-science.’ Your life’s work is always worrying about your next year’s contract. Mostly Cern only wants to employ people at the start of their careers so that it pays out the minimum in wages.

    The theory still stands, as far as I know, that people in these areas have lost their creativity, their capacity to do important work by the time they are thirty or so – a bit older since they aren’t doing maths? If that’s the case, what possible reason is there to encourage kids to make this what they do?

    The system stinks, but it takes a political or social scientist to make that point in the right way and we, of course, don’t count.

    • #3 by Bill Storage on June 1, 2016 - 10:52 pm

      I didn’t mean to say that political science has no use. Rather political science does not count as a “hard” science and does not reduce the need for high school physics education. In the sense of the word “science” that Americans are familiar with, I doubt that most political science counts as science at all. If science requires some combination of iteration upon hypothesis and experimentation, hypothesis testing, vigorous skepticism and falsification attempts, I don’t see how political science could, regardless of the intent of its practitioners, ever meet those requirements. Saying that political science does not deserve the special epistemic status given to hard science should be no insult to political scientists.

      That’s assuming those who undertake it are even trying in any sincere way to do science. In my experience, however, much of political science is highly prone to eternal revision of hypotheses to make them consistent with observation. As Popper noted about Marx, when Marx’s predictions failed, Marxists saved the theory from falsification by the addition of endless ad hoc hypotheses. Eddington’s quote about “facts being confirmed by theory” comes to mind.

      If you mean to say that academic and institutional physics (e.g., Cern) is in need of an overhaul, I certainly agree. But that doesn’t reduce the need for high school kids to learn physics, whether they aspire to particle physics, engineering, political science or auto repair. High school physics teaches (or should teach) how things work in the physical world. If the public and its government knew basic physics, we might hear fewer grossly ignorant proposals for physics-defying energy supplies, sanitation programs, water distribution and transportation initiatives.

      Physics knowledge without creativity leads to poor solutions, but creativity without physics is alchemy and astrology.

  3. #4 by Steve Wallis on June 1, 2016 - 5:33 am

    My Grandfather’s Social Science

    Our collective decisions for funding education (indeed, the very importance of education) are based on our mental models, theories, and policies. Generally, those may be understood as “conceptual systems.” Those systems represent our understanding of the world. And, consequently, our ability to make effective decisions. Our ability to develop more effective/useful conceptual systems is based on the social sciences.

    Over the past century, the social sciences (including economics, sociology, psychology, business, policy) do not seem to have advanced significantly. Scholars publish; however, the models they suggest are often deemed useless by those outside the ivory tower of academia. So, conceptual systems are still made the old-fashioned way. Politicians make speeches and cut deals in the back room to meet the short-term goals and limited aspirations of whatever groups wield the most power.

    To compare… decisions for placing a satellite in orbit are based on highly useful conceptual systems (e.g. Newton’s laws of motion); decisions for funding are based on relatively useless conceptual systems (e.g. Maslow’s theories on stages of development). Recently, we’ve developed methods for evaluating the systemic structure of conceptual systems. We can objectively and rigorously compare Newton with Maslow. What we find is that the conceptual systems of physics score 100% (and are useful 100% of the time).

    In contrast, the conceptual systems found in the social sciences score around 20%. Interestingly, we find ourselves succeeding about 20% of the time. A recent study of national policy failure found a 17% success rate. Organizational change efforts (such as TQM, BPR, & OD) only work about 20% of the time. The methodology for evaluating those systems is Integrative Propositional Analysis (IPA).

    Bottom line… If we want to make more effective decisions, we need to develop conceptual systems with a higher level of systemic structure. Oh, by the way, that approach also improves collaboration and transparency (good for buy-in and accountability). While that approach is not easy, it is not very difficult either (we’ve even created a gamified version for leaders, communities, and even scholars). IPA may be easily learned and applied. A bit more depth here:

  4. #5 by sstorage31 on June 1, 2016 - 10:23 am

    Bill, You get an A+ on this piece; excellent. Even with my limited and inferior education, I think that I understand and agree with you. I love your use of your Grandfather.

  5. #6 by Stephen Christie on January 22, 2017 - 12:56 am

    Thanks for your article. I’m intrigued by the idea that scientific progress has slowed. It’s possible that you’re right about that, but I’m sceptical. I would struggle to list as many breakthroughs in recent decades as you have listed for earlier decades. But my suspicion is that we are always biased against recognising recent progress as breakthroughs. I wonder if in 1930 your grandfather might have bemoaned lack of recent progress, compared to the golden era of 1850-1899. If, however, you are right, and progress has slowed, then we have a serious problem. We face several existential threats – problems that we need to solve – and progress in science and politics is the only way to solve them.
