In 1801, Thomas Young demonstrated that light behaves as a wave, overthrowing Newton’s idea that light was a “corpuscle”.
Among the many reasons I chose to pursue physics was the desire to do something that would have a permanent impact. If I was going to invest so much time, energy, and commitment, I wanted it to be for something with a claim to longevity and truth. Like most people, I thought of scientific advances as ideas that stand the test of time.
My friend Anna Christina Büchmann studied English in college while I majored in physics. Ironically, she studied literature for the same reason that drew me to math and science. She loved the way an insightful story lasts for centuries. When discussing Henry Fielding’s novel Tom Jones with her many years later, I learned that the edition I had read and thoroughly enjoyed was the one she helped annotate when she was in graduate school.
Tom Jones was published two hundred and fifty years ago, yet its themes and wit resonate to this day. During my first visit to Japan, I read the far older Tale of Genji and marveled at its characters’ immediacy too, despite the thousand years that have elapsed since Murasaki Shikibu wrote about them. Homer created the Odyssey roughly two thousand years earlier. Yet notwithstanding its very different age and context, we continue to relish the tale of Odysseus’s journey and its timeless descriptions of human nature.
Scientists rarely read such old, let alone ancient, scientific texts. We usually leave that to historians and literary critics. We nonetheless apply the knowledge that has been acquired over time, whether from Newton in the seventeenth century or from Copernicus more than a hundred years earlier still. We might neglect the books themselves, but we are careful to preserve the important ideas they contain.
Science is certainly not the static statement of universal laws we all hear about in elementary school. Nor is it a set of arbitrary rules. Science is an evolving body of knowledge. Many of the ideas we are currently investigating will prove to be wrong or incomplete. Scientific descriptions inevitably change as we cross the boundaries that circumscribe what we know and venture into more remote territory where we can glimpse hints of deeper truth beyond.
Niels Bohr in 1912 was faced with a challenging choice: abandon classical physics, or abandon his belief in observed reality. Bohr wisely chose the former and assumed that classical laws don’t apply at the small distances occupied by electrons in an atom. This assumption proved to be one of the key insights that led to the development of quantum mechanics.
Once Bohr gave up Newton’s laws, at least in this limited regime, he could postulate that electrons occupy fixed energy levels, determined by a quantization condition he proposed involving a quantity called orbital angular momentum. Bohr’s quantization rule applied at these small scales, where the rules are different from those that hold at macroscopic scales, such as the Earth orbiting the Sun.
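The passage above names Bohr’s quantization condition without writing it down; for concreteness, here is its standard textbook form (a sketch I am supplying, not something stated in the text). The electron’s orbital angular momentum L is restricted to whole-number multiples of the reduced Planck constant ħ, and that restriction fixes the allowed energies of the hydrogen atom:

\[
  L = m_e v r = n\hbar, \qquad n = 1, 2, 3, \ldots
\]
\[
  E_n = -\frac{m_e e^4}{2\,(4\pi\varepsilon_0)^2 \hbar^2}\,\frac{1}{n^2} \approx -\frac{13.6\ \text{eV}}{n^2}
\]

The integer n labels the fixed energy levels Bohr postulated; an electron can occupy one of these levels but nothing in between.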
Technically, quantum mechanics still applies to these larger systems as well, but the effects are far too small to measure or notice. When you observe the orbit of the Earth, or of any macroscopic object for that matter, quantum mechanics can be ignored. The effects average out in all such measurements, so that any prediction you make agrees with its classical counterpart. For measurements on macroscopic scales, classical predictions remain such extremely good approximations that you can’t tell that quantum mechanics is in fact the deeper underlying theory.
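A rough order-of-magnitude estimate makes the point. Applying Bohr-style quantization to the Earth’s orbit, with round values for the Earth’s mass, orbital speed, and orbital radius (numbers I am supplying for illustration; they appear nowhere in the text), gives an astronomically large quantum number:

\[
  L_\oplus \approx M v r \approx (6\times10^{24}\ \text{kg})\,(3\times10^{4}\ \text{m/s})\,(1.5\times10^{11}\ \text{m}) \approx 3\times10^{40}\ \text{J·s}
\]
\[
  n \approx \frac{L_\oplus}{\hbar} \approx \frac{3\times10^{40}\ \text{J·s}}{10^{-34}\ \text{J·s}} \approx 3\times10^{74}
\]

With n of order 10⁷⁴, the fractional spacing between adjacent allowed orbits is of order 10⁻⁷⁴, hopelessly far below anything a measurement of the Earth’s orbit could detect.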
Classical predictions are analogous to the words and images on an extremely high-resolution computer screen. Underlying them are the many pixels that are like the quantum mechanical atomic substructure. But the images or words are all we generally need (or want) to see.
Quantum mechanics constitutes a change in paradigm that becomes apparent only at the atomic scale. Despite his radical assumption, Bohr didn’t have to abandon what was known before. He didn’t assume classical Newtonian physics was wrong. He simply assumed that classical laws cease to apply for electrons in an atom. Macroscopic matter, which consists of so many atoms that quantum effects can’t be isolated, obeys Newton’s laws, at least at the level at which anyone could measure the success of their predictions. Newton’s laws are not wrong. We don’t abandon them in the regime in which they apply. But at the atomic scale, Newton’s laws had to fail. And they failed in an observable and spectacular fashion that led to the development of the new rules of quantum mechanics.