The uncertainty principle is one of the most famous (and probably most misunderstood) ideas in physics. It tells us that there is a fuzziness in nature, a fundamental limit to what we can know about the behavior of quantum particles and, therefore, the smallest scales of nature. At these scales, the most we can hope for is to calculate probabilities for where things are and how they will behave. Unlike Isaac Newton's clockwork universe, where everything follows clear-cut laws of motion and prediction is easy if you know the starting conditions, the uncertainty principle enshrines a level of fuzziness into quantum theory.
Werner Heisenberg's simple idea tells us why atoms don't implode, how the sun manages to shine and, strangely, that the vacuum of space is not actually empty.
An early incarnation of the uncertainty principle appeared in a 1927 paper by Heisenberg, a German physicist who was working at Niels Bohr's institute in Copenhagen at the time, titled "On the Perceptual Content of Quantum Theoretical Kinematics and Mechanics". The more familiar form of the equation came a few years later, after he had further refined his thoughts in subsequent lectures and papers.
Heisenberg was working through the implications of quantum theory, a strange new way of explaining how atoms behaved that had been developed over the previous decade by physicists including Niels Bohr, Paul Dirac and Erwin Schrödinger. Among its many counter-intuitive ideas, quantum theory proposed that energy was not continuous but instead came in discrete packets (quanta), and that light could be described as both a wave and a stream of these quanta. In fleshing out this radical worldview, Heisenberg discovered a problem in the way the basic physical properties of a particle in a quantum system could be measured. In one of his regular letters to a colleague, Wolfgang Pauli, he presented the inklings of an idea that has since become a fundamental part of the quantum description of the world.
The uncertainty principle says that we cannot measure the position (x) and the momentum (p) of a particle with absolute precision. The more accurately we know one of these values, the less accurately we know the other. Multiplying together the errors in the measurements of these values (the errors are represented by the triangle symbol in front of each property, the Greek letter "delta") has to give a number greater than or equal to half of a constant called "h-bar". This is equal to Planck's constant (usually written as h) divided by 2π. Planck's constant is an important number in quantum theory, a way to measure the granularity of the world at its smallest scales, and it has the value 6.626 x 10^-34 joule seconds.
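Written out as a formula, the relation described above takes its standard modern form:

Δx × Δp ≥ ħ/2, where ħ = h/2π

Squeeze the error in position (Δx) down, and the error in momentum (Δp) must grow to keep the product above that floor. The minuscule size of ħ is why the trade-off never troubles us with everyday objects.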
One way to think about the uncertainty principle is as an extension of how we see and measure things in the everyday world. You can read these words because particles of light, photons, have bounced off the screen or paper and reached your eyes. Each photon carries with it, at the speed of light, some information about the surface it has bounced from. Seeing a subatomic particle, such as an electron, is not so simple: you might bounce a photon off it in the same way, but the photon will transfer some of its momentum to the electron in the process, changing the very motion you were trying to pin down.
The uncertainty principle is at the heart of many things that we observe but cannot explain using classical (non-quantum) physics. Take atoms, for example, where negatively charged electrons orbit a positively charged nucleus. By classical logic, we might expect the two opposite charges to attract each other, leading everything to collapse into a ball of particles. The uncertainty principle explains why this doesn't happen: if an electron got too close to the nucleus, its position in space would be precisely known and, therefore, the error in measuring its position would be minuscule. That would force the error in measuring its momentum (and hence its speed) to be enormous, so the electron could easily be moving fast enough to fly out of the atom altogether.
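To put rough numbers on that argument, here is a short back-of-the-envelope sketch in Python, an illustration of the estimate rather than anything from Heisenberg's paper. Rearranging the relation as Δp ≥ ħ/(2Δx) gives the smallest momentum spread an electron can have when confined to a region of size Δx, and from that a minimum kinetic energy:

```python
# Back-of-the-envelope estimate (illustrative only): the uncertainty
# principle, rearranged as dp >= hbar / (2 * dx), gives the minimum
# momentum spread of an electron confined to a region of size dx.
# The kinetic energy is estimated non-relativistically, which is
# fine for order-of-magnitude purposes.

HBAR = 1.054571817e-34   # reduced Planck constant (h / 2*pi), in J*s
M_E = 9.1093837e-31      # electron mass, in kg
EV = 1.602176634e-19     # one electronvolt, in joules

def min_kinetic_energy_ev(dx: float) -> float:
    """Minimum kinetic energy, in eV, of an electron confined to dx metres."""
    dp = HBAR / (2 * dx)            # smallest momentum spread the principle allows
    return dp ** 2 / (2 * M_E) / EV

for label, dx in [("an atom (~1e-10 m)", 1e-10), ("a nucleus (~1e-15 m)", 1e-15)]:
    print(f"electron confined to {label}: at least {min_kinetic_energy_ev(dx):.2g} eV")
```

Confined to a whole atom (roughly 10^-10 metres across), the electron needs only about 1 electronvolt of kinetic energy, comparable to real atomic binding energies. Squeezed down to the size of a nucleus, the minimum rockets to billions of electronvolts, far more than the electrical attraction of the nucleus could ever contain, which is why the electron never settles onto it.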