Does Randomness Exist? From Epicurus's Swerve to Laplace's Demon
Around 300 BCE, the Greek philosopher Epicurus had a problem. His intellectual predecessor Democritus had argued that the universe was nothing but atoms moving through empty space, colliding and combining according to fixed laws. It was elegant, mechanical, and completely deterministic. It also left no room for free will, novelty, or anything genuinely new happening in the universe.
So Epicurus introduced a small, radical idea: sometimes, for no reason at all, an atom swerves. The Roman poet Lucretius later named this the clinamen, a tiny, unpredictable deviation from the atom's determined path. This random swerve, Epicurus argued, was what made free will possible. Without it, we'd all be clockwork, our thoughts and choices as predetermined as the trajectory of a falling stone.
Twenty-three centuries later, we're still arguing about whether that swerve is real.
The Clockwork Universe
The idea that the universe runs like a machine reached its peak in 1814, when the French mathematician Pierre-Simon Laplace proposed a thought experiment. Imagine, he said, an intellect vast enough to know the position and momentum of every particle in the universe. Such a being could, using the laws of physics, calculate the entire future and reconstruct the entire past. Nothing would be uncertain. The future would be as clear as the past.[1]
This hypothetical being, later called Laplace's Demon, represents the ultimate expression of determinism: if the universe follows fixed laws, and if you had complete information, randomness would be revealed as an illusion. What we call "chance" would just be a confession of ignorance. The coin flip isn't random; we just don't know enough about the forces acting on the coin to predict the outcome.
For over a century, physics seemed to support this view. Newton's laws were deterministic. Given initial conditions, you could predict where a planet would be in a thousand years. The universe appeared to be a vast, intricate clock, and randomness was just a word for the gears we couldn't see.
This worldview had consequences beyond physics. If everything is determined, what does that mean for moral responsibility? If your decisions are the inevitable result of prior causes stretching back to the Big Bang, can you really be blamed for anything? Epicurus had introduced the swerve precisely to avoid this problem. Without genuine randomness, he believed, there could be no genuine choice.
Quantum Mechanics Breaks the Clock
In the 1920s, the clockwork universe shattered. Quantum mechanics revealed that at the subatomic level, nature appears to be fundamentally random. When you measure the spin of an electron, the outcome isn't determined by any prior state. It's genuinely probabilistic. Not "we don't know enough to predict it" probabilistic, but "there is nothing to know" probabilistic.[2]
Einstein famously resisted this conclusion. "God does not play dice with the universe," he is often quoted as saying; the line paraphrases a 1926 letter to Max Born. He believed quantum randomness was a sign that the theory was incomplete, that there must be hidden variables: deeper deterministic laws we hadn't found yet.
In 1964, physicist John Bell devised a way to test whether hidden variables could explain quantum behavior. The experiments, conducted repeatedly since the 1970s and earning the 2022 Nobel Prize in Physics, consistently show that no local hidden variable theory can reproduce quantum predictions.[3] Nature, at its most fundamental level, appears to involve genuine randomness. The dice are real.
This doesn't settle the philosophical debate entirely. Some interpretations of quantum mechanics (like the many-worlds interpretation) restore determinism at a higher level. But the standard interpretation, and the one most working physicists operate under, accepts that quantum events are irreducibly random.
Chaos: Deterministic but Unpredictable
Between the clockwork universe and quantum randomness lies a third possibility that complicates things further: chaos theory.
In 1961, meteorologist Edward Lorenz discovered that tiny differences in initial conditions could produce wildly different weather outcomes. Restarting a computer simulation partway through a run, he typed in a rounded value, 0.506 instead of 0.506127, and the simulation produced a completely different weather pattern.[4]
Chaotic systems are fully deterministic. They follow precise mathematical laws. Laplace's Demon, with perfect information, could predict them. But in practice, you can never have perfect information. Measurements always have some imprecision, and in chaotic systems, that imprecision grows exponentially over time. A butterfly's wing in Brazil really could, through a chain of amplifying effects, influence a tornado in Texas.
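To see how fast that imprecision compounds, here is a minimal sketch using the logistic map, a one-line chaotic system standing in for Lorenz's far more complicated weather model; the two starting values echo his rounding:

```python
# Two runs of the same deterministic rule, the logistic map
# x_{n+1} = r * x_n * (1 - x_n), from starting points that differ
# in the sixth decimal place (echoing Lorenz's rounding).

def logistic_trajectory(x0: float, r: float = 4.0, steps: int = 50) -> list[float]:
    """Iterate the logistic map from x0; r = 4.0 puts it in the chaotic regime."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.506127)
b = logistic_trajectory(0.506)  # the "rounded" initial condition

for n in (0, 5, 10, 15, 20, 25):
    print(f"step {n:2d}:  {a[n]:.6f}  vs  {b[n]:.6f}   gap {abs(a[n] - b[n]):.6f}")
```

The gap roughly doubles each step; by around step 20 the two runs bear no resemblance to each other, even though not a single random number was used.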
Chaos theory reveals a category that Epicurus and Laplace didn't consider: systems that are deterministic in principle but random in practice. The laws are fixed, but the outcomes are unpredictable because you'd need infinite precision to predict them. This is epistemic randomness, randomness born from the limits of knowledge rather than from the nature of reality itself.
For technology, this distinction matters enormously. Most of the "randomness" we use in software is closer to chaos than to quantum mechanics. It's deterministic at its core but unpredictable in practice, and that's usually good enough.
The Two Kinds of Randomness
Philosophy distinguishes between two fundamentally different kinds of randomness, and this distinction runs through every technological application.
Ontological randomness is randomness that exists in the world itself. Quantum mechanics suggests this is real. There is no deeper explanation, no hidden mechanism. The electron's spin is not determined before measurement. The randomness is in the physics, not in our heads.
Epistemic randomness is randomness that exists in our knowledge. The coin flip is deterministic; we just can't track all the variables. A shuffled deck of cards follows the laws of physics perfectly; we simply don't know the exact sequence of forces that produced the order. The randomness is in our ignorance, not in the world.
For most of human history, all randomness was assumed to be epistemic. Laplace's Demon could, in principle, predict everything. Quantum mechanics introduced the possibility that some randomness is ontological, baked into the fabric of reality.
Technology lives in the gap between these two. Computers are deterministic machines. They execute instructions in a fixed sequence. They cannot produce ontological randomness on their own. Every "random" number generated by software is actually pseudorandom: produced by a deterministic algorithm that creates sequences which look random but are entirely predictable if you know the algorithm and its starting state.
This is Laplace's Demon in miniature. The pseudorandom number generator is a tiny clockwork universe. Know the seed, and you know every number it will ever produce.
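Here is that miniature clockwork made concrete: a linear congruential generator, one of the simplest PRNG designs (the constants below are the well-known ones from Numerical Recipes, but any LCG behaves the same way):

```python
# A linear congruential generator (LCG): a deterministic "clockwork" PRNG.

class LCG:
    def __init__(self, seed: int):
        self.state = seed  # the seed fully determines everything that follows

    def next(self) -> int:
        self.state = (1664525 * self.state + 1013904223) % 2**32
        return self.state

g1 = LCG(42)
g2 = LCG(42)
print([g1.next() % 100 for _ in range(6)])
print([g2.next() % 100 for _ in range(6)])  # identical: same seed, same "randomness"
```

Real generators like the Mersenne Twister are far more elaborate, but the principle is identical: the seed is the whole universe.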
Why Technology Forces the Question
For most practical purposes, pseudorandomness works. Shuffling a playlist, distributing load across servers, running a Monte Carlo simulation: these don't require genuine randomness. They require sequences that are statistically well-distributed and hard to predict in practice.
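A quick sketch of why pseudorandomness suffices here: a Monte Carlo estimate of pi using Python's built-in generator, where a fixed (arbitrary) seed even makes the "random" result reproducible, which is often exactly what you want in a simulation:

```python
import random

def estimate_pi(samples: int = 200_000, seed: int = 7) -> float:
    """Estimate pi as 4 * (fraction of random points in the unit square
    that fall inside the quarter circle of radius 1)."""
    rng = random.Random(seed)  # pseudorandom, and reproducible on purpose
    inside = sum(
        rng.random() ** 2 + rng.random() ** 2 <= 1.0
        for _ in range(samples)
    )
    return 4 * inside / samples

print(estimate_pi())  # about 3.14, and bit-for-bit identical on every run
```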
But some applications demand more. Cryptography, for instance, depends on randomness that an attacker cannot predict or reproduce. If the random numbers used to generate encryption keys are pseudorandom, and if an attacker can guess or reconstruct the seed, the encryption is broken. The entire security model collapses. This isn't theoretical: the history of cryptographic failures is filled with cases where "random" turned out to be predictable.[5]
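In Python, for instance, this is the difference between the seedable random module and the secrets module, which draws entropy from the operating system. A deliberately insecure sketch of the failure mode, with a made-up timestamp standing in for a guessable seed:

```python
import random
import secrets

# BAD: a seedable PRNG with a guessable seed. Anyone who guesses the seed
# (here, a plausible Unix timestamp) can regenerate the "secret" key exactly.
rng = random.Random(1700000000)
weak_key = rng.getrandbits(128).to_bytes(16, "big")

# BETTER: entropy drawn from the operating system, designed to be
# unpredictable to an outside observer.
strong_key = secrets.token_bytes(16)

print("weak:  ", weak_key.hex())    # reproducible from the seed alone
print("strong:", strong_key.hex())  # different on every run
```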
Fairness is another domain where the quality of randomness matters. When a lottery, a jury selection, or a draft order depends on random selection, the legitimacy of the process depends on the randomness being genuine, or at least genuinely unpredictable. A "random" selection that can be gamed or predicted isn't random in any meaningful sense.
And in machine learning, randomness plays a role that would have fascinated Epicurus. Neural networks are initialized with random weights, trained using random subsets of data, and regularized by randomly disabling neurons. The randomness isn't a limitation; it's what makes learning possible. Without the swerve, the network gets stuck. Noise, it turns out, is a feature.
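A rough numpy sketch of those three injections of noise (the shapes, probabilities, and sizes here are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded here only so the sketch is reproducible

# 1. Random initialization: if every weight started equal, every neuron in a
#    layer would compute, and keep computing, exactly the same thing.
weights = rng.normal(scale=0.1, size=(8, 4))

# 2. Stochastic mini-batches: each training step sees a random slice of the data.
batch = rng.choice(10_000, size=32, replace=False)

# 3. Dropout: randomly silence neurons during training, rescaling the rest
#    so the expected activation is unchanged.
activations = rng.normal(size=8)
keep = rng.random(8) > 0.5
dropped = activations * keep / 0.5

print(weights.shape, batch[:4], np.round(dropped, 2))
```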
Distributed systems use randomness to break symmetry and resolve conflicts. When multiple servers need to coordinate without a central authority, randomized algorithms provide probabilistic guarantees that deterministic approaches often can't match.
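A standard example is randomized exponential backoff: clients that collide wait a random, exponentially growing interval before retrying. A minimal sketch of the "full jitter" variant (the base and cap values are illustrative):

```python
import random

def full_jitter_backoff(attempt: int, base: float = 0.1, cap: float = 10.0) -> float:
    """Wait a random interval in [0, min(cap, base * 2**attempt)] before retrying.

    If every client waited the same deterministic interval, clients that
    collided once would all retry at the same moment and collide again;
    the random draw breaks that symmetry.
    """
    return random.uniform(0.0, min(cap, base * 2 ** attempt))

for attempt in range(5):
    print(f"attempt {attempt}: sleep {full_jitter_backoff(attempt):.3f}s")
```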
In each of these domains, the philosophical question of whether randomness is real or merely ignorance has practical consequences. The answer determines whether your encryption is secure, whether your lottery is fair, whether your AI can learn, and whether your distributed system can reach consensus.
The Swerve in the Machine
Epicurus introduced the swerve because a deterministic universe seemed to leave no room for freedom, creativity, or genuine novelty. We inject randomness into our software for remarkably similar reasons.
Deterministic systems are predictable, which is often a virtue. But predictability can also be a weakness. A predictable encryption key is a broken encryption key. A predictable neural network is an overfitted neural network. A predictable server selection algorithm is a recipe for cascading failures.
Randomness, whether ontological or epistemic, whether quantum or pseudorandom, introduces the unpredictability that makes systems secure, fair, robust, and capable of learning. The swerve isn't a flaw in the machine. It's what keeps the machine from getting stuck.
The ancient debate about whether randomness is real remains open. Quantum mechanics suggests it is. Determinists argue it's an illusion. Chaos theory offers a middle path: deterministic but unpredictable.
But technology has added something new to the conversation. We've discovered that randomness, real or simulated, is useful. It solves problems that determinism alone cannot. We don't just tolerate randomness; we engineer it, calibrate it, and depend on it.
Laplace imagined a demon who could predict everything. We build systems that work precisely because they can't be predicted. The question isn't whether the swerve is real. The question is what happens when you take it away.
References
[1] Pierre-Simon Laplace, A Philosophical Essay on Probabilities, translated by F.W. Truscott and F.L. Emory, Dover Publications, 1951 (originally published 1814). https://www.gutenberg.org/ebooks/58881
[2] Werner Heisenberg, "Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik," Zeitschrift für Physik, 43, 172–198, 1927. https://link.springer.com/article/10.1007/BF01397280
[3] "The Nobel Prize in Physics 2022," The Nobel Foundation, October 4, 2022. https://www.nobelprize.org/prizes/physics/2022/summary/
[4] Edward Lorenz, "Deterministic Nonperiodic Flow," Journal of the Atmospheric Sciences, 20(2), 130–141, 1963. https://philpapers.org/rec/LORDNF
[5] Nadia Heninger et al., "Mining Your Ps and Qs: Detection of Widespread Weak Keys in Network Devices," USENIX Security Symposium, 2012. https://www.usenix.org/conference/usenixsecurity12/technical-sessions/presentation/heninger