Simulation theory is the idea that our entire reality, including everything we see, feel, and experience, could be a sophisticated computer program running on some advanced civilization’s hardware. It sounds like science fiction, but the concept has been taken seriously by philosophers, physicists, and computer scientists for over two decades. The most influential version was formalized in 2003 by philosopher Nick Bostrom, who argued that at least one of three unsettling possibilities must be true.
Bostrom’s Three Possibilities
Bostrom’s simulation argument isn’t a claim that we definitely live in a simulation. It’s a logical trilemma: one of three statements must be true, and we can’t easily determine which one. First, nearly all civilizations go extinct before reaching a technological stage advanced enough to create realistic simulations. Second, advanced civilizations that do survive almost universally choose not to run simulations of their ancestors. Third, we are almost certainly living inside a simulation right now.
The argument’s power comes from its structure. If advanced civilizations can and do run simulations, they could run millions or billions of them. In that scenario, simulated beings would vastly outnumber “real” ones, so it would be overwhelmingly likely, on a simple count of observers, that any given conscious experience, including yours, exists inside a simulation rather than in base reality. The only escape from that conclusion is if civilizations either can’t reach that technological level or consistently choose not to use it.
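The counting step can be made concrete with a small back-of-the-envelope calculation. The sketch below is a simplified illustration of the argument’s arithmetic, not a reproduction of Bostrom’s own formula, and the input numbers are arbitrary assumptions.

```python
# Toy version of the simulation argument's counting step.
# All inputs are illustrative assumptions, not measured values.

def fraction_simulated(frac_civs_that_simulate, sims_per_civ, observers_per_world=1.0):
    """Fraction of all observers who live in a simulated world, assuming each
    simulation hosts roughly as many observers as one base reality."""
    simulated = frac_civs_that_simulate * sims_per_civ * observers_per_world
    real = observers_per_world
    return simulated / (simulated + real)

# Even if only 1% of advanced civilizations run 1,000 simulations each,
# the overwhelming majority of observers end up being simulated.
print(fraction_simulated(0.01, 1_000))   # ~0.91
print(fraction_simulated(0.0, 1_000))    # 0.0 -> Bostrom's first two possibilities
```

The point of the exercise is only that the conclusion flips on the first factor: if no surviving civilization ever runs simulations, the fraction is zero; if even a few do, it races toward one.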
Why Some Physicists Take It Seriously
Several features of our universe have struck researchers as oddly consistent with a computational system. Physicist John Archibald Wheeler proposed a concept he called “it from bit,” suggesting that physical reality may ultimately arise from information, specifically from yes-or-no binary outcomes in quantum measurements. If the universe is fundamentally informational rather than material, that’s at least compatible with the idea that it could be running on some form of processor.
A 2023 paper published in AIP Advances by physicist Melvin Vopson explored what he calls the “second law of infodynamics.” While the familiar second law of thermodynamics says that the entropy (disorder) of an isolated physical system never decreases, Vopson found that information entropy in systems tends to decrease over time, reaching a minimum value at equilibrium. He argues this pattern looks like built-in data compression, exactly the kind of optimization you’d design into a simulation to reduce computational demands and storage requirements. The universe, in other words, appears to be minimizing its own information content the way a well-engineered program would.
Vopson’s reasoning goes like this: the total entropy of the universe stays constant during its expansion. Physical entropy is always increasing. So something else must be decreasing to balance the equation, and that something is information entropy. He contends this universal pattern of information optimization “points to the fact that the entire universe appears to be a simulated construct.”
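The trend Vopson describes can be stated with Shannon’s information entropy, H = −Σ p·log₂(p). The sketch below is only a toy numerical illustration of a distribution whose information entropy falls over time; the snapshot values are invented and have nothing to do with Vopson’s actual measurements.

```python
import math

def shannon_entropy(probs):
    """Shannon information entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy "system" whose state distribution sharpens step by step.
# Illustrates only the direction of the trend Vopson reports.
snapshots = [
    [0.25, 0.25, 0.25, 0.25],   # maximally uncertain: H = 2.00 bits
    [0.40, 0.30, 0.20, 0.10],   # H ≈ 1.85 bits
    [0.70, 0.20, 0.07, 0.03],   # H ≈ 1.24 bits
    [0.97, 0.01, 0.01, 0.01],   # nearly certain: H ≈ 0.24 bits
]
for t, dist in enumerate(snapshots):
    print(f"t={t}: H = {shannon_entropy(dist):.2f} bits")
```

A distribution that becomes more predictable carries less information, which is the sense in which an information entropy can fall even while thermodynamic entropy rises.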
The Planck Scale Misconception
One popular version of the argument compares the Planck length (the smallest meaningful distance in physics, roughly 1.6 × 10⁻³⁵ meters) to a pixel, and the Planck time to a “tick rate” in a video game engine. The idea is intuitive: if reality has a minimum resolution, maybe that’s the grid size of the simulation.
Physicists generally reject this analogy. The Planck length marks the point where our current understanding of quantum physics breaks down, not a hard boundary where space becomes discrete. Nothing in established physics confirms that time or space is actually divided into neat units the way a computer screen is divided into pixels. In video games, any two changes of state must have a whole number of ticks between them. There is no equivalent constraint on the Planck time. Just because it’s currently meaningless to discuss intervals smaller than the Planck time doesn’t mean the universe is quantized at that scale.
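For readers who haven’t worked with game engines, the sketch below shows what a discrete “tick rate” means in practice. It is a generic fixed-timestep loop written for illustration, not code from any real engine: the simulated world can only change state at whole multiples of the timestep, which is exactly the constraint that established physics does not impose at the Planck scale.

```python
# Generic fixed-timestep loop (illustrative, not any specific engine).
TICK_SECONDS = 1 / 60          # the engine's "tick rate": 60 updates per second

def advance(state, ticks):
    """Advance the world by a whole number of ticks. Between ticks,
    nothing can change -- the discreteness the Planck time does NOT impose."""
    for _ in range(ticks):
        state["t"] += TICK_SECONDS
        state["x"] += state["vx"] * TICK_SECONDS   # position jumps in discrete steps
    return state

world = {"t": 0.0, "x": 0.0, "vx": 3.0}            # say, metres and metres per second
print(advance(world, 10))   # exactly 10 ticks later; no in-between states exist
```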
What the Math Says Against It
One of the strongest counterarguments is computational. A 2017 study published in Science Advances by physicists Zohar Ringel and Dmitry Kovrizhin examined whether classical computers could simulate quantum systems. They found that certain quantum phenomena, particularly those involving gravitational responses at the quantum level, create what’s known as a “sign problem.” This makes the computational resources needed to simulate even a small quantum system grow exponentially with the system’s size. Simulating an entire universe’s worth of quantum interactions wouldn’t just be hard. It would require more computing power than the universe itself contains, at least using any approach we currently understand.
This doesn’t definitively kill the theory. A civilization advanced enough to build a universe-scale simulation might use computational methods we can’t imagine. But it does suggest that a simulation of our universe couldn’t be a straightforward brute-force calculation running on anything resembling known technology.
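The sign problem itself is specific to certain quantum Monte Carlo methods, but the flavor of the exponential blow-up can be seen with a simpler, standard illustration (not Ringel and Kovrizhin’s calculation): storing the full state of n qubits on a classical machine takes 2ⁿ complex amplitudes.

```python
# Memory needed to store a full quantum state vector of n qubits:
# 2**n complex amplitudes, here at 16 bytes each (two 64-bit floats).
# This illustrates generic exponential scaling, not the sign problem itself.

BYTES_PER_AMPLITUDE = 16

for n in (30, 50, 100, 300):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * BYTES_PER_AMPLITUDE
    print(f"{n:>3} qubits: 2^{n} amplitudes ≈ {bytes_needed:.3e} bytes")

# 30 qubits already needs ~17 GB; 300 qubits needs more bytes than there
# are atoms in the observable universe (roughly 10^80).
```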
Attempts to Calculate the Odds
Columbia University astronomer David Kipping applied Bayesian statistics to the simulation argument in a 2020 analysis. Bayesian reasoning updates the probability of a hypothesis as new evidence comes in. Kipping found that even under generous assumptions, the probability that we are simulated beings comes out to less than 50%. It approaches 50% only in the extreme scenario where an infinite number of simulations are running, but it never crosses that threshold. In other words, the math slightly favors base reality.
The calculation is sensitive to assumptions, though. If you change your starting estimate of how likely civilizations are to survive and develop simulation technology, the numbers shift. Kipping’s analysis highlights a key frustration with the whole debate: we’re trying to calculate probabilities about a scenario where we have almost no hard data to work with.
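That sensitivity is easy to see in a toy model. The sketch below is a deliberate simplification, not Kipping’s actual Bayesian machinery, and the parameter names and numbers are invented for illustration: the answer tracks the prior you assign to simulators existing almost one-for-one.

```python
def p_simulated(prior_simulators_exist, sims_per_reality):
    """Toy model: if simulation-running civilizations exist, simulated observers
    outnumber real ones by sims_per_reality to 1; if not, nobody is simulated.
    Marginalizing over that uncertainty gives the overall probability."""
    p_if_they_exist = sims_per_reality / (sims_per_reality + 1)
    return prior_simulators_exist * p_if_they_exist

for prior in (0.01, 0.1, 0.5, 0.9):
    print(f"prior that simulators exist = {prior:.2f} "
          f"-> P(we are simulated) ≈ {p_simulated(prior, 1_000_000):.3f}")

# Even with a 50/50 prior and a million simulations per base reality,
# the probability stays just under 0.5 -- and it collapses toward zero
# as the prior shrinks.
```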
Could We Ever Test It?
In 2012, physicists Silas Beane and colleagues at the University of Washington proposed an actual experiment. If the universe is a simulation running on a discrete lattice (a grid-like computational structure), then ultra-high-energy cosmic rays should show subtle patterns reflecting the geometry of that grid. Specifically, the distribution of the highest-energy cosmic rays would exhibit a slight breaking of rotational symmetry, meaning they’d favor certain directions over others in a way that mirrors the underlying lattice structure.
No such pattern has been detected so far. That doesn’t prove we’re not in a simulation, since a sufficiently advanced simulator could use a lattice structure too fine for us to detect, or use a completely different computational architecture. But it represents one of the few concrete proposals for how the question could move from philosophy into experimental science.
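To make “breaking of rotational symmetry” concrete, the sketch below computes a generic dipole statistic over simulated arrival directions. It illustrates the kind of anisotropy measure involved, not the analysis Beane and colleagues proposed, and the 5% bias figure is an arbitrary assumption.

```python
import math
import random

def dipole_amplitude(directions):
    """Length of the mean vector of unit direction vectors.
    Near 0 for an isotropic sky; larger if some axis is preferred,
    as a cubic simulation lattice might imprint."""
    n = len(directions)
    means = [sum(v[i] for v in directions) / n for i in range(3)]
    return math.sqrt(sum(m * m for m in means))

def random_direction(bias_axis=None, bias=0.0):
    """Random unit vector, with an optional toy bias toward one axis."""
    if bias_axis is not None and random.random() < bias:
        return bias_axis
    while True:
        v = [random.gauss(0, 1) for _ in range(3)]
        norm = math.sqrt(sum(x * x for x in v))
        if norm > 1e-9:
            return [x / norm for x in v]

random.seed(0)
iso = [random_direction() for _ in range(10_000)]
skewed = [random_direction(bias_axis=[0.0, 0.0, 1.0], bias=0.05) for _ in range(10_000)]
print(f"isotropic sky: dipole ≈ {dipole_amplitude(iso):.3f}")      # near 0
print(f"5% biased sky: dipole ≈ {dipole_amplitude(skewed):.3f}")   # noticeably larger
```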
An Ancient Idea in New Clothing
The core intuition behind simulation theory is far older than computers. In Plato’s Allegory of the Cave, written around 375 BCE, prisoners chained inside a cave perceive shadows on a wall as their entire reality. They hear echoes and assume the sounds come from the shadows, never realizing that a richer world exists behind them. Mathematician Charles Howard Hinton extended this idea in 1904, arguing that Plato was essentially describing a lower-dimensional projection of a higher-dimensional reality: just as a 3D object casts a 2D shadow, a 4D “hyper-object” could cast a 3D shadow that we experience as our physical world.
Simulation theory updates this framework with modern technology as the metaphor. Instead of shadows on a cave wall, we might be data structures in a program. Instead of chains preventing us from turning around, we might face fundamental limits on what measurements can reveal about the nature of our reality. The philosophical question is the same one Plato asked: how would you know if everything you perceive is a representation rather than the thing itself?
Where the Debate Actually Stands
Simulation theory occupies an unusual space between philosophy, physics, and computer science. It’s not fringe pseudoscience, since its logical structure is taken seriously in academic philosophy, and physicists have engaged with it through peer-reviewed work. But it’s also not a scientific theory in the conventional sense, because it makes very few testable predictions. The computational limits identified by Ringel and Kovrizhin suggest that simulating quantum physics as we know it may be fundamentally impossible with classical computing. The information-theoretic observations from Vopson suggest the universe behaves as if it’s computationally optimized. These two findings point in opposite directions, and neither is conclusive.
For now, simulation theory functions mainly as a thought experiment that forces useful questions about the nature of computation, consciousness, and physical law. Whether or not we live in a simulation, the argument reveals something genuinely interesting: we don’t yet have a firm philosophical basis for ruling it out.

