Who Came Up With the Concept of Zero and Why It Matters

No single person invented zero. The concept emerged independently across several civilizations over thousands of years, evolving from a simple placeholder mark into a full number with its own mathematical properties. The Sumerians of Mesopotamia were the first to use a zero-like symbol around 5,000 years ago, but it was Indian mathematicians who transformed zero into the number we recognize today.

Mesopotamia: The First Placeholder

The earliest evidence of zero comes from the Sumerians in Mesopotamia, roughly 3000 BCE. They used a slanted double wedge inserted between other number symbols to show an empty position, much the way we use zero in a number like 102 to indicate there’s nothing in the tens column. This wasn’t zero as a number in its own right. It was a gap-filler, a way to avoid confusing 12 with 102.
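To see the placeholder's job in modern terms, here is a small Python sketch of positional notation (the function name and digit lists are illustrative, not anything the Sumerians wrote):

```python
def positional_value(digits, base=10):
    """Interpret a list of digits in a positional number system.

    Each position is worth a power of the base; a zero digit marks
    an empty position without contributing any value of its own.
    """
    value = 0
    for d in digits:
        value = value * base + d
    return value

# Without a placeholder, the digit strings for 12 and 102
# would be indistinguishable.
print(positional_value([1, 2]))      # 12
print(positional_value([1, 0, 2]))   # 102
```

The same function works in any base, which is why the placeholder idea transferred so easily from the Babylonians' base-60 system to our base-10 one.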

As this positional number system spread to the Babylonian empire, the symbol evolved into two angled wedges. From Babylon, the idea traveled to the Greeks, who adopted it only sporadically and late. The Romans never used it at all, which is why Roman numerals have no symbol for zero.

The Maya: A Separate Invention

On the other side of the world, Mesoamerican civilizations developed zero completely independently. The oldest known Mesoamerican zero, dating to 31 BCE, appears on a carved stone monument called Stela C at the Olmec site of Tres Zapotes in Veracruz, Mexico. The Maya later used zero extensively in their base-20 counting system and their elaborate Long Count calendar.

Maya zero took several visual forms depending on context: a seed shape for arithmetic, a flower for calendar dates, and sometimes a conch shell or a human head in profile. These symbols appeared across stone sculptures, painted books, and decorated pottery. Like the Babylonian version, Maya zero functioned as a placeholder, but it was woven deeply into a sophisticated mathematical and astronomical tradition.

India: Where Zero Became a Number

The transformation of zero from a mere placeholder into a number you could add, subtract, and calculate with happened in India. This leap was as much philosophical as mathematical. The Sanskrit word for zero, “śūnya,” literally means void or emptiness, and it carried deep meaning in Indian philosophy. Ancient scholars also referred to zero as “pūrṇa,” meaning fullness, reflecting the idea that emptiness and completeness were two sides of the same concept. That philosophical comfort with nothingness as something real and meaningful likely helped Indian thinkers treat zero as a legitimate number rather than just a blank space.

The earliest physical evidence of the zero symbol we use today comes from the Bakhshali manuscript, an ancient mathematical text discovered in 1881 near Peshawar (in present-day Pakistan). Radiocarbon dating conducted by the University of Oxford found that portions of the manuscript date to the 3rd or 4th century CE, roughly 500 years older than scholars had previously believed. The manuscript contains hundreds of zeros, written as dots. Those dots eventually evolved into the small circle we use today.

For a long time, the oldest confirmed zero inscription was thought to be one carved into the wall of the Chaturbhuj Temple in Gwalior, India, dating to 876 CE. The inscription records the dimensions of a garden (187 by 270 hastas) and mentions 50 garlands donated to the temple daily. The final digits of 270 and 50 are written as small circles, the familiar O-shaped zero. While the Bakhshali manuscript is now considered older, the Gwalior inscription remains the earliest known example of zero carved in stone.

The Indian mathematician Brahmagupta, writing in 628 CE, is often credited with the first formal rules for arithmetic involving zero: what happens when you add zero to a number, subtract it, or multiply by it. His work gave zero a defined role in mathematics, not just notation.
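Brahmagupta stated his rules in Sanskrit verse, not symbols, but in modern notation they amount to: for any number a, a + 0 = a, a − 0 = a, and a × 0 = 0. A small Python sketch (the function name is our own) checks them directly:

```python
def follows_brahmagupta_rules(a):
    """Check Brahmagupta's rules for adding, subtracting, and
    multiplying by zero, restated in modern arithmetic."""
    return a + 0 == a and a - 0 == a and a * 0 == 0

# The rules hold for positives, negatives, zero itself, and fractions.
print(all(follows_brahmagupta_rules(a) for a in [7, -3, 0, 2.5]))  # True
```

Brahmagupta also attempted rules for division involving zero, but modern arithmetic leaves division by zero undefined, so that part of his work did not survive in today's conventions.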

From Baghdad to Europe

In the 9th century, the Persian scholar Muhammad ibn Musa al-Khwarizmi brought zero into the mainstream of Islamic mathematics. Working at the House of Wisdom in Baghdad during the Islamic Golden Age, al-Khwarizmi developed an Arabic numeral system that fully incorporated zero, called “sifr” in Arabic (the root of both “zero” and “cipher” in English). He is celebrated as the father of algebra, and his name is the origin of the word “algorithm.”

Al-Khwarizmi’s key work on Indian numerals was translated into Latin in the 12th century, carrying the Indian and Arab mathematical traditions into European scholarship. Arab traders also helped spread the concept through commerce across the Mediterranean.

The person who most directly brought zero to Europe was Leonardo of Pisa, better known as Fibonacci. His book “Liber Abaci,” published in 1202, introduced the Hindu-Arabic decimal system, including zero, to a European audience. Fibonacci had learned the system during travels in North Africa and the Mediterranean. His book was aimed heavily at merchants, showing how the new numerals made everyday calculations far easier than Roman numerals. “Liber Abaci” was widely copied and imitated, and it set Europe on the path to adopting the number system we all use today.

The transition wasn’t instant. European authorities were initially suspicious of the new numerals. Florence banned them from bookkeeping in 1299, partly because the symbols were easy to forge and partly out of general resistance to change. But the practical advantages were overwhelming, and by the Renaissance, Hindu-Arabic numerals, zero included, had become standard.

Why Zero Changed Everything

Zero’s importance goes far beyond filling an empty column. As a number in its own right, it made algebra possible and later became essential to calculus, where the concept of approaching zero (without reaching it) underpins the mathematics of motion, change, and rates of growth. Binary code, the foundation of every computer, relies on just two digits: zero and one. Without zero as a true number, modern computing, physics, and engineering would not exist in their current forms.
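Binary is simply the placeholder principle taken to base 2, with zero doing the same job in every empty position. A short Python illustration (the variable names are our own):

```python
# The same positional principle, in base 2: each bit is a power of
# two, and zero bits mark the empty positions.
n = 42
bits = bin(n)[2:]  # '101010'

# Rebuild the number from its bits to show the positions at work.
reconstructed = sum(int(b) * 2**i for i, b in enumerate(reversed(bits)))
print(bits, reconstructed)  # 101010 42
```

Four of the six positions in 42's binary form hold a zero; without a symbol for "nothing here," the representation would collapse into ambiguity, just as Mesopotamian scribes found thousands of years ago.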

The journey from a Sumerian wedge scratched into clay to the zero on your keyboard took roughly 4,500 years and passed through at least three independent civilizations. No single genius “came up with” zero. It was built in layers: Mesopotamians created the placeholder, Indian philosophers and mathematicians turned it into a number, Persian scholars formalized it in algebra, and European merchants made it universal.