When Did Automation Begin? From Ancient Machines to Robots

Automation, as both a concept and a practice, stretches back far further than most people realize. The word itself was coined in 1946, but humans have been building self-operating machines for nearly two thousand years. The history of automation has no single starting point; it is a series of leaps, from ancient steam-powered toys to programmable factory robots.

Ancient Machines That Moved on Their Own

The earliest known automated devices come from Alexandria, Egypt, around the first century AD. Hero of Alexandria, a mathematician and engineer born around AD 10, built an extraordinary range of mechanical devices that operated without direct human control. His surviving works describe trick jars that dispensed wine or water in set proportions, mechanical birds that sang, trumpets that sounded on their own, and puppets that moved when a fire was lit on an altar. He also designed an entire automated puppet theater, powered by strings, drums, and weights, that could perform a sequence of movements without an operator.

Perhaps most remarkably, Hero built the aeolipile, a hollow sphere mounted on tubes that fed it steam from a boiler. Steam escaped through bent nozzles on the sphere’s equator, causing it to spin. This was the first known device to convert steam into rotary motion. It was a novelty, not a practical engine, but the underlying principle wouldn’t be harnessed for industrial work for another 1,600 years.

The Word “Automation” Is Born

The term itself entered the English language in 1946. Delmar S. Harder, Vice President for Manufacturing at Ford Motor Company, coined it as a nickname for a new production setup at Ford’s Detroit factory. The system linked together a series of automatic machines into one integrated process, moving components and materials between different stages of production without human hands. Harder described automation as “the automatic handling of materials and parts in and out of machines.” Before that moment, people talked about “mechanization” or “automatic machinery,” but Harder’s word captured something new: not just machines doing work, but machines coordinating with each other.

Textile Mills and Self-Acting Machines

The Industrial Revolution produced the first wave of automation that reshaped entire economies. In 1825, British inventor Richard Roberts designed the self-acting spinning mule, a machine that could spin cotton thread with minimal human intervention. Earlier spinning mules required a skilled operator to control the carriage by hand. Roberts’ version automated that motion, reducing the need for experienced spinners and dramatically increasing output. Textile manufacturing became one of the first industries where machines didn’t just assist human workers but actively replaced the skill and judgment those workers had provided.

Around the same time, engineers were developing the first feedback control systems. Christiaan Huygens invented the centrifugal governor in the seventeenth century to regulate windmills and water wheels. In 1788, James Watt adapted the device to control his steam engine. As the engine sped up, spinning weights on the governor rose outward, which mechanically reduced the flow of steam into the cylinders. If the engine slowed, the weights dropped, opening the valve again. This self-correcting loop is a foundational concept in automation: a machine that senses its own output and adjusts its own behavior. No human needed to watch the dial and turn a knob.
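The governor's self-correcting loop is the essence of proportional feedback control, and it can be sketched in a few lines of code. All the constants below are illustrative; they model the principle, not Watt's actual engine.

```python
# Minimal sketch of the governor's feedback loop: the "engine" speeds up
# in proportion to how far the steam valve is open, and the "governor"
# closes the valve as speed rises above the set point. All constants are
# illustrative; they model the principle, not Watt's actual engine.

SET_POINT = 100.0   # desired engine speed (arbitrary units)
GAIN = 0.02         # how strongly the governor reacts to speed error

def step(speed, valve):
    """Advance the engine one time step and let the governor correct the valve."""
    # Engine: more steam means more speed; friction bleeds some speed away.
    speed = speed + 0.5 * valve - 0.1 * speed
    # Governor: spinning weights rise with speed, throttling the valve.
    error = speed - SET_POINT
    valve = max(0.0, min(100.0, valve - GAIN * error))
    return speed, valve

speed, valve = 0.0, 100.0  # cold start, valve wide open
for _ in range(500):
    speed, valve = step(speed, valve)

# The loop settles at the set point with no operator watching a dial.
print(round(speed, 1))  # → 100.0
```

The engine overshoots at first, then spirals in toward the set point, exactly the behavior of spinning governor weights hunting for equilibrium.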

Programmable Machines Arrive

The mid-twentieth century brought a shift from mechanical automation to programmable automation, where machines could be told what to do through coded instructions rather than physical gears and cams. In September 1952, MIT’s Servomechanisms Laboratory demonstrated the first numerically controlled milling machine. Instead of a machinist guiding the cutting tool by hand, the machine followed instructions encoded on punched tape. The lab’s Computer Application Group later developed the Automatically Programmed Tool Language (APT), a special-purpose programming language that made it relatively easy to write instructions for these machines. By the late 1950s, APT had become the world standard for programming computer-controlled machine tools.
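The core idea of numerical control, a tool following coded instructions rather than a hand on a wheel, can be sketched as a tiny interpreter. The three-field instruction format below is invented for illustration; it is not real APT or punched-tape encoding, just the principle of a machine executing coded steps.

```python
# Toy interpreter for punched-tape-style tool instructions. The
# (command, x, y) format is invented for illustration; it is not APT.

def run_tape(tape):
    """Trace the tool path encoded on the 'tape', returning visited points."""
    x, y = 0.0, 0.0
    path = [(x, y)]
    for line in tape.strip().splitlines():
        cmd, dx, dy = line.split()
        if cmd == "MOVE":          # reposition the cutting tool
            x += float(dx)
            y += float(dy)
            path.append((x, y))
        elif cmd == "HOME":        # return to the origin
            x, y = 0.0, 0.0
            path.append((x, y))
    return path

tape = """
MOVE 10 0
MOVE 0 5
MOVE -10 0
HOME 0 0
"""
print(run_tape(tape))
# → [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0), (0.0, 5.0), (0.0, 0.0)]
```

The point is that changing the part being cut means changing the tape, not retraining a machinist's hands, which is what made numerical control such a break from manual machining.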

That same year, 1952, engineer John Diebold published “Automation: The Advent of the Automatic Factory,” a book that helped define public understanding of what automation would mean for industry and labor. Diebold argued that automation wasn’t simply faster mechanization but a fundamentally different way of organizing production.

The First Industrial Robot

In 1959, the 2,700-pound Unimate #001 prototype was installed on an assembly line at a General Motors diecasting plant in Trenton, New Jersey. It was the first industrial robot ever put to work in a factory. The Unimate handled hot metal parts fresh from the casting process, a dangerous job that exposed human workers to extreme heat and heavy loads. The robot’s success at GM proved that programmable machines could handle real production tasks reliably enough to justify their cost, and it opened the door to robotic automation across manufacturing.

Programmable Logic Controllers Replace Wiring

Before 1968, industrial processes were controlled by hard-wired relay systems. If you wanted to change what a factory line did, you had to physically rewire the control panels, a process that was expensive, slow, and error-prone. That year, Dick Morley and his colleagues at Bedford Associates built the Modicon 084, the first programmable logic controller (PLC). Instead of rewiring relays, engineers could now reprogram the controller’s software to change how machines behaved. The group formed a company called Modicon (short for Modular Digital Controller) to sell their invention. PLCs became the backbone of factory automation worldwide, and their descendants still control everything from bottling lines to power plants today.
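The shift the PLC introduced can be illustrated with a sketch: where a relay panel hard-wires one fixed behavior, a PLC-style scan loop evaluates whatever logic it has been given. The "rung" format below is a loose stand-in for ladder logic, not the Modicon 084's actual programming model.

```python
# Sketch of a PLC-style scan cycle: read inputs, evaluate the programmed
# logic, write outputs. The "rungs" are a loose stand-in for ladder logic,
# not the Modicon 084's actual programming model. Changing the machine's
# behavior means editing this list, not rewiring a relay panel.

# Each rung: (output_name, function of the input snapshot -> bool)
rungs = [
    ("conveyor", lambda inp: inp["start"] and not inp["estop"]),
    ("filler",   lambda inp: inp["start"] and inp["bottle_present"]
                             and not inp["estop"]),
    ("alarm",    lambda inp: inp["estop"]),
]

def scan(inputs):
    """One scan cycle: evaluate every rung against a snapshot of the inputs."""
    return {out: logic(inputs) for out, logic in rungs}

state = {"start": True, "estop": False, "bottle_present": True}
print(scan(state))
# → {'conveyor': True, 'filler': True, 'alarm': False}
```

A real controller repeats this scan continuously, thousands of times per second, but the contrast with hard-wired relays is already visible: the logic lives in editable instructions, not in copper.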

Early Steps Toward Intelligent Automation

By the early 1960s, researchers were already experimenting with machines that could learn from data rather than follow fixed instructions. Raytheon Company developed an experimental “learning machine” called Cybertron, which used punched tape memory and rudimentary reinforcement learning to analyze sonar signals, heart rhythms, and speech patterns. It was far from the machine learning systems now embedded in manufacturing and logistics, but it represented one of the first practical attempts to make automated systems adaptive rather than purely repetitive.

The trajectory from Hero’s spinning sphere to modern AI-guided factories spans roughly two millennia, but the acceleration is striking. It took over 1,700 years to get from the aeolipile to Watt’s governor, just 170 more from the governor to the first industrial robot, and only nine from the Unimate to the programmable logic controller that made flexible factory automation practical. Each leap made the next one faster, building on the same core idea Hero demonstrated in Alexandria: machines that act on their own.