The first integrated circuits were used in military and aerospace systems, specifically in missile guidance computers and space navigation equipment. The U.S. military was the earliest and largest customer for integrated circuits in the late 1950s and early 1960s, funding their development and buying them in bulk before any commercial or consumer applications existed.
The First Working Prototype
Jack Kilby at Texas Instruments built the first working integrated circuit in 1958. Rather than wiring together separate electronic components on a board, Kilby combined a transistor, a capacitor, and the equivalent of three resistors on a single piece of germanium, a semiconducting material. That thin rectangular slice of germanium, roughly the size of a pencil eraser, proved that an entire circuit could exist on one chip. The device is now in the collection of the National Museum of American History.
Meanwhile, Robert Noyce at Fairchild Semiconductor independently developed a different approach using silicon and a planar manufacturing process that turned out to be easier to mass-produce. Fairchild provided prototype samples to customers as early as 1960 and publicly announced its first product line, branded µLogic (Micrologic), at a press conference in March 1961. The initial offering was a flip-flop circuit, followed later that year by a gate function, a half adder, and a half shift register. These were the first integrated circuits anyone could actually buy.
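To make those product names concrete: a half adder takes two one-bit inputs and produces a sum bit and a carry bit, the smallest unit of binary arithmetic. Here is a minimal Python sketch of that behavior (purely illustrative; the actual Micrologic parts implemented this in resistor-transistor hardware, not software):

```python
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Combine two one-bit inputs into (sum, carry).

    sum is 1 when exactly one input is 1 (XOR);
    carry is 1 only when both inputs are 1 (AND).
    """
    return a ^ b, a & b

# Truth table: the complete behavior of a half adder.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"a={a} b={b} -> sum={s} carry={c}")
```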
The price reflected how new and difficult the technology was. A single integrated circuit cost roughly $450 in 1961, the equivalent of several thousand dollars today.
Missile Guidance: The First Real Application
In 1962, Texas Instruments won a contract to design 22 custom integrated circuits for the guidance system of the Minuteman II intercontinental ballistic missile. This was the first major application of integrated circuits inside a working computer system. The Minuteman program gave chipmakers something no commercial customer could offer at the time: guaranteed high-volume orders at prices that justified scaling up production. The military needed circuits that were smaller, lighter, and more reliable than traditional wiring, and it was willing to pay a premium to get them.
That military demand did more than just fund early chip production. It pushed manufacturers to solve quality and consistency problems that would have taken years to work out otherwise. The defense budget essentially subsidized the learning curve for an entire industry.
The Apollo Guidance Computer
NASA’s Apollo program became the other massive early buyer. The Apollo Guidance Computer, which navigated spacecraft to the moon and back, relied on thousands of integrated circuits. Each chip contained a dual three-input NOR gate, a simple logic building block. By wiring thousands of these identical circuits together, engineers at MIT’s Instrumentation Laboratory built a computer small and light enough to fly in a spacecraft, something impossible with older technology.
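One gate type sufficed because NOR is functionally complete: NOT, OR, and AND can all be built from NOR gates alone, so any logic the computer needed could be expressed in that single part. A short Python sketch of the idea (illustrative only; the AGC’s gates were hardwired circuits, with unused inputs tied to logic 0):

```python
def nor3(a: int, b: int, c: int = 0) -> int:
    """Three-input NOR: output is 1 only when every input is 0."""
    return 0 if (a or b or c) else 1

# NOR is functionally complete: every other gate reduces to it.
def not_(a):     return nor3(a, a, a)           # NOT from one NOR
def or_(a, b):   return not_(nor3(a, b))        # OR = NOT(NOR)
def and_(a, b):  return nor3(not_(a), not_(b))  # AND via De Morgan

assert not_(1) == 0 and not_(0) == 1
assert or_(1, 0) == 1 and or_(0, 0) == 0
assert and_(1, 1) == 1 and and_(1, 0) == 0
```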
The chips were designed by Fairchild Semiconductor and manufactured under license by Philco, based near Philadelphia. By the time astronauts flew Apollo 7 in October 1968, the six-device circuit specified for the computer was already considered obsolete by industry standards. The chip world was moving that fast. But the Apollo computer worked, and it demonstrated that integrated circuits could handle life-or-death computing tasks reliably.
Early Consumer Products
While the military and NASA consumed most of the world’s integrated circuit supply in the early 1960s, a few consumer products appeared surprisingly quickly. In 1964, Zenith released the Arcadia, the first hearing aid built with an integrated circuit. The chip packed six transistors into a space small enough to fit inside an ear-level device, replacing a tangle of discrete components. It ran on a single small battery. The Arcadia was similar to Zenith’s existing Delegate model but represented a genuine leap in miniaturization.
These early consumer uses were limited because the chips were still expensive and production volumes were small. Most manufacturers couldn’t justify the cost for products that worked fine with conventional components.
Computing Takes a Hybrid Path
You might expect that mainframe computers would have been early adopters, but the reality was more complicated. When IBM announced its landmark System/360 computer family in April 1964, it used Solid Logic Technology, or SLT: a hybrid approach in which individual transistor and diode chips were mounted on small ceramic modules alongside printed resistors, rather than a true monolithic integrated circuit. IBM’s Components Division had developed SLT as a middle step between discrete transistors and full integration.
Even before System/360 started shipping, IBM faced public criticism that SLT was already outdated compared to monolithic integrated circuits. The company diverted significant resources toward developing true integrated circuits in response, which actually disrupted its own SLT production effort. It took several more years before fully integrated circuits became standard in commercial computing.
The Handheld Calculator Breakthrough
By the late 1960s, integrated circuits had become capable enough and affordable enough to enable entirely new product categories. In 1967, Texas Instruments built the Cal-Tech, a prototype of the first battery-powered handheld calculator. The device used just four integrated circuits, each fabricated across what was essentially an entire silicon wafer and operating as a single large circuit. It could add, subtract, multiply, and divide, fitting capabilities that once required a desk-sized machine into something you could hold in one hand.
The Cal-Tech never went into mass production itself, but it proved the concept. Within a few years, pocket calculators flooded the market and became one of the first truly mainstream consumer products built on integrated circuit technology. From a unit cost of $450 in 1961, the price of a chip eventually dropped to fractions of a cent, making the explosion of modern electronics possible.