Electrical engineers have invented many of the technologies that define modern life, from the power grid that lights your home to the smartphone in your pocket. Their work spans power generation, computing, communications, medicine, and lighting. Here’s a look at the most transformative inventions to come from the field.
The AC Power Grid
Before anything electronic could reach the masses, someone had to figure out how to deliver electricity over long distances. Thomas Edison developed direct current (DC) systems, which worked but couldn’t efficiently travel far from a power station. Nikola Tesla solved this problem with alternating current (AC), a system in which the current periodically reverses direction; in the U.S., it completes 60 full cycles every second. The critical advantage: AC voltage can be stepped up or down using a transformer, so power plants can send high-voltage electricity across hundreds of miles with minimal loss, then reduce it to safe levels at your home.
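The math behind that advantage is worth a quick sketch. Line loss scales with the square of the current, and for a fixed amount of delivered power, raising the voltage lowers the current proportionally. The load and line-resistance numbers below are illustrative, not historical:

```python
# Why high-voltage transmission wins: line loss is I^2 * R, and for
# a fixed delivered power P, the current is I = P / V. Ten times the
# voltage means one hundred times less loss.
# The 10 MW load and 5-ohm line are assumed example values.

P = 10e6   # power delivered, watts
R = 5.0    # total line resistance, ohms

for V in (10e3, 100e3):       # transmission voltage, volts
    I = P / V                 # current the line must carry
    loss = I ** 2 * R         # power burned off as heat in the line
    print(f"{V / 1e3:>5.0f} kV: {loss / 1e3:>7,.0f} kW lost "
          f"({100 * loss / P:.1f}% of transmitted power)")
```

At 10 kV this toy line wastes half the power it carries; at 100 kV, half a percent. That hundredfold difference is why transformers made long-distance grids possible.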
George Westinghouse licensed Tesla’s polyphase AC induction motor patent and won the contract to generate power from Niagara Falls, proving the system could work at scale. That same AC infrastructure still forms the backbone of electrical grids worldwide.
The Transistor
The transistor is arguably the single most important invention in electrical engineering. In 1947, John Bardeen and Walter Brattain at Bell Labs demonstrated the first semiconductor amplifier: a point-contact transistor made from a sliver of germanium with two closely spaced gold contacts held in place by a plastic wedge. A small change in current through one contact caused a larger change in the other, amplifying the input signal up to 100 times. William Shockley, who had directed the research, followed up in 1948 by inventing the more robust junction transistor. All three shared the 1956 Nobel Prize in Physics.
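The amplifying action can be sketched as simple current gain: a small input current controls a proportionally larger output current. A minimal sketch, where the gain of 100 echoes the roughly 100x amplification of the 1947 device and the input current is an arbitrary illustrative value:

```python
# A transistor as a current amplifier: a small base (input) current
# controls a proportionally larger collector (output) current.
# beta = 100 echoes the ~100x gain reported for the first device;
# the 50-microamp input is an assumed illustrative figure.

beta = 100          # current gain (illustrative)
i_base = 50e-6      # input signal, amps (assumed)

i_collector = beta * i_base
print(f"input: {i_base * 1e6:.0f} uA -> output: {i_collector * 1e3:.1f} mA")
```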
Transistors replaced vacuum tubes, which were bulky, fragile, and power-hungry. Early transistors were actually costlier than vacuum tubes, but they were small enough to make portable, battery-powered electronics practical for the first time. Radios shrank from furniture-sized cabinets to pocket devices. And the transistor’s real legacy was still ahead: it became the building block of every computer chip ever made.
The Integrated Circuit
By the late 1950s, transistors were everywhere, but wiring thousands of individual components together by hand was slow, expensive, and unreliable. Jack Kilby, an engineer at Texas Instruments, had a simple but radical idea: since his company already manufactured transistors, resistors, and capacitors from semiconductor material, why not build all of them on a single chip? On July 24, 1958, he described what he called “The Monolithic Idea” in his lab notebook. By early 1959, his team had a working flip-flop circuit built entirely from scratch on germanium, using bulk resistors, junction capacitors, and mesa transistors.
Shortly after, Robert Noyce at Fairchild Semiconductor independently developed a version using the planar process with metal leads over an oxide layer, which proved easier to manufacture. The two approaches merged into what we now call the microchip. Aerospace and defense programs like Apollo and the Minuteman missile were the first major adopters in the 1960s, lending credibility to the technology. Kilby’s team at TI also developed the first handheld electronic calculator to demonstrate the chip’s commercial potential.
The scaling since then is staggering. Intel’s 4004 processor in 1971 held 2,250 transistors on a 12 square millimeter chip. Apple’s A17 chip, released in 2023, packs 19 billion transistors onto roughly 104 square millimeters using a 3-nanometer manufacturing process. That’s a density increase of nearly a millionfold.
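Those figures check out with a little arithmetic on transistors per square millimeter:

```python
# Verifying the density comparison using the figures quoted above.

t_4004, area_4004 = 2_250, 12.0    # Intel 4004: transistors, mm^2 (1971)
t_a17, area_a17 = 19e9, 104.0      # Apple A17: transistors, mm^2 (2023)

d_4004 = t_4004 / area_4004        # on the order of 200 per mm^2
d_a17 = t_a17 / area_a17           # on the order of 200 million per mm^2

print(f"density ratio: {d_a17 / d_4004:,.0f}x")
```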
The LED
In 1962, Nick Holonyak Jr. invented the first visible-spectrum LED at General Electric, producing red light from a gallium arsenide phosphide semiconductor. The key insight was that by adjusting the composition of certain semiconductor alloys, engineers could “tune” the material to emit light at specific wavelengths. This opened the door to LEDs in every color, eventually including the blue LEDs (developed in the 1990s) that made white LED lighting possible.
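The "tuning" works because an LED's color is set by the semiconductor's bandgap energy: the emitted wavelength is roughly Planck's constant times the speed of light, divided by the gap. The bandgap values below are typical textbook figures for common LED materials, used here for illustration:

```python
# An LED's color comes from its bandgap: emitted wavelength is
# approximately h * c / E_gap. Bandgap energies are typical
# textbook values for common LED materials (illustrative).

H = 6.626e-34     # Planck's constant, J*s
C = 3.0e8         # speed of light, m/s
EV = 1.602e-19    # joules per electron-volt

for material, gap_ev in [("GaAsP (red)", 1.90),
                         ("GaP (green)", 2.26),
                         ("InGaN (blue)", 2.76)]:
    wavelength_nm = H * C / (gap_ev * EV) * 1e9
    print(f"{material:>13}: ~{wavelength_nm:.0f} nm")
```

A wider gap means a shorter wavelength, which is why the blue LED required new, harder-to-grow materials and took three more decades.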
LEDs now dominate everything from traffic signals and TV screens to home lighting. They use a fraction of the energy of incandescent bulbs and last tens of thousands of hours longer.
Spread Spectrum Communication
The technology behind WiFi and Bluetooth traces back to an unlikely pair: actress Hedy Lamarr and composer George Antheil. During World War II, they patented frequency hopping, a method where a radio signal rapidly switches between different frequency bands. The original purpose was to prevent enemies from jamming radio-controlled torpedoes. A signal that hops unpredictably across frequencies is far harder to intercept or disrupt than one sitting on a single channel.
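The core idea fits in a few lines: both radios derive the same pseudo-random channel sequence from a shared seed, so the receiver always knows where to listen while an eavesdropper camped on any single channel hears only fragments. A minimal sketch, where the 79-channel count mirrors classic Bluetooth and the seed is arbitrary:

```python
import random

# Frequency hopping in miniature: transmitter and receiver seed the
# same pseudo-random generator, so they agree on every hop while an
# observer on any one channel sees only brief fragments.
# 79 channels mirrors classic Bluetooth; the seed is arbitrary.

CHANNELS = 79
SEED = 1234       # shared secret between transmitter and receiver

def hop_sequence(seed, n):
    rng = random.Random(seed)
    return [rng.randrange(CHANNELS) for _ in range(n)]

tx = hop_sequence(SEED, 10)   # transmitter's channel schedule
rx = hop_sequence(SEED, 10)   # receiver's channel schedule
assert tx == rx               # both ends agree on every hop
print("hop sequence:", tx)
```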
The concept sat largely unused until the 1980s, when researchers developed techniques to handle collisions (two transmitters hitting the same frequency at the same time) and built systems that performed well even when noise and interference patterns were unpredictable. Spread spectrum technology became the foundation for modern wireless standards, enabling the WiFi networks, Bluetooth connections, and cellular signals you rely on every day.
Fiber Optic Communication
Before Charles Kao’s work in the 1960s, glass fibers were widely dismissed as useless for carrying information because light scattered too quickly inside them, losing its signal within short distances. Kao realized the problem wasn’t glass itself but impurities in the glass. He proposed that carefully purified glass fibers could carry enormous amounts of data over long distances with minimal signal loss, replacing copper wires for telecommunications.
That prediction proved correct. Today, undersea fiber optic cables carry over 95% of intercontinental internet traffic. A single fiber thinner than a human hair can transmit terabits of data per second. Kao received the 2009 Nobel Prize in Physics for his work.
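Attenuation shows why purity mattered so much. Fiber loss is quoted in decibels per kilometer, and the fraction of light surviving a run falls off as 10 to the power of minus one-tenth the total decibels. Comparing typical published figures, roughly 1,000 dB/km for 1960s optical glass versus roughly 0.2 dB/km for modern fiber:

```python
# Fiber loss is quoted in dB/km; the fraction of light surviving a
# run is 10 ** (-total_dB / 10). Loss figures are typical published
# values: ~1,000 dB/km for 1960s optical glass, ~0.2 dB/km today.

def fraction_remaining(db_per_km, km):
    return 10 ** (-(db_per_km * km) / 10)

# 20 m of 1960s glass loses as much light as 100 km of modern fiber
print(f"1960s glass over 20 m:    {fraction_remaining(1000, 0.02):.0%}")
print(f"modern fiber over 100 km: {fraction_remaining(0.2, 100):.0%}")
```

In other words, a signal that once died within about 20 meters of glass can now cross 100 km of fiber with the same fraction of light intact.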
The Implantable Pacemaker
Canadian engineer John Hopps discovered in the 1940s and 1950s that an electrical impulse could cause the heart to contract, and that repeated pulses could keep it beating at a steady rhythm. His first prototype used vacuum tubes and ran on household current, meaning it was far too large to implant. The breakthrough came when transistors replaced vacuum tubes. Swedish engineer Rune Elmqvist built the first implantable pacemaker using just two transistors, making the device small enough to fit inside the body. Surgeon Åke Senning implanted it in 1958.
Modern pacemakers weigh about an ounce, last over a decade on a single battery, and can automatically adjust heart rate based on physical activity. They exist because electrical engineers miniaturized the same transistor technology that was simultaneously transforming computing.
Solar Cells
The modern solar cell was born in 1954 at Bell Labs, where Daryl Chapin, Calvin Fuller, and Gerald Pearson developed the first silicon photovoltaic cell capable of powering everyday electrical equipment. Their initial version converted 4% of sunlight into electricity; they later pushed it to 11%. Hoffman Electronics drove efficiency to 14% by 1960 using innovations like grid contacts that reduced electrical resistance.
Progress continued steadily. The University of New South Wales broke 20% efficiency for silicon cells in 1985. By 1994, researchers at the National Renewable Energy Laboratory created a cell combining gallium indium phosphide and gallium arsenide that exceeded 30% efficiency. In 1999, a three-layer design from Spectrolab and NREL reached 32.3%. Each jump came from electrical engineers finding new semiconductor materials and cell architectures that captured more of the solar spectrum. Today, solar power is one of the cheapest sources of new electricity generation in much of the world.
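Efficiency translates directly into watts: output is simply incoming sunlight power times cell efficiency. A rough sketch, assuming the standard test irradiance of 1,000 W/m² and a typical modern panel area of 1.7 m² (both assumptions, applied to the historical efficiency milestones purely for comparison):

```python
# What efficiency means in watts: output = irradiance * area * efficiency.
# 1,000 W/m^2 is the standard test irradiance; the 1.7 m^2 panel area
# is a typical modern module size (an assumption, applied to the
# historical efficiency milestones for comparison).

IRRADIANCE = 1000.0   # W/m^2, standard test condition
AREA = 1.7            # m^2, assumed panel size

for year, eff in [(1954, 0.04), (1960, 0.14), (1999, 0.323)]:
    watts = IRRADIANCE * AREA * eff
    print(f"{year}: {eff:.0%} efficiency -> {watts:.0f} W from one panel")
```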

