The last 100 years produced an extraordinary wave of inventions that reshaped daily life, from antibiotics and jet engines to the internet and gene editing. Many of the technologies you rely on every day, including GPS, microwave ovens, and LED lighting, didn’t exist before the 1930s. Here’s a look at the most significant inventions from roughly the 1920s to today, organized by the areas of life they transformed.
Penicillin and Modern Antibiotics
Before 1928, a simple infected cut could kill you. That year, Alexander Fleming noticed mold killing bacteria in a petri dish at his London laboratory, a chance observation that changed the course of medicine. But turning that observation into a usable drug took more than a decade: a team at Oxford led by Howard Florey and Ernst Chain didn't purify penicillin and test it on patients until the early 1940s, and by the end of 1942 there was only enough penicillin in existence to treat fewer than 100 people.
An unprecedented collaboration between the United States and Great Britain scaled up production rapidly. By September 1943, supply was sufficient to treat wounded soldiers across the Allied armed forces. Commercial sales to the general public began in January 1946. Penicillin opened the door to an entire class of antibiotic drugs that made surgeries, childbirth, and wound care dramatically safer.
The Transistor: Foundation of All Modern Electronics
In December 1947, physicists John Bardeen and Walter Brattain, working in William Shockley's solid-state physics group at Bell Labs, created the first working transistor, a tiny semiconductor device that could amplify electrical signals. Their goal was to replace the vacuum tubes and electromechanical switches used in the Bell Telephone System, which were bulky, fragile, and unreliable. Bell Labs announced the device publicly at a press conference in New York in June 1948.
It’s difficult to overstate what this single invention made possible. Vacuum tubes were the size of a thumb and burned hot. Transistors could be made microscopically small and used almost no power. Without them, there would be no smartphones, no laptops, no modern cars, no internet. Every piece of digital technology you touch traces its lineage back to that 1947 lab in New Jersey.
Solar Cells and Renewable Energy
The ability to turn sunlight directly into electricity became practical in 1954, when Daryl Chapin, Calvin Fuller, and Gerald Pearson at Bell Labs built the first practical silicon solar cell. That initial cell converted just 4% of incident sunlight into electricity, though the team later pushed efficiency to 11%. At the time, solar cells were far too expensive for everyday use and found their first real application powering satellites in space.
Costs dropped steadily over the following decades, and today solar panels routinely exceed 20% efficiency at a fraction of their original price. What started as a laboratory curiosity is now one of the fastest-growing energy sources on Earth.
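Those efficiency figures translate directly into power output. A quick illustration (the 1,000 W/m² figure is the standard test-condition irradiance for rating panels, an assumption added here, not a number from the article):

```python
# Rough illustration: panel output = irradiance x area x efficiency.
# 1,000 W/m^2 is the standard test-condition irradiance used to rate
# panels (an assumption for this sketch, not a figure from the article).
IRRADIANCE_W_PER_M2 = 1000.0

def panel_output_watts(area_m2: float, efficiency: float) -> float:
    """Electrical output of a panel with the given area and efficiency."""
    return IRRADIANCE_W_PER_M2 * area_m2 * efficiency

# One square meter at the 1954 cell's 4% vs. a modern 20% panel:
print(panel_output_watts(1.0, 0.04))  # 40 W
print(panel_output_watts(1.0, 0.20))  # 200 W
```

The fivefold jump in efficiency, combined with the price collapse, is what moved solar from satellites to rooftops.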
The Jet Engine and Modern Air Travel
Frank Whittle in England and Hans von Ohain in Germany independently developed the first working jet engines within months of each other in 1937. Whittle's turbojet ran successfully in April, and von Ohain's followed in September. Both designs abandoned the propeller-driven piston engines that had powered aircraft since the Wright brothers, generating thrust from a high-speed exhaust jet instead.
Military jets flew in World War II, but the real transformation came in 1952, when the de Havilland Comet flew the first commercial jet route from London to Johannesburg. Jet engines cut travel times in half and eventually made international flights affordable for ordinary people. The global tourism and business travel industries exist largely because of this invention.
The Microwave Oven
Percy Spencer, an engineer at Raytheon, stumbled onto microwave cooking while working with radar equipment during World War II, when he noticed that a chocolate bar in his pocket had melted. Spencer and Raytheon quickly pivoted to developing the technology for cooking, filing a patent on October 8, 1945. The first commercial model, called the "Radarange," hit the market in 1947, though it was roughly the size of a refrigerator and far too expensive for home kitchens.
Smaller, affordable countertop versions arrived in the late 1960s, and by the 1980s the microwave oven had become a standard kitchen appliance in most households. It remains one of the clearest examples of military technology finding its way into everyday life.
Satellites and the Space Age
On October 4, 1957, the Soviet Union launched Sputnik I, the first artificial satellite to orbit Earth. It was a small, polished metal sphere that did little more than transmit radio signals, but its beeping signal overhead triggered the space race and reshaped geopolitics. Within a few years, satellites were relaying television broadcasts, tracking weather systems, and enabling long-distance communication.
Satellites also made the Global Positioning System possible. GPS was originally a military project, and during the 1990s civilian signals were intentionally degraded under a policy called Selective Availability, making readings inaccurate by as much as 100 meters. In May 2000, President Clinton ordered the Department of Defense to switch the degradation off. Overnight, GPS became accurate enough for turn-by-turn navigation, ride-sharing apps, delivery logistics, and precision agriculture.
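A GPS receiver finds its position by measuring its distance to several satellites at once, a technique called trilateration. A minimal two-dimensional sketch of the idea (the coordinates are invented, and a real receiver solves the 3D version while also estimating its own clock error):

```python
# Simplified 2D trilateration: find the point whose distances to three
# known anchor points match the measured ranges. Real GPS works in 3D
# and also solves for the receiver's clock error, but the idea is the same.
def trilaterate(p1, p2, p3, d1, d2, d3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise leaves a linear 2x2 system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A receiver at (3, 4), with exact ranges to three anchors:
print(trilaterate((0, 0), (10, 0), (0, 10), 5.0, 65**0.5, 45**0.5))
```

Selective Availability worked by corrupting the measured ranges; once the ranges were honest, the solved position snapped from 100-meter uncertainty to a few meters.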
LEDs: A New Kind of Light
In 1962, Nick Holonyak Jr. created the first practical visible-light LED while working at General Electric. Early LEDs could only produce dim red light and were used mainly as indicator lights on electronics. Over the following decades, researchers developed LEDs in other colors, and the critical breakthrough came in the early 1990s, when Isamu Akasaki, Hiroshi Amano, and Shuji Nakamura developed efficient blue LEDs, the missing ingredient for the white LED lighting now used in homes and streets.
LEDs use a fraction of the energy of incandescent bulbs and last tens of thousands of hours. They now light homes, stadiums, phone screens, and traffic signals worldwide, and they’ve become the dominant lighting technology in less than two decades.
MRI Scanners
The first MRI exam on a living human patient took place on July 3, 1977. The technology grew out of physics research stretching back to the 1930s, when scientists first developed methods for measuring the magnetic properties of atomic nuclei. In 1946, Felix Bloch and Edward Purcell independently figured out how to study these properties in solid and liquid materials rather than individual atoms, and that work eventually led to the idea of building images from magnetic signals.
Researchers produced the first crude images in the early 1970s, and by the 1980s MRI machines were entering hospitals. Unlike X-rays or CT scans, MRI uses no radiation. It produces detailed images of soft tissues like the brain, spinal cord, and joints, making it essential for diagnosing everything from torn ligaments to brain tumors.
The World Wide Web
The internet’s underlying network existed before the web, but the system you actually use, with clickable links, web addresses, and browsers, was proposed by Tim Berners-Lee at the European physics laboratory CERN in 1989 and built by the end of 1990. Working with Robert Cailliau, Berners-Lee wrote the first web browser and the first web server, and defined the core technologies that still power the web today: URLs, HTTP, and HTML.
The system was released to the research community in 1991 and opened to the general public shortly after. Within a few years, it transformed commerce, communication, entertainment, and access to information on a scale no previous invention had matched. The web didn’t just create new industries. It fundamentally changed how people learn, work, socialize, and consume media.
Lithium-Ion Batteries
Sony introduced the world’s first commercial lithium-ion battery in 1991. Compared to the rechargeable batteries available at the time, lithium-ion cells stored significantly more energy for their weight and could be recharged hundreds of times without losing much capacity. They initially powered camcorders and portable electronics.
Today, lithium-ion batteries are the energy source behind smartphones, laptops, electric vehicles, and grid-scale energy storage. Without them, the portable electronics revolution of the 2000s and the current shift toward electric transportation simply wouldn’t have been possible.
CRISPR Gene Editing
In June 2012, Jennifer Doudna and Emmanuelle Charpentier published a paper describing a method for precisely editing DNA, using a natural bacterial defense system called CRISPR-Cas9. Within roughly six months, six separate research teams had used the technique to edit genes in living human and animal cells.
CRISPR works like molecular scissors, allowing scientists to cut a specific section of DNA and either remove it, replace it, or insert new genetic material. It’s faster, cheaper, and more accurate than any previous gene-editing method. The technology has already led to approved therapies for sickle cell disease and is being explored for conditions ranging from certain cancers to inherited blindness. It represents one of the most significant biological tools ever developed, and its full impact is still unfolding.
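The cut, remove, replace, and insert operations map naturally onto string editing, which makes a toy model easy to write. A sketch of that analogy (the guide and genome sequences are invented for illustration; real Cas9 targeting depends on guide RNA, PAM sites, and cellular repair machinery, none of which this models):

```python
# Toy model of CRISPR-style editing as string surgery on a DNA sequence.
# The `guide` locates the target site, loosely analogous to the guide RNA
# that directs Cas9; all sequences here are invented for illustration.
def edit_dna(genome: str, guide: str, replacement: str = "") -> str:
    """Cut out the first occurrence of `guide` and splice in `replacement`.

    replacement="" models a deletion (knocking out the targeted section);
    a non-empty replacement models replacing or inserting genetic material.
    """
    site = genome.find(guide)
    if site == -1:
        raise ValueError("target sequence not found")
    return genome[:site] + replacement + genome[site + len(guide):]

genome = "ATGGCCTTAGGCAT"
print(edit_dna(genome, "TTAG"))          # deletion
print(edit_dna(genome, "TTAG", "CCCC"))  # replacement
```

The hard part of real gene editing is not the splice itself but finding the one intended site among billions of base pairs without cutting anywhere else, which is exactly what made CRISPR's precision such a breakthrough.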

