Technology doesn’t double at a single universal rate. Different technologies follow their own exponential curves, and the most famous of these predictions have shifted significantly in recent years. Computing power historically doubled every two years, but that pace has slowed to roughly every three years. Internet bandwidth grows at about 50% per year. DNA sequencing costs have plummeted faster than any of them. Here’s how the major doubling rates break down and where they stand now.
Computing Power: From 2 Years to 3
Moore’s Law is the benchmark most people think of when they ask how fast technology doubles. Gordon Moore observed in 1965 that the number of transistors on a chip doubled approximately every two years, and that pattern held remarkably well for decades. It drove the explosion of personal computers, smartphones, and cloud computing.
That pace has slowed. Former Intel CEO Pat Gelsinger stated at the end of 2023 that “we’re no longer in the golden era of Moore’s Law” and that effective doubling now happens closer to every three years. Semiconductor advancement across the industry has lagged behind Moore’s original prediction since around 2010, though leading manufacturers like TSMC and Samsung have managed to keep pushing to smaller chip designs. The next frontier, 2nm chips, is expected to enter mass production around 2027 or 2028, with 1.4nm targeted for 2029. At these tiny scales, the transistor structures are fundamentally more complex to build, and getting manufacturing yields high enough to be cost-effective is the single hardest challenge chipmakers face.
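The gap between a two-year and a three-year doubling compounds quickly over a decade. A minimal sketch of that arithmetic (the helper function is just for illustration):

```python
def growth_over(years, doubling_time_years):
    """Capability multiplier after `years` at a given doubling time (both in years)."""
    return 2 ** (years / doubling_time_years)

# A decade of transistor scaling at the classic pace vs. today's slower one.
decade_at_2yr = growth_over(10, 2)   # 32x
decade_at_3yr = growth_over(10, 3)   # ~10x
```

Stretching the doubling interval by just one year cuts a decade's cumulative gain by roughly a factor of three.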
Internet Bandwidth: Doubling Every 21 Months
Jakob Nielsen tracked high-end internet connection speeds starting in 1983 and found a consistent pattern: user bandwidth grows about 50% per year, which works out to a doubling roughly every 21 months. Over the full span of his data, real-world speeds increased 57-fold, and the measurements fit the exponential curve closely.
This rate is notably slower than computing power grew during its peak years. Nielsen himself pointed out that Moore’s Law corresponded to about 60% annual growth, compared to bandwidth’s 50%. That gap matters. It’s one reason why software capabilities have historically outpaced the network speeds needed to deliver them, and why buffering, latency, and download times remain frustrations even as connections get faster.
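The 50% and 60% annual figures convert to doubling times with one line of logarithm arithmetic; a small Python sketch:

```python
import math

def doubling_time_months(annual_growth):
    """Months needed to double at a fractional annual growth rate."""
    return 12 * math.log(2) / math.log(1 + annual_growth)

bandwidth_pace = doubling_time_months(0.50)   # Nielsen's ~50%/year: ~21 months
compute_pace = doubling_time_months(0.60)     # Moore-era ~60%/year: ~18 months
```

A 10-point difference in annual growth rate shaves about three months off the doubling time, and that gap compounds year after year.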
Energy Efficiency: Slowing Down Too
Koomey’s Law tracks a different dimension of computing: how many computations you can perform per kilowatt-hour of energy. When Jonathan Koomey analyzed data from 1946 to 2009, he found that energy efficiency doubled every 1.57 years. That matters because it meant a computer could do the same work on half the energy in under two years, which is what made laptops thinner and phones last longer on a single charge.
Recent analysis tells a different story. Energy efficiency still improves exponentially, but the doubling interval has stretched to about 2.29 years. Raw computing performance doubles faster, approximately every 1.85 years, which means performance gains are outpacing efficiency gains. In practical terms, newer chips are more powerful and more efficient than their predecessors, but the efficiency improvements aren’t keeping up with the power demands of the performance increases. This is part of why data centers and AI training consume so much electricity despite each individual chip being far more efficient than those from a decade ago.
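The gap between those two doubling times is exactly what drives power demand upward. A quick sketch using the figures above (1.85-year performance doubling, 2.29-year efficiency doubling):

```python
def multiplier(years, doubling_time_years):
    """Cumulative gain after `years` at the given doubling time."""
    return 2 ** (years / doubling_time_years)

YEARS = 10
perf_gain = multiplier(YEARS, 1.85)     # performance: ~42x in a decade
eff_gain = multiplier(YEARS, 2.29)      # efficiency: ~21x in a decade
energy_growth = perf_gain / eff_gain    # energy needed for that performance: ~2x
```

Even with every chip individually more efficient, delivering a decade's worth of performance gains still roughly doubles the energy bill.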
Data Storage: A Boom That Faded
Hard drive storage density once followed the fastest doubling rate of any major technology. During its peak growth period around the early 2000s, magnetic storage density was increasing at roughly 130% per year, meaning it doubled every nine months. That’s significantly faster than Moore’s Law at its best. This era is sometimes called Kryder’s Law, after a Seagate executive who projected the trend forward.
That pace didn’t last. Storage density growth slowed dramatically after hitting physical limits in how tightly data could be packed onto magnetic platters. The industry shifted toward solid-state drives and new recording techniques to keep capacity climbing, but the doubling time stretched well beyond nine months. Storage is still getting cheaper and denser, just not at the breakneck speed it once did.
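To see why the nine-month era felt like a boom, compare cumulative capacity growth over five years at that pace versus a two-year doubling; the helper below is illustrative:

```python
def multiplier(years, doubling_months):
    """Capacity multiplier after `years` when density doubles every `doubling_months`."""
    return 2 ** (12 * years / doubling_months)

kryder_era = multiplier(5, 9)    # nine-month doubling: ~100x in five years
slower_era = multiplier(5, 24)   # two-year doubling: ~5.7x in the same span
```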
DNA Sequencing: Faster Than All of Them
The cost of sequencing a human genome has dropped faster than virtually any other technology metric. The National Human Genome Research Institute tracks these costs over time and plots them against Moore’s Law for comparison. Until early 2008, sequencing costs fell at roughly the same rate as computing costs. Then, starting in January 2008, the introduction of next-generation sequencing platforms caused a sudden, steep drop: costs began falling so fast they broke sharply away from Moore’s Law even on a logarithmic plot, a decline no computing trend has matched.
The first human genome cost roughly $3 billion to sequence. Today it costs a few hundred dollars. That collapse in price has made genetic testing accessible for medical diagnostics, ancestry services, and cancer research in ways that would have seemed impossible even 15 years ago.
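The implied pace can be back-calculated from the two endpoints. The figures below are illustrative round numbers ($3 billion around 2003, $300 roughly two decades later), not official NHGRI data points:

```python
import math

def average_halving_months(start_cost, end_cost, years):
    """Average months per cost halving across the span."""
    halvings = math.log2(start_cost / end_cost)
    return 12 * years / halvings

# Illustrative endpoints, not official NHGRI data points.
pace = average_halving_months(3e9, 300, 20)   # ~10 months per halving
```

Even averaged over twenty years, that works out to costs halving faster than the best computing trend at its peak.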
Why Doubling Rates Slow Over Time
Every exponential trend in technology eventually hits constraints. For computing, the barriers are physical: transistors are now measured in nanometers, approaching the scale of individual atoms. Building chips at these sizes requires entirely new transistor designs and manufacturing processes that are harder to perfect. Yields (the percentage of chips that come off the production line actually working) become a major bottleneck, and each new generation costs billions more in factory equipment.
For storage and energy efficiency, the pattern is similar. Early gains come from relatively straightforward engineering improvements, but each subsequent doubling requires solving harder problems. The exponential curve doesn’t break all at once. It bends gradually as the doubling interval stretches from 18 months to two years to three.
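A curve that bends rather than breaks can be sketched by letting the doubling interval stretch over time; the 18-month and three-year endpoints below are illustrative, not a fit to any dataset:

```python
def stretched_growth(years, start_dt=1.5, end_dt=3.0):
    """Cumulative growth when the doubling interval stretches linearly
    from start_dt to end_dt (in years) over the span."""
    level = 1.0
    for y in range(years):
        dt = start_dt + (end_dt - start_dt) * y / max(years - 1, 1)
        level *= 2 ** (1 / dt)   # one year of growth at this year's pace
    return level

bent = stretched_growth(12)   # interval stretching 18 months -> 3 years: ~48x
fixed = 2 ** (12 / 1.5)       # if the 18-month pace had held: 256x
```

Growth never stops in this sketch; the gradually lengthening interval alone is enough to leave the bent curve more than five times below the fixed one after twelve years.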
The overall picture is that technology still doubles, but the timeline depends on what you’re measuring. Computing power roughly every three years. Internet speeds about every 21 months. Energy efficiency every two and a quarter years. DNA sequencing costs have collapsed even faster than any of these. None of these rates are fixed laws of nature. They’re empirical observations that shift as engineering challenges change, and nearly all of them are slower now than they were a decade or two ago.

