Bringing a single new drug to market costs roughly $1 billion to $2.6 billion, depending on how you measure it and who’s counting. That enormous range isn’t vague hand-waving. It reflects real differences in what gets included in the calculation: just the cash a company actually spends, or also the cost of all the failed candidates and the capital tied up along the way? The headline figure changes dramatically depending on which of those costs you fold in.
Why the Estimates Vary So Much
The most widely cited figure comes from a 2016 study by the Tufts Center for the Study of Drug Development, which estimated $2.6 billion per approved drug in 2013 dollars. That number includes two things beyond what a company actually writes checks for. First, it accounts for the cost of capital, meaning the returns the company could have earned if it had invested that money elsewhere during the 10-plus years of development. Second, it folds in the cost of every failed candidate that never made it to market, spreading those losses across the drugs that did succeed.
The actual out-of-pocket spending in that same study was $1.4 billion per approved drug. Still enormous, but roughly half the headline number. Adding post-approval research brought the capitalized total to $2.87 billion.
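The gap between the out-of-pocket and capitalized figures comes from compounding each year's spending forward to the approval date at the cost of capital. A minimal sketch of that calculation, assuming a hypothetical even spending profile over 10 years and the 10.5% annual rate the Tufts study used (real programs spend far more in later phases, so this is only illustrative):

```python
def capitalized_cost(out_of_pocket, years, rate):
    """Compound each year's spend forward to the approval year.

    Assumes spending is spread evenly across development (a
    simplification) and uses a mid-year convention: a dollar spent
    y years before approval grows by (1 + rate) ** y.
    """
    annual = out_of_pocket / years
    return sum(annual * (1 + rate) ** (years - t - 0.5) for t in range(years))

# $1.4B out-of-pocket over 10 years at a 10.5% cost of capital
total = capitalized_cost(1.4e9, 10, 0.105)
print(f"${total / 1e9:.2f}B capitalized")
```

Even with these crude assumptions, the result lands around $2.4 billion, in the same neighborhood as the study's $2.6 billion capitalized figure: the time value of money alone nearly doubles the cash cost.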
A 2020 analysis published in JAMA took a different approach, examining 63 drugs developed by 47 companies between 2009 and 2018. It found a median cost of $985 million and a mean of $1.3 billion, including the cost of failures and capital. The median is arguably more useful here because a handful of extraordinarily expensive programs pull the average up. When the researchers restricted their analysis to the highest-quality cost estimates, the median rose slightly to about $1.05 billion.
Yet another 2024 study in JAMA Network Open estimated the mean out-of-pocket cost at just $173 million per drug. Once failures and capital costs were added, that jumped to $879 million. The takeaway: the actual cash a company spends developing a single successful drug is a fraction of the headline figures, but the economic cost of all the drugs that fail alongside it is what inflates the total.
Where the Money Goes by Phase
Drug development moves through distinct clinical trial phases, and costs escalate sharply at each step. Phase 1 trials, which test safety in small groups of volunteers, cost an average of about $55 million per drug. Phase 2 trials, which begin testing whether the drug actually works, average around $104 million. Phase 3 trials, the large-scale studies needed to prove effectiveness and safety before approval, are the most expensive single stage at roughly $298 million on average.
That adds up to about $457 million in clinical trial costs alone per successful drug. This figure doesn’t include the preclinical research that happens before human trials begin, which can take years of laboratory and animal testing. It also doesn’t include the regulatory review process itself or the manufacturing scale-up needed to actually produce the drug commercially.
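The per-phase averages above can be checked by simple addition:

```python
# Average clinical trial cost per successful drug, by phase
# (figures from the estimates above, in millions of dollars).
phase_costs = {"Phase 1": 55, "Phase 2": 104, "Phase 3": 298}

total = sum(phase_costs.values())
print(f"Total clinical cost: ${total}M")  # → $457M
```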
After approval, the FDA often requires additional Phase 4 studies to monitor long-term safety. The median post-market study costs around $1.8 million, though large cardiovascular safety trials and other mega-studies can exceed $50 million, pulling the average up to about $2.4 million.
Most Candidates Never Make It
The reason failure costs dominate these estimates is simple: most drug candidates fail. Only about 14% of drugs that enter Phase 1 trials ever reach FDA approval. That means for roughly every seven drugs a company puts into human testing, six will be abandoned at some point, and all the money spent on those six is effectively the price of the one that succeeds.
This is why the gap between out-of-pocket spending on a single drug and the “true” cost per approval is so large. A company doesn’t just pay for the winners. It pays for years of work on candidates that showed promise in early testing but failed in Phase 2 or collapsed in a massive Phase 3 trial after hundreds of millions had already been spent.
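The way a low success rate inflates per-approval cost can be sketched directly: divide what an average attempt costs by the probability it succeeds. Only the 14% rate comes from the text; the per-attempt cost below is a hypothetical round number, and real candidates fail at different stages with very different sunk costs.

```python
# Expected cost per APPROVAL when most candidates fail.
success_rate = 0.14        # from the text: ~14% of Phase 1 entrants are approved
cost_per_attempt = 200e6   # hypothetical average spend per candidate

attempts_per_approval = 1 / success_rate           # candidates funded per success
cost_per_approval = cost_per_attempt / success_rate

print(f"{attempts_per_approval:.1f} attempts per approval")
print(f"${cost_per_approval / 1e9:.2f}B expected cost per approved drug")
```

Dividing by the success rate is what turns a $200 million program into a $1.4 billion-per-approval expectation; the survivors carry the losers on their books.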
How Long the Process Takes
The timeline from the start of clinical trials to FDA approval is a median of 8 years, with a range of 4 to 20 years and an average closer to 9.4 years. That’s just the clinical portion. The foundational science behind a new drug, from identifying a biological target to developing a compound worth testing in humans, can add decades. One study tracking the full arc from initial technology development to approval found a median of 36 years, though much of that early period involves basic research that may not be funded by the company that eventually commercializes the drug.
Drugs built on more mature science tend to move faster. When a compound enters clinical trials after its underlying biology is already well understood, the time from first trial to approval drops to about 8.5 years, compared to 11.5 years for drugs based on less established science.
Costs Vary Widely by Therapeutic Area
Not all drugs are equally expensive to develop. The therapeutic area has a massive influence on cost, largely because some diseases require bigger, longer, or more complex clinical trials.
- Pain and anesthesia: the most expensive category, averaging $1.76 billion per approved drug (including failures and capital costs). These drugs often need very large trials and face high failure rates.
- Oncology: $1.21 billion, reflecting the complexity of cancer biology and the need for specialized trial designs.
- Ophthalmology: $1.19 billion.
- Central nervous system: $895 million, with brain-targeting drugs historically plagued by high failure rates in diseases like Alzheimer’s.
- Cardiovascular: $890 million.
- Anti-infectives: the least expensive at $379 million, roughly one-third the overall average. Antibiotics and antivirals often have shorter, more straightforward trials.
The range here is striking. Developing a pain drug costs more than four times what it costs to develop an anti-infective, on average.
Orphan Drugs Cost Less to Develop
Drugs for rare diseases, known as orphan drugs, are significantly cheaper to bring to market. The capitalized clinical cost per approved orphan drug is about $291 million, compared to $412 million for non-orphan drugs. When looking only at entirely new molecules rather than reformulations, orphan drugs cost about half as much: $242 million versus $489 million.
Several factors drive this difference. Rare disease trials are smaller by necessity since fewer patients exist to enroll. Regulatory pathways are often faster, with the FDA granting priority reviews and flexible trial designs. The Orphan Drug Act also provides a tax credit covering up to 50% of clinical trial costs, which means the actual out-of-pocket expense for companies may be even lower than these estimates suggest. Orphan drugs also receive seven years of market exclusivity after approval, making the financial case more attractive despite smaller patient populations.
What’s Driving the Numbers Up
The cost of drug development has risen substantially over time. The Tufts study found that its $2.6 billion estimate was more than double the inflation-adjusted figure from a decade earlier. Several forces are responsible. Clinical trials have grown larger and more complex, with more endpoints, more countries, and more regulatory requirements. The “easy” drug targets, diseases where the biology is relatively simple, have largely been addressed. Companies are increasingly pursuing complex conditions like neurodegeneration and autoimmune diseases where the biology is poorly understood and failure rates are higher.
Trial operations have also become more expensive. Recruiting and retaining patients is one of the biggest bottlenecks, often requiring global enrollment across dozens of sites. Regulatory requirements for safety monitoring have expanded, and the digital infrastructure needed to manage modern trials adds another layer of cost. All of this compounds over the 8 to 10 years a typical drug spends in clinical development, making the time value of money a significant factor in the final tally.

