Pharmaceutical research is funded by a mix of private industry, government agencies, and philanthropic organizations, with each playing a distinct role at different stages of drug development. Private companies now fund the majority of clinical trials, but nearly every drug that reaches the market has roots in publicly funded basic science. Understanding who pays for what reveals how a new medicine actually gets from a laboratory idea to your pharmacy shelf.
The NIH: Largest Public Funder
The National Institutes of Health is the single largest public funder of biomedical research in the world, investing nearly $48 billion annually. About 82% of that budget goes to researchers at more than 2,500 universities, medical schools, and research institutions through competitive grants. Another 11% supports roughly 6,000 scientists working in NIH’s own laboratories, mostly on its campus in Bethesda, Maryland.
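To put those shares in dollar terms, here is a quick back-of-the-envelope conversion using only the rounded figures above (the residual category is an inference, not a breakdown the NIH publishes this way):

```python
# Rough dollar split of the NIH budget, using the rounded figures
# cited above. Amounts are in billions of US dollars.
TOTAL_BUDGET = 48.0        # ~annual NIH budget
EXTRAMURAL_SHARE = 0.82    # competitive grants to outside institutions
INTRAMURAL_SHARE = 0.11    # NIH's own laboratories

extramural = TOTAL_BUDGET * EXTRAMURAL_SHARE    # ~$39.4B
intramural = TOTAL_BUDGET * INTRAMURAL_SHARE    # ~$5.3B
other = TOTAL_BUDGET - extramural - intramural  # ~$3.4B (administration, training, etc. -- an inference)

print(f"Extramural grants: ${extramural:.1f}B")
print(f"Intramural labs:   ${intramural:.1f}B")
print(f"Everything else:   ${other:.1f}B")
```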
The NIH’s contribution shows up in the final product more than most people realize. A 2023 analysis published in JAMA Health Forum found that NIH funding contributed to basic or applied research behind 99.7% of drugs approved by the FDA between 2010 and 2019. That includes not just early-stage biology but also clinical trials: NIH supported phased clinical testing for 62% of those approved drugs. In practical terms, taxpayer-funded research laid the scientific groundwork for almost every new medication that came to market in that decade.
Private Industry Dominates Clinical Trials
While government funding tends to concentrate on basic science and early discovery, private pharmaceutical and biotech companies spend the most on the expensive later stages of development, particularly the large clinical trials required for FDA approval. Between 2006 and 2014, the number of industry-funded clinical trials registered each year grew by 43%, while the number of NIH-sponsored trials declined by 24%. By 2014, industry-funded studies outnumbered NIH-funded ones by more than six to one.
This shift matters because clinical trials are the most expensive part of bringing a drug to market. A 2024 study in JAMA Network Open estimated that the median cost to develop a single new drug is $708 million, and the average is $1.31 billion once you factor in the cost of capital and the many candidates that fail along the way. The range is enormous: some drugs cost as little as $247 million to develop, while others exceed $1.4 billion. Companies recoup these costs through drug pricing, which is one reason the question of who funds research is so closely tied to debates about drug affordability.
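Much of the gap between the $708 million median and the $1.31 billion average comes from compounding each year's outlay at the cost of capital over a long development timeline. Here is a minimal sketch of that capitalization; the ten-year spending profile and 10.5% discount rate are assumptions for illustration, not figures from the study:

```python
# Illustrative capitalization of drug R&D spending. The spending
# profile and discount rate are assumptions for demonstration,
# not figures from the JAMA Network Open study.

def capitalized_cost(annual_spend, discount_rate):
    """Compound each year's outlay forward to approval, assumed to
    occur at the end of the final year.

    annual_spend: list of outlays in $M, index 0 = first year of R&D.
    discount_rate: annual cost of capital, e.g. 0.105 for 10.5%.
    """
    years_to_approval = len(annual_spend)
    return sum(
        spend * (1 + discount_rate) ** (years_to_approval - 1 - year)
        for year, spend in enumerate(annual_spend)
    )

# Hypothetical 10-year program spending $70M/year ($700M out of pocket,
# near the study's median).
spend_profile = [70.0] * 10
print(f"Out-of-pocket total:  ${sum(spend_profile):,.0f}M")
print(f"Capitalized at 10.5%: ${capitalized_cost(spend_profile, 0.105):,.0f}M")
```

Capitalization alone lifts a $700 million out-of-pocket program to roughly $1.14 billion in this sketch; folding in failed candidates (expected cost per approval is roughly per-candidate cost divided by the success probability) pushes the figure higher still.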
Large pharmaceutical companies fund most of this work from their own revenue and investor capital. Smaller biotech firms often rely on venture capital funding in their early years, then either partner with larger companies or go public to raise the money needed for late-stage trials.
Universities and Technology Transfer
Universities sit at the boundary between public funding and commercial drug development. Researchers at academic institutions, often working with NIH grants, make the initial discoveries that later become drug candidates. Before the Bayh-Dole Act of 1980, which allowed universities to patent inventions arising from federally funded research, no new drugs had been commercialized from government-supported university work. Since its passage, at least 153 drugs that originated in university laboratories have received FDA approval for conditions ranging from cancer to HIV.
When a university patents a discovery and licenses it to a pharmaceutical company, the resulting revenue flows back to the institution. U.S. universities earned a combined $2.6 billion in licensing income in 2012. That money is relatively unrestricted, meaning institutions can reinvest it in further research. This has become increasingly important as state and federal funding for universities has declined, pushing schools to look to technology transfer as a way to sustain their research programs. Inventors typically receive a personal share of licensing revenue along with funds for continued research.
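How that income is divided varies by institution, but a typical policy recovers patent costs first and then splits the remainder between the inventor and the institution. The percentages in this sketch are hypothetical, chosen only to illustrate the mechanics:

```python
# Hypothetical licensing-revenue distribution. The split below is
# illustrative; actual policies vary widely by institution.

def distribute_license_income(gross_income, patent_costs):
    """Divide net licensing income under an assumed university policy."""
    net = gross_income - patent_costs  # universities typically recoup costs first
    return {
        "inventor (personal share)": net * 0.33,
        "inventor's lab":            net * 0.17,
        "department":                net * 0.20,
        "university research fund":  net * 0.30,
    }

# Example: a $1.2M royalty payment with $200K in patenting costs recouped.
for bucket, amount in distribute_license_income(1_200_000, 200_000).items():
    print(f"{bucket}: ${amount:,.0f}")
```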
There is a persistent gap, however, between an early-stage university discovery and a technology ready for commercial development. Bridging that gap requires additional funding that neither government grants nor industry partnerships always cover, which is why many promising discoveries never advance beyond the lab.
Philanthropic Foundations
Philanthropy plays a smaller but targeted role. The Wellcome Trust, based in the United Kingdom, is the largest philanthropic funder of health research globally, spending roughly $909 million per year. The Howard Hughes Medical Institute in the U.S. follows at about $752 million annually. Both focus heavily on basic biomedical science. The Bill & Melinda Gates Foundation, often associated with global health, spends around $463 million on health research, with a particular focus on infectious diseases affecting low-income countries.
These foundations fill niches that government and industry tend to overlook. They fund research on diseases that disproportionately affect populations without significant purchasing power, meaning there is little commercial incentive for private companies to invest. They also support long-term, high-risk basic science that may not produce results for decades.
Government Incentives and Partnerships
Beyond direct NIH funding, the federal government uses financial incentives to steer private investment toward areas of public need. The Orphan Drug Act, for example, encourages companies to develop treatments for rare diseases by offering market exclusivity and grant funding. The FDA’s Orphan Products Grants Program funds clinical trials directly, providing up to $250,000 per year for Phase 1 trials and up to $500,000 per year for Phase 2 and 3 trials, typically over four-year periods. In fiscal year 2024, the FDA selected seven new clinical trials under this program, committing $17.2 million over four years, plus an additional $4.7 million for natural history studies.
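Using only the caps cited above (actual awards can be structured differently), the maximum award for a single trial works out as follows:

```python
# Maximum possible award per trial under the caps cited above.
# Actual award structures may differ; this is a ceiling, not a typical grant.
YEARS = 4  # typical grant period cited above

phase1_max = 250_000 * YEARS   # $1.0M over the grant period
phase23_max = 500_000 * YEARS  # $2.0M over the grant period

print(f"Phase 1 ceiling:   ${phase1_max:,}")
print(f"Phase 2/3 ceiling: ${phase23_max:,}")
```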
For national security and pandemic preparedness, the Biomedical Advanced Research and Development Authority (BARDA) partners with private companies to develop vaccines, drugs, and diagnostic tools for threats like pandemics, bioterrorism, and emerging infectious diseases. BARDA doesn’t conduct research itself. Instead, it contracts with industry to accelerate products through the later stages of development that companies might not otherwise pursue because the commercial market is uncertain. The COVID-19 vaccines were the most visible example of this model in action.
How the Funding Flows in Practice
The path from idea to approved drug rarely involves a single funder. A typical trajectory starts with NIH-funded basic research at a university, where scientists identify a biological mechanism or molecular target. The university patents the discovery and licenses it to a biotech startup or pharmaceutical company. That company, using private capital, develops the compound through preclinical testing and early clinical trials. If results look promising, a larger company may acquire the startup or sign a licensing deal to fund the massive Phase 3 trials needed for FDA approval.
At each stage, different funders absorb different risks. Taxpayers, through the NIH, bear the risk of basic research that may lead nowhere. Venture capitalists and biotech investors take on the risk of early clinical development, where failure rates exceed 90% for some disease areas. Large pharmaceutical companies shoulder the cost of late-stage trials and manufacturing scale-up. Philanthropies often step in where none of these other funders see sufficient return, whether financial or strategic.
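To see how a greater-than-90% failure rate arises, multiply the stage-by-stage success probabilities together. The transition rates in this sketch are illustrative assumptions in the rough range published industry analyses report, not figures from this article:

```python
# Illustrative stage-wise success probabilities for a drug candidate
# entering Phase 1. These transition rates are assumptions, chosen to
# be in the ballpark of published industry analyses.
transitions = {
    "Phase 1 -> Phase 2":  0.60,
    "Phase 2 -> Phase 3":  0.30,
    "Phase 3 -> approval": 0.55,
}

overall = 1.0
for stage, p in transitions.items():
    overall *= p
    print(f"{stage}: {p:.0%} (cumulative: {overall:.1%})")

print(f"Overall failure rate: {1 - overall:.0%}")  # ~90% of entrants fail
```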
This layered system means that no single entity “pays for” a new drug. The more accurate picture is a relay race where public money starts the process, private money finishes it, and the question of who deserves credit (and who should control pricing) depends heavily on which part of the race you consider most important.