Healthcare isn’t free in the United States primarily because the country built its system around employer-sponsored private insurance during World War II, and the political and economic structures that grew from that decision have proven extraordinarily difficult to dismantle. Unlike most wealthy nations, which established government-run systems in the postwar era, the US went in a different direction, and decades of industry growth, lobbying, and cultural attitudes toward government have kept it there.
How Employer-Based Insurance Took Hold
The story starts with a wartime accident of policy. In 1942, the federal government imposed wage and price controls to stabilize the economy. The following year, the War Labor Board ruled that employer contributions to insurance and pension funds didn’t count as wages. With a labor shortage and no ability to offer higher pay, companies began competing for workers by offering health benefits instead. By the end of the war, the number of people with health coverage had tripled.
What might have been a temporary workaround became permanent through a series of legal decisions. In 1948, the National Labor Relations Board ruled that employers were required to bargain over benefits, a decision the federal courts upheld and the Supreme Court declined to review in 1949, letting it stand. Then in 1954, the Internal Revenue Code cemented the arrangement: employer contributions to health plans became tax-deductible business expenses and were excluded from employees’ taxable income. This created a massive financial incentive for both employers and workers to keep insurance tied to jobs rather than push for a government alternative.
By the time other wealthy nations were building universal systems in the 1950s and 1960s, the US already had a sprawling private insurance infrastructure with millions of jobs, billions in revenue, and deep ties to both employers and unions. That infrastructure became its own political constituency, resistant to being replaced.
The Industries That Profit From the Current System
The US healthcare system is enormous, accounting for roughly 17% of the national economy. Several powerful industries depend on keeping it structured the way it is.
Private insurance companies sit at the center. They negotiate rates with hospitals and doctors, manage claims, and take a percentage for administrative costs and profit. A shift to a single government payer would eliminate much of their business model. These companies spend heavily on lobbying and political contributions to protect their role.
Drug pricing is another major factor. US manufacturer prices for prescription drugs in 2022 were 278% of prices across 33 other wealthy nations. For brand-name drugs specifically, US prices were 422% of what other countries pay. Even after accounting for the rebates that drug companies pay back to insurers and pharmacy benefit managers, brand-name prices were still 308% of international levels. Generic drugs, which make up 90% of US prescriptions by volume, are actually cheaper in the US (about 67% of international prices), but brand-name drugs drive total spending. Most countries with universal healthcare use government negotiating power to set or cap drug prices. The US has historically allowed manufacturers to set their own prices for most of the market.
Physician compensation also plays a role, though it’s often overstated. The average US physician earns about $316,000 per year, compared to $183,000 in Germany and $138,000 in the UK. Those higher salaries partly reflect higher medical school debt and training costs, but they also reflect a system that rewards volume. The dominant payment model in the US, known as fee-for-service, pays doctors and hospitals for each test, procedure, and visit they perform. This creates a financial incentive to do more rather than focus on keeping people healthy in the first place.
Why Hospital Prices Vary So Wildly
In countries with universal systems, the government typically sets standard prices for medical procedures. In the US, each hospital sets its own list prices, and then negotiates different rates with each insurance company. The result is staggering variation. For common procedures, hospitals at the 90th percentile of pricing charge 3 to 11 times more than hospitals at the 10th percentile for the same service.
Ownership matters too. For-profit hospitals charge about 39% more than government-owned hospitals, and nonprofits charge about 14% more. Hospitals that contract with more insurance plans tend to have higher prices, while hospitals in more competitive markets have lower ones. This fragmented, opaque pricing system is a direct consequence of having no central authority setting rates. It also makes it nearly impossible for patients to comparison-shop, which would normally push prices down in a functioning market.
Political and Cultural Resistance
Americans are broadly supportive of the idea that the government should ensure people have healthcare coverage, but they’re deeply split on how. A 2025 Pew Research Center survey found that 35% of adults favor a single national health insurance system run by the government. Another 31% prefer keeping the current mix of private companies and government programs. That split reflects a genuine ideological divide: many Americans view government-run systems as inefficient or as an overreach of federal power, while others see private insurance as wasteful and exclusionary.
Every serious attempt at universal coverage in the US has faced opposition from some combination of the insurance industry, pharmaceutical companies, physician groups, business associations, and ideological opponents of government expansion. President Truman proposed national health insurance in 1945 and was defeated. The Clinton administration tried again in 1993 and failed. The Affordable Care Act in 2010 expanded coverage significantly but preserved the private insurance framework, and even that faced years of legal and political challenges.
Who Falls Through the Gaps
As of 2024, about 92% of the US population (roughly 310 million people) had health insurance for at least part of the year, which leaves roughly 26 million people who went without coverage at any point. Workers in farming, fishing, and forestry occupations have among the highest uninsured rates. Many uninsured people work, but in jobs that don’t offer benefits; others earn too much to qualify for Medicaid, or live in states that didn’t expand Medicaid under the Affordable Care Act.
Even having insurance doesn’t mean care is affordable. High deductibles, copays, and out-of-network charges mean many insured Americans delay or skip care because of cost. The system produces a paradox: the US spends more per person on healthcare than any other country, yet millions of its residents can’t access basic care.
Why Change Is So Difficult
Transitioning to a universal system would mean restructuring roughly one-sixth of the US economy. Private insurers employ hundreds of thousands of people. Hospitals and drug companies have built their business models around negotiated rates and pricing flexibility. Physicians have taken on debt expecting US-level salaries. None of these groups would absorb the change easily, and all of them have significant political influence.
There’s also a structural problem in American government. Major legislation requires overcoming the Senate filibuster, which effectively means 60 out of 100 senators must agree. Given the polarization of healthcare as a political issue, that threshold has been nearly impossible to reach. The result is incremental change (expanding Medicaid eligibility, adding drug price negotiation for some Medicare drugs) rather than systemic overhaul.
The US system wasn’t designed from a blueprint. It grew out of a wartime workaround, was reinforced by tax policy, and became entrenched as industries and political interests organized around it. Changing it would require not just a policy shift but a willingness to disrupt some of the most powerful economic forces in the country.