When Did Healthcare Become a Business in the U.S.?

Healthcare in the United States didn’t become a business in a single moment. It evolved into one through a series of legislative decisions, corporate innovations, and economic shifts spanning roughly five decades, beginning in the late 1960s. By the time the transformation was complete, the U.S. was spending 18% of its GDP on healthcare, up from about 5% in 1960, and the system looked almost nothing like the community-based charity model it replaced.

What Healthcare Looked Like Before Profit

For most of American history, medical care operated outside the normal rules of commerce. Doctors charged modest fees, often on sliding scales, and hospitals were largely run by religious orders, municipalities, or charitable organizations. A 1932 study by the Committee on the Costs of Medical Care found that despite widespread charity care from physicians, hospitals, and health departments, about half of the lowest-income Americans received no medical care at all. The system wasn’t efficient or equitable, but it wasn’t designed to generate returns for investors either.

Doctors made house calls. Hospitals covered their operating costs through donations and modest patient fees. There was no insurance industry to speak of, no billing departments employing dozens of coders, and no shareholders expecting quarterly earnings growth. The committee’s most controversial recommendation at the time was group payment for medical care, financed through voluntary insurance, taxation, or both. That idea was rejected, and the country instead drifted toward a patchwork of private insurance that would eventually reshape the entire system.

The 1960s: Hospitals Go Corporate

The founding of Hospital Corporation of America (HCA) in 1968 marked one of the earliest and most consequential shifts. HCA was among the first investor-owned hospital companies in the nation, pioneering an entirely new model: hospitals run not by community boards or religious organizations but by corporate management answering to shareholders. The company started in Nashville and expanded rapidly, proving that hospitals could be operated as scalable, profitable enterprises the same way hotel chains or restaurant franchises could.

HCA’s success inspired imitators. Throughout the 1970s and 1980s, for-profit hospital chains bought up community hospitals across the country, consolidating what had been a fragmented landscape of independent institutions into something resembling an industry. The shift wasn’t just about ownership. Corporate management brought standardized cost controls, centralized purchasing, and a focus on high-margin services, all practices borrowed directly from the business world.

1973: Congress Opens the Door to For-Profit Medicine

The Health Maintenance Organization Act of 1973 was a critical legislative turning point. The law defined health maintenance organizations (HMOs) as public or private entities that could provide health services to members under a prepaid, managed structure. Crucially, it opened the door for private, for-profit companies to enter the business of delivering and managing healthcare at scale. The federal government even provided grants and loans to help new HMOs get started.

The logic was that competition and managed care would control rising costs. What it also did was create an entirely new profit center: companies that made money not by providing care but by managing who got care, how much of it, and at what price. The insurance and managed care industry that dominates American healthcare today traces its corporate DNA directly to this law.
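
To see that incentive in miniature, here is a stylized sketch of prepaid capitation economics. Every member count and dollar figure is hypothetical, chosen only to illustrate the structure, not to reflect real premiums or costs:

```python
# Stylized capitation economics under the prepaid HMO model.
# All figures are hypothetical illustrations, not real rates.

members = 10_000
monthly_premium = 400.0  # fixed revenue per member per month

revenue = members * monthly_premium

# Revenue is fixed, so the margin depends entirely on how much
# care members actually receive.
for label, avg_care_cost in [("loose utilization review", 390.0),
                             ("tight utilization review", 340.0)]:
    costs = members * avg_care_cost
    margin = revenue - costs
    print(f"{label}: monthly margin ${margin:,.0f}")
```

Because the premium is prepaid, every dollar of care not delivered becomes a dollar of margin, which is precisely the managed-care incentive the 1973 law institutionalized.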

1980: Federally Funded Research Becomes Private Property

The Bayh-Dole Act of 1980 transformed how medical discoveries moved from the lab to the marketplace. Before the law, inventions developed with federal research funding belonged to the government. Bayh-Dole gave universities, nonprofit research institutions, and small businesses the right to patent and commercialize those discoveries. The intent was to accelerate innovation by giving institutions a financial incentive to move research into practical applications.

The effect was dramatic. Universities established technology transfer offices, launched spin-off companies, and signed licensing agreements that turned patents into lucrative revenue streams. Academic medical centers, once focused primarily on education and patient care, became active participants in the market economy. Drug development, medical devices, and diagnostic technologies all became potential profit generators, even when the underlying research had been paid for with public dollars. This created a self-sustaining cycle where the commercial potential of research influenced which projects got funded in the first place.

1983: Medicare Rewires Hospital Incentives

Before 1983, Medicare reimbursed hospitals on a retrospective cost basis. Hospitals spent whatever they spent, and Medicare paid the bill. Higher costs meant higher reimbursement. There was virtually no incentive to control spending.

That changed in October 1983, when Medicare implemented its Prospective Payment System. Instead of reimbursing actual costs, the government now paid a single flat rate per discharge based on the patient’s diagnosis-related group (DRG). Each hospital kept the difference between that flat payment and what it actually cost to treat the patient, or absorbed the loss if costs exceeded the payment. The system was explicitly designed, in the government’s own words, to “provide strong financial incentives for hospitals to control their input costs and resource use.”
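
The incentive is simple enough to model directly. A minimal sketch, using hypothetical dollar figures rather than actual Medicare rates (real DRG payments vary by diagnosis, region, and hospital characteristics):

```python
# The prospective-payment incentive in miniature.
# Payment and cost figures below are hypothetical.

def drg_margin(flat_payment: float, actual_cost: float) -> float:
    """Hospital keeps the difference between the flat DRG payment and
    its actual cost of care; a negative result is an absorbed loss."""
    return flat_payment - actual_cost

flat_payment = 9_000.0  # same flat rate for every case in this DRG

for hospital, cost in [("efficient hospital", 7_500.0),
                       ("high-cost hospital", 11_000.0)]:
    margin = drg_margin(flat_payment, cost)
    verb = "keeps" if margin >= 0 else "absorbs a loss of"
    print(f"{hospital}: cost ${cost:,.0f}, {verb} ${abs(margin):,.0f}")
```

Under the old cost-based reimbursement, both hospitals would have been made whole; under the flat rate, one profits and the other loses money on the identical diagnosis.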

This single policy change forced every hospital in America, nonprofit or otherwise, to think like a business. Administrators had to track costs per patient, reduce lengths of stay, and make strategic decisions about which services to offer based on their financial margins. Hospitals that couldn’t adapt faced real financial consequences. The language of healthcare shifted: patients became “discharges,” treatments became “units of care,” and hospital success was increasingly measured in financial terms.

1986: Blue Cross Goes Corporate

Blue Cross and Blue Shield plans had operated as tax-exempt nonprofit organizations since their founding in the 1930s. The Tax Reform Act of 1986 changed that. Starting with taxable years beginning after December 31, 1986, these organizations were taxed “in the same manner as if they were stock insurance companies.” The law preserved some special treatment for plans that maintained community rating and open enrollment, but it fundamentally altered the competitive landscape.

Once the largest health insurers in the country were taxed like for-profit corporations, they increasingly behaved like them. Over the following decades, many Blue Cross Blue Shield plans converted to for-profit status outright, pursued mergers and acquisitions, and adopted the same shareholder-focused strategies as any other publicly traded company. The nonprofit ethos that had defined health insurance for half a century gave way to the same profit expectations that governed the rest of the financial sector.

The Administrative Explosion

One of the clearest markers of healthcare’s transformation into a business is the growth of its bureaucracy. Between 1975 and 2010, the number of healthcare administrators in the United States grew by 3,200%. During that same period, the number of physicians grew by 150%. The people running the business side of healthcare now vastly outnumber the people providing care, and the gap has only widened since.
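
Percentages that large are easier to grasp as multipliers. A quick conversion, using only the two growth figures cited above:

```python
# Convert the cited growth percentages into end/start multipliers.

def growth_multiplier(percent_growth: float) -> float:
    """A 150% increase means the final count is 2.5x the start."""
    return 1 + percent_growth / 100

admins = growth_multiplier(3200)     # administrators, 1975-2010
physicians = growth_multiplier(150)  # physicians, 1975-2010

print(f"Administrators: {admins:.0f}x the 1975 headcount")  # 33x
print(f"Physicians: {physicians:.1f}x the 1975 headcount")  # 2.5x
print(f"Administrators outgrew physicians {admins / physicians:.0f}-fold")
```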

This explosion wasn’t accidental. Every layer of the commercialized system requires administrative support: insurance claims processing, prior authorization, billing and coding, compliance with payer contracts, credentialing, utilization review. None of these functions existed in the charity hospital era. They exist because healthcare became a system where money flows through complex intermediaries, each with its own paperwork requirements, before a doctor ever sees a patient.

Private Equity Enters the Picture

The most recent chapter in healthcare’s commercialization is the entry of private equity firms. Between 2018 and 2023 alone, private equity spent $505 billion on healthcare acquisitions, according to research published in a JAMA Network journal. These firms have purchased hospitals, physician practices, dental chains, nursing homes, and specialty clinics, often loading them with debt and restructuring operations to maximize short-term returns.

Hospital acquisitions by private equity peaked in 2018, with 67 hospitals changing hands that year. The model is straightforward: buy a healthcare operation, cut costs aggressively, increase revenue where possible, and sell it within a few years at a profit. Critics argue this model prioritizes investor returns over patient outcomes. Supporters say it brings needed efficiency to a bloated system. Either way, it represents the furthest extension of a trend that began decades earlier: treating healthcare not as a public good but as an asset class.
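
The economics behind that playbook are worth spelling out. A stylized leveraged-buyout sketch, with entirely hypothetical figures and ignoring interest and transaction fees, shows why loading an operation with debt amplifies the return on the firm’s own capital:

```python
# Stylized leveraged-buyout arithmetic; every figure is hypothetical,
# and interest payments and fees are ignored for simplicity.

purchase_price = 100e6           # buy a hospital operation for $100M
equity = 30e6                    # the firm's own capital at risk
debt = purchase_price - equity   # $70M borrowed, carried by the operation

sale_price = 130e6               # sell a few years later after cost cuts

# Debt is repaid at sale; the equity holder keeps the remainder.
levered_return = (sale_price - debt) / equity    # (130 - 70) / 30 = 2.0x
unlevered_return = sale_price / purchase_price   # 1.3x without leverage

print(f"Return on equity with leverage:   {levered_return:.1f}x")
print(f"Return if bought with cash alone: {unlevered_return:.1f}x")
```

The business itself appreciated only 30%, yet the firm’s equity doubled; the debt, and the risk that comes with it, stays with the hospital.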

The Scale of the Transformation

The numbers tell the story most clearly. In 1960, U.S. healthcare spending represented roughly 5% of GDP. By 2024, it reached $5.3 trillion, or 18% of GDP, amounting to $15,474 per person. No other country spends anywhere close to this proportion of its economy on healthcare.
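
Those figures are internally consistent, as a quick back-of-the-envelope check shows; the population and GDP below are derived from the cited numbers, not taken from independent data:

```python
# Cross-check the 2024 spending figures cited above against each other.

total_spending = 5.3e12   # $5.3 trillion (cited)
per_person = 15_474       # dollars per person (cited)
gdp_share = 0.18          # 18% of GDP (cited)

implied_population = total_spending / per_person   # ~343 million
implied_gdp = total_spending / gdp_share           # ~$29.4 trillion

print(f"Implied population: {implied_population / 1e6:.0f} million")
print(f"Implied GDP: ${implied_gdp / 1e12:.1f} trillion")
```

Both implied values line up with the actual 2024 U.S. population (about 340 million) and GDP (about $29 trillion), so the three cited numbers describe the same picture.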

That growth didn’t happen because Americans got sicker. It happened because healthcare became a multi-trillion-dollar industry with its own corporate logic: investor expectations, executive compensation packages, lobbying operations, and marketing budgets. The transformation from charity model to business model took roughly five decades of incremental policy decisions, each one making it a little easier to profit from the act of keeping people alive. No single law or event flipped the switch. But by the time most Americans noticed, the switch had been flipped completely.