American employers began experimenting with group insurance in the early 1900s, but health coverage didn't become a standard workplace benefit until World War II forced their hand. The story spans roughly four decades, from a handful of experimental policies to a system that now covers 155 million Americans.
Early Experiments Before 1930
Before companies got involved, workers who wanted financial protection against illness or injury relied on fraternal organizations, mutual aid societies, and unions. By 1917, 179 national fraternal organizations in the United States paid out $97 million in benefits, but only 1 percent of that money went toward medical expenses. Most of it covered lost wages, not doctor bills. These groups, many founded by immigrant communities adapting European models, were essentially pooling money to help members survive a crisis rather than pay for care.
The earliest corporate policies followed a similar logic. In 1912, Montgomery Ward insured its 2,912 employees through a group life policy with Equitable Life. The plan paid a $100 burial benefit at death, a lump sum equal to one year’s salary (up to $3,000) for employees without dependents, and weekly payments of 25 percent of the worker’s salary to a surviving spouse. Negotiations for this policy had begun in 1910, and its sheer size forced the rest of the insurance industry to pay attention. Group insurance as an employment benefit started gaining traction from that point, though it remained focused on life insurance and disability rather than hospital or medical costs.
The 1929 Baylor Plan That Changed Everything
The first recognizable health insurance plan appeared in 1929, when Baylor University Hospital in Texas struck a deal with 1,500 local schoolteachers. For $6 per person per year (about 50 cents a month), each teacher received up to 21 days of hospital care annually. This was a radical concept: prepaying for hospital services the way you’d pay for any other subscription.
The Baylor model spread quickly during the Great Depression, when hospitals were desperate for steady revenue and patients couldn’t afford unexpected bills. Similar plans popped up across the country and eventually organized under the Blue Cross name. These weren’t employer-sponsored plans in the modern sense, but they established the template. Employers soon realized they could purchase these hospital service plans on behalf of their workers, and the infrastructure for group coverage was in place by the time war arrived.
World War II Made Health Benefits Standard
The real turning point came in 1942. With millions of workers drafted into military service, companies faced intense competition for the labor that remained. Normally, they would have raised wages. But the Stabilization Act of 1942, signed by President Roosevelt, froze wages and prices to control wartime inflation. Salaries were locked to levels that existed on September 15, 1942.
Here’s the critical detail: the law explicitly excluded “insurance and pension benefits in a reasonable amount” from its definition of wages and salaries. Companies couldn’t offer more money, but they could offer health insurance. Practically overnight, health coverage became the primary tool for attracting and retaining workers. The National War Labor Board reinforced this in 1943 by ruling that employer contributions to health insurance were not subject to wage controls, though benefits still had to stay within a “reasonable amount” to escape them.
The numbers tell the story of how fast this shift happened. In 1940, about 2,500 people held group hospital insurance through commercial insurers, along with 140,000 through employer-employee-union plans. By 1955, group policies through insurance companies alone covered 35 million people, and employer-union plans covered another 1.7 million. In fifteen years, group hospital coverage through commercial insurers had multiplied more than a thousandfold.
Tax Law and Courts Locked It In
Two postwar developments cemented employer-sponsored insurance as the American norm. The first was a legal ruling. In the late 1940s, the National Labor Relations Board decided that employee benefit plans were a mandatory subject of collective bargaining, meaning employers couldn’t refuse to negotiate over them with unions. The ruling originated in a case involving Inland Steel Company’s pension plan and was soon extended to cover group health and accident insurance plans specifically. Unions across the country began demanding health coverage in every contract.
The second development was tax policy. During the war, the IRS had treated employer-paid health premiums as tax-free income for workers, but with limits. In 1954, Congress made the exemption permanent and removed the wartime caps entirely. Employers could now spend as much as they wanted on health insurance without any of it counting as taxable income for employees. This created an enormous financial incentive: a dollar spent on health insurance was worth more than a dollar in wages, because the worker owed no income tax on it. For a worker facing a 20 percent marginal rate, a dollar of extra wages arrived as 80 cents, while a dollar of premiums arrived whole. The exclusion remains in effect today and costs the federal government an estimated $329 billion a year in forgone revenue.
From Perk to Pillar of American Life
By the late 1950s, employer-sponsored insurance had gone from a wartime workaround to the primary way most working-age Americans got health coverage. The pattern solidified over the following decades as companies competed for talent partly through benefit packages, and as the tax code continued to reward this arrangement.
Today, roughly 155 million Americans get their health insurance through an employer. The average annual premium for family coverage reached $25,572 in 2024, a 7 percent increase over the prior year. Workers contribute about $6,296 of that cost on average, with employers picking up the rest. What started as a creative response to wage controls during a world war has become the backbone of the American health insurance system, for better or worse, shaping how most people access and pay for medical care.

