When Did Healthcare Start in America? Key Milestones

Healthcare in America doesn’t have a single starting point. It evolved in stages, from a colonial-era hospital founded in 1751 to the federal government’s first medical program in 1798, employer insurance in 1929, and the landmark creation of Medicare and Medicaid in 1965. Each of these moments reshaped how Americans received and paid for medical care.

The First Hospital: Philadelphia, 1751

Until the mid-eighteenth century, the American colonies had no hospitals. Sick people were cared for at home or, if they were poor, left to fend for themselves. That changed when Dr. Thomas Bond and Benjamin Franklin founded Pennsylvania Hospital in 1751 to care for the “sick-poor and insane who were wandering the streets of Philadelphia.” The Pennsylvania colonial assembly granted a charter on May 11, 1751, a temporary facility opened the following year, and the first patients were admitted on February 11, 1753. The idea of a hospital was, at the time, a novelty on this side of the Atlantic.

The First Federal Healthcare Law

The federal government entered healthcare just nine years after the Constitution took effect. On July 16, 1798, President John Adams signed the Act for the Relief of Sick and Disabled Seamen, the first law requiring any group of Americans to pay into a government health fund. Under the law, ship owners had to deduct 20 cents per month from every sailor’s wages. That money went to local tax collectors, who forwarded it to the Treasury. The president was then authorized to use the funds to provide “temporary relief and maintenance of sick or disabled seamen” at hospitals in port cities.

If a port had no hospital, the president could direct care in whatever way he saw fit. And if money accumulated beyond what was needed for immediate care, the law allowed the government to purchase land and build dedicated hospitals. This network of marine hospitals eventually became the Marine Hospital Service, the direct ancestor of today’s U.S. Public Health Service. It was, in effect, the country’s first public health insurance program, funded by mandatory payroll deductions nearly 170 years before Medicare.

Raising the Bar for Doctors and Nurses

For most of the 1800s, medical training in America was shockingly inconsistent. Many doctors graduated from for-profit schools with little clinical experience and almost no grounding in science. The American Medical Association was founded in 1847 specifically to address this problem, creating a national professional organization dedicated to raising the standards of medical training and practice.

The real turning point came in 1910, when education reformer Abraham Flexner published a landmark report exposing just how bad things were. He documented widespread deficiencies: students who lacked the science background to understand what they were learning, too much lecturing and too little hands-on work, and almost no testing of practical skills. Flexner held up Johns Hopkins as the gold standard and argued that medical schools should be university-based, require at least two years of basic science, and employ faculty who were scientists, not just practicing doctors. The report led to the closure of dozens of substandard schools and established the framework for medical education that still exists today.

Professional nursing followed a similar arc. The year 1873 marked a watershed when three training programs opened almost simultaneously: the New York Training School at Bellevue Hospital, the Connecticut Training School at the State Hospital in New Haven, and the Boston Training School at Massachusetts General Hospital. All three were based on principles advanced by Florence Nightingale, and they are generally acknowledged as the forerunners of organized, professional nursing education in the United States.

How Employer Insurance Took Root

The idea that your job should come with health coverage traces back to a single experiment in Dallas, Texas. In 1929, administrators at Baylor University Hospital devised a plan to make hospitalization more affordable: local teachers could prepay 50 cents a month and receive up to 21 days of hospital coverage per year. The Baylor Plan was an immediate success and quickly expanded to employees across the city. It became the model for Blue Cross, which spread nationwide over the following decade.

World War II accelerated the trend dramatically. With wages frozen by wartime controls, employers began offering health insurance as a way to compete for scarce workers. The National War Labor Board allowed this but capped tax-free health insurance spending at less than five percent of a worker’s annual salary. The real shift came in 1954, when Congress made the tax exemption for employer-paid health premiums permanent and removed the wartime spending cap entirely. Employers could now spend as much as they wanted on health benefits without any of it counting as taxable income for employees. This single tax decision cemented employer-sponsored insurance as the backbone of American healthcare, a role it still plays for roughly half the population.

Medicare, Medicaid, and the Government Safety Net

On July 30, 1965, President Lyndon B. Johnson signed the Social Security Amendments of 1965, creating Medicare and Medicaid in one stroke. These became two of the most enduring social programs in American history, but they took very different paths to get there.

The idea of federally financed health insurance for the elderly had been discussed publicly since 1952 and gained traction in Congress starting in 1957. Medicare was deliberately limited in scope to make it politically viable. Its architects centered the proposal on hospital care rather than routine doctor visits, with optional physician coverage added as Part B only in the final legislation, and they emphasized that the government would exercise no supervision or control over how hospitals operated. The goal was coverage, not a government takeover of medicine.

Medicaid had a separate lineage. Federal grants to help states pay for medical care for the poor had existed in various forms, and in 1960 Congress created the Kerr-Mills program to fund medical services for elderly people who couldn’t afford care. When the 1965 legislation came together, lawmakers expanded this concept to cover not just the elderly poor but also children on welfare and other low-income groups. Medicaid emerged as a joint federal-state program, administered by each state individually, which is why eligibility and benefits still vary so much from state to state.

The Affordable Care Act and Modern Reform

The most recent major overhaul came on March 23, 2010, when President Barack Obama signed the Patient Protection and Affordable Care Act into law. Some provisions took effect within months: a ban on canceling people’s coverage after they got sick, a prohibition on denying coverage to children under 19 with preexisting conditions, and the right for young adults to stay on a parent’s plan until age 26. Preventive services like screenings and vaccinations had to be covered without cost sharing starting in the first plan year beginning after September 23, 2010.

The broadest changes took effect on January 1, 2014. That was when coverage purchased through the new insurance marketplaces began, Medicaid expanded in participating states, subsidies started flowing to help individuals and small businesses afford coverage, and the individual mandate required most Americans to carry insurance. The law also banned insurers from charging higher premiums or denying coverage to adults with preexisting conditions.

From a single hospital in Philadelphia to a system that now consumes roughly one-fifth of the national economy, American healthcare was never designed as a unified system. It was assembled piece by piece over nearly three centuries, each layer responding to the failures and gaps of the one before it.