How Was Mental Health Treated in the 1960s?

Mental health treatment in the 1960s was defined by a massive shift: the beginning of the end for large state hospitals and the rise of community-based care, new psychiatric drugs, and talk therapy. But for hundreds of thousands of patients living inside institutions, daily reality often meant overcrowding, heavy sedation, forced labor, and treatments administered with little regard for individual dignity. The decade was a turning point, though the promises made during this era would take decades to fulfill, and some were never fulfilled at all.

Life Inside State Mental Hospitals

At the start of the 1960s, roughly 600,000 Americans lived in state-run psychiatric institutions. These facilities were often enormous, housing thousands of patients on sprawling campuses. Conditions varied, but overcrowding was the norm. One former patient at Bellevue described beds lining the hallways and constant conflict among patients. Admission wards were chaotic, and even transfers to specialty units offered only modest improvement.

Daily life for many patients was monotonous and dehumanizing. A first-person account from the era describes a routine of “institutional food, chlorpromazine, and boredom.” Patients were frequently put to work without wages, carrying soiled laundry between buildings in exchange for a candy bar or laboring in the hospital bakery after receiving electroconvulsive therapy. Some hospitals let patients shave other patients using dull razors, resulting in nicks and cuts. Social activities like dancing and singing existed but struck many patients as bizarre. Occupational therapy, mostly crafts, was one of the few structured outlets. Weight gain from inactivity and medication side effects was common.

The Drug That Changed Psychiatry

Chlorpromazine, sold as Thorazine, had arrived in the 1950s and by the 1960s was the dominant tool in psychiatric treatment. It was the first drug that could genuinely reduce psychotic symptoms like delusions and hallucinations, not just sedate patients into compliance. At daily doses as low as 75 milligrams, it controlled agitation and excitement in most patients. Its effect was distinctive: patients became indifferent to their surroundings without losing consciousness, with only a slight tendency toward sleep.

The drug came with real costs. Drowsiness and fatigue were constant companions. Repeated injections caused vein irritation and tissue damage. Patients gained weight and felt sluggish. But compared to what came before, including physical restraints, prolonged baths, and insulin-induced comas, chlorpromazine represented genuine progress. Its widespread adoption was one of the key reasons lobotomies fell out of favor and hospital populations began to shrink.

Other medications were finding their footing during this period. Lithium, a simple salt that stabilized mood swings, was already used in 49 countries by the 1960s but remained unapproved in the United States. American psychiatrists who believed in it formed what became known as a “lithium underground,” prescribing it without formal FDA permission. The U.S. didn’t approve lithium until 1970, becoming the 50th country to do so.

Electroconvulsive Therapy and Lobotomy

Electroconvulsive therapy was widely used throughout the 1960s, though the experience varied enormously depending on the facility. Some hospitals had adopted general anesthesia and muscle relaxants by 1960, making the procedure safer and less traumatic. Patients would receive medication to prevent excessive salivation, a barbiturate to induce sleep, and a muscle relaxant to minimize the physical convulsion. Where such protocols were in place, they were carefully documented.

Many state hospitals, however, lagged behind. One patient described an assembly-line approach: watching person after person ahead of him convulse, then waking afterward to coffee and a snack before being sent back to the ward. There was little individualized care and no real consent process as we’d understand it today. ECT was sometimes used as punishment or behavioral control rather than targeted treatment for depression.

Lobotomy, the surgical destruction of brain tissue to alter behavior, was already in steep decline by the 1960s. Walter Freeman, its most prolific American practitioner, had performed over 3,000 procedures between 1930 and 1960. But poor patient outcomes, unfavorable portrayals in books and film, growing regulatory scrutiny, and the availability of chlorpromazine collectively pushed the procedure to the margins. By the mid-1960s, lobotomies were rare.

Kennedy’s Push for Community Care

The most significant policy change of the decade came in 1963, when President Kennedy signed the Community Mental Health Act. In a message to Congress that February, Kennedy proposed cutting the institutionalized population in half within a decade or two. The law allocated $150 million in federal grants to build 1,500 community mental health centers across the country. Each center was required to offer five services: inpatient care, outpatient clinics, emergency response, partial hospitalization, and community education.

The idea was straightforward. Instead of warehousing people in distant state hospitals, local centers would provide treatment closer to home, allowing patients to maintain family ties and eventually live independently. Federal funding covered initial construction and three years of staffing, with the expectation that state and local governments would sustain operations afterward.

In practice, the transition was far rougher than planned. State hospitals began discharging patients faster than community centers could open or absorb them. Many of the 1,500 planned centers were never built. Those that did open often focused on patients with milder conditions, leaving people with severe mental illness without adequate support. The consequences of this gap between vision and execution would define mental health policy for decades to come.

New Approaches to Talk Therapy

The 1960s saw a significant expansion in psychotherapy beyond traditional Freudian psychoanalysis. Carl Rogers’ client-centered therapy, developed in prior decades, gained broad clinical influence during this period. Its core premise was that patients held the keys to their own recovery, and the therapist’s job was to create the right conditions for self-discovery rather than to interpret unconscious conflicts.

Rogers identified three conditions a therapist needed to provide: unconditional positive regard (full acceptance without judgment), accurate empathy (skilled listening that reflected the patient’s experience back to them), and genuineness. The approach assumed that people naturally move toward health and wholeness when given a supportive environment. Rather than diagnosing what was wrong and prescribing a fix, the therapist reflected what the client felt and helped replace negative self-perceptions with positive ones. This was a radical departure from the analyst’s couch, and it reshaped how therapists across many traditions thought about the therapeutic relationship.

Milieu Therapy and Therapeutic Communities

Some facilities experimented with milieu therapy, an approach that treated the entire social environment of a ward or residential program as the therapeutic tool. Rather than relying solely on one-on-one sessions or medication, milieu therapy held that everyday interactions, routines, and group dynamics could drive recovery. Workers were trained to see every moment in the facility, not just formal therapy hours, as a potential site of positive change.

In children’s residential treatment, this approach drew on the work of pioneers who had developed theory from clinical cases, emphasizing how a carefully designed physical and social environment could restore healthy development. Staff sometimes chose not to intervene directly in a patient’s behavior, trusting instead that the social pressures and opportunities of group living would shape behavior toward healthier patterns over time. Patients practiced new social skills in the facility with the goal of transferring them to life outside.

The Anti-Psychiatry Movement

The 1960s also gave rise to powerful intellectual challenges to psychiatry itself. Thomas Szasz published “The Myth of Mental Illness” in 1961, arguing that without evidence of neurological disease or damage, psychiatric diagnoses were meaningless. He proposed that most people labeled mentally ill were better understood as experiencing “problems in living.” His 1970 follow-up, “The Manufacture of Madness,” compared the mental health system to the Inquisition.

Szasz’s arguments landed in fertile ground. American culture already valued individualism and was growing skeptical of medical paternalism. Combined with legitimate concerns about the efficacy and decency of psychiatric treatment and the shaky evidence base for some diagnoses, these ideas contributed to what The Lancet described as a “delegitimisation of the field of mental health care” in favor of a philosophy centered on patient autonomy and rights. The practical impact was real: some U.S. mental health laws that followed made it extremely difficult to compel treatment, even for people in psychotic states who could not meaningfully advocate for themselves. Critics later argued that these laws ended up protecting a patient’s psychosis more than the patient.

The anti-psychiatry movement forced the field to confront genuine abuses and sloppy diagnostic practices. But it also made it harder to build public support for the kind of robust, well-funded community mental health system that Kennedy had envisioned. The tension between patient rights and the need for treatment remains one of the central unresolved questions in mental health care, with roots planted firmly in the 1960s.