Scientists first suspected smoking caused cancer in the late 1890s, but it took until 1964 for the U.S. government to officially declare it dangerous. The gap between those two dates is one of the most consequential delays in public health history, shaped by incomplete science, industry interference, and a product that was deeply woven into everyday life. Here’s how the timeline actually unfolded.
The Earliest Suspicions: 1898 to 1930s
In 1898, a German medical student named Hermann Rottmann proposed that tobacco might be behind the unusually high rate of lung tumors among tobacco workers. He blamed tobacco dust rather than smoke, but the seed was planted. By 1912, a physician named Isaac Adler published the world’s first book dedicated to lung cancer, noting that cases seemed to be rising and pointing to “the abuse of tobacco and alcohol” as one possible explanation. He also cautioned that the evidence wasn’t conclusive.
Through the 1920s, surgeons noticed they were operating on lung cancer more and more often. Smoking was one of several suspected causes, alongside asphalt dust from newly paved roads, industrial pollution, and lingering effects of poison gas exposure from World War I. No one had yet isolated tobacco as the primary driver.
That started to change in the 1930s. Angel Roffo, an Argentine cancer researcher, showed in 1931 that chemicals condensed from burning tobacco could cause tumors when applied to the skin of rabbits. He went on to publish dozens of papers linking smoking to cancer. Around the same time, pathologists discovered that cigarette smoke paralyzes the cilia, the tiny hair-like structures inside the airways that normally sweep contaminants out of the lungs. This offered a plausible mechanism: smoke was getting trapped where it could do damage. Researchers also identified cancer-causing compounds called polycyclic aromatic hydrocarbons in coal tar during this decade, and Roffo was the first to find the same class of chemicals in cigarette smoke.
The First Real Evidence: 1939 to 1950
In 1939, a German researcher named Franz Hermann Müller published the first case-control study directly comparing lung cancer patients to healthy controls. He examined 86 people with lung cancer and a similar number without it, and found that the cancer patients were far more likely to have been smokers. It was a small study, and it arrived during wartime, so it didn’t get the international attention it deserved.
The real turning point came in 1950, when two large studies landed almost simultaneously. In the United States, Ernst Wynder and Evarts Graham published research linking smoking to lung cancer. In England, Richard Doll and Austin Bradford Hill did the same, demonstrating that the risk of lung cancer climbed with the number of cigarettes smoked per day. Heavy smokers (more than 25 cigarettes daily) faced a risk 25 times higher than nonsmokers. These studies are widely considered the moment the scientific case became undeniable.
Doll and Hill didn’t stop there. In 1951, they launched what became known as the British Doctors Study, mailing questionnaires about smoking habits to the nearly 60,000 physicians on the British medical register; more than 40,000 replied in enough detail to be followed. The researchers then tracked these doctors’ health and causes of death for decades, building one of the most powerful long-term datasets in medical history. The study’s findings, updated over the years, would repeatedly confirm and strengthen the link between smoking and premature death.
Smoking and Heart Disease: 1960
Cancer wasn’t the only concern. The Framingham Heart Study, a landmark research project tracking thousands of residents in a Massachusetts town, formally identified cigarette smoking as a risk factor for heart disease in 1960. This was significant because heart disease kills far more people than lung cancer. Smoking wasn’t just a cancer problem; it was a cardiovascular problem, and that broadened the scope of the threat considerably.
The Official Declarations: 1962 and 1964
Governments were slow to act on what scientists already knew. The United Kingdom moved first. In March 1962, the Royal College of Physicians published a report that laid out a seven-point plan: public education campaigns, restrictions on selling tobacco to children, limits on advertising, tax increases on cigarettes, disclosure of tar and nicotine content, restrictions on smoking in public places, and the creation of antismoking clinics. It was the first time a major medical institution put its authority behind a comprehensive policy agenda against tobacco.
Two years later, the U.S. followed. On January 11, 1964, Surgeon General Luther Terry released a report concluding that cigarette smoking causes lung cancer and is a major health hazard. This is the date most people point to when they say smoking was officially declared “bad.” The report drew on over 7,000 scientific articles and represented the consensus of a panel of experts. It landed like a bomb in a country where more than 42% of adults smoked.
Warning Labels and the First Laws
The Surgeon General’s report created political pressure to do something. In 1965, Congress passed the Federal Cigarette Labeling and Advertising Act, which required every pack of cigarettes sold in the United States to carry the words: “Caution: Cigarette Smoking May Be Hazardous to Your Health.” The law took effect on January 1, 1966. The language was deliberately mild, using “may be hazardous” rather than anything stronger.
The tobacco industry didn’t fight these early warning labels as hard as you might expect. Internal documents from British American Tobacco reveal that by 1970, the industry had calculated that vague, small-print warnings actually helped them. A health message attributed to the government, rather than to the companies themselves, gave tobacco manufacturers a defense in lawsuits. They could argue that consumers had been warned. The industry reserved its fiercest opposition for graphic warnings and stronger language that might actually change behavior.
Secondhand Smoke: 1986
For decades, the conversation focused on the people lighting up. That changed in 1986, when the Surgeon General released a report dedicated entirely to involuntary smoking. The findings were stark: the smoke drifting off a burning cigarette between puffs, known as sidestream smoke, is qualitatively similar in chemical composition to what the smoker inhales. Of 13 studies reviewed, 11 found a positive relationship between secondhand smoke exposure and lung cancer in nonsmokers, with six reaching statistical significance.
The report concluded that involuntary smoking causes disease, including lung cancer, in healthy nonsmokers. It also found that children of smokers had more respiratory infections and slightly slower lung development than children of nonsmokers. A companion review estimated that roughly 20% of the approximately 12,200 lung cancer deaths occurring each year in nonsmokers were attributable to secondhand smoke. This shifted smoking from a personal choice issue to a public health issue affecting bystanders, and it laid the groundwork for smoking bans in workplaces, restaurants, and public buildings over the following decades.
How Smoking Rates Actually Changed
Knowing something is dangerous and acting on that knowledge are two different things. In 1965, the year the first warning labels were mandated, 42.4% of American adults smoked. The decline was gradual, driven by accumulating evidence, advertising restrictions, tax increases, indoor smoking bans, and cultural shifts that made smoking less socially acceptable. By 2022, the adult smoking rate had fallen to 11.6%, and by 2023 the share of adults who smoked cigarettes exclusively had dropped further, to 7.9%.
That trajectory spans nearly 60 years. The science was largely settled by 1964, but it took generations of policy changes, public education, and social pressure to cut smoking rates by more than 80%. The lag between scientific certainty and widespread behavioral change is one of the defining features of the tobacco story, and it’s a reminder that discovering a health risk is only the beginning.

