Gluten-free eating has roots in medical research stretching back to the 1880s, but it didn’t enter mainstream culture until the early 2010s. The path from obscure medical diet to a $6.3 billion global industry took over a century, driven by wartime observations, better diagnostic tools, food labeling laws, and a cultural shift that turned “gluten-free” into both a medical necessity and a lifestyle choice.
The First Clues: 1888 to the 1940s
The story starts with a British physician named Samuel Gee, who described “celiac affection” in 1888. He recognized that diet was central to the condition, noting that “the allowance of farinaceous food must be small.” Farinaceous foods are starchy, grain-based foods, exactly the ones that contain gluten. Gee even documented a child who thrived on a daily diet of Dutch mussels but relapsed when mussel season ended and grains returned to the plate. He was circling the answer without quite landing on it.
The breakthrough came from a Dutch pediatrician, Willem-Karel Dicke. In 1941, he published observations advocating a wheat-poor diet for children with celiac disease. Then history provided a grim natural experiment. During the Dutch famine of 1944–45, when almost no bread was available in the Netherlands, children with celiac disease improved dramatically. They relapsed quickly when Allied planes dropped white bread into the country near the war's end. Between 1946 and 1960, Dicke and his colleagues conducted the clinical studies that proved wheat flour was the culprit. This was the moment gluten-free eating became a legitimate medical treatment, though it remained confined to the relatively small number of people diagnosed with celiac disease.
Decades as a Niche Medical Diet
For most of the 20th century, gluten-free eating was something only celiac patients dealt with, and not many of them at that. Celiac disease was considered rare, partly because diagnosis depended on intestinal biopsies and recognizing a narrow set of symptoms. The food options were limited and often unappetizing. There were no dedicated grocery aisles, no labeling standards, and very little public awareness.
The first international attempt to define “gluten-free” came in 1979, when the Codex Alimentarius Commission (the food standards body run jointly by the UN’s Food and Agriculture Organization and the World Health Organization) adopted a standard for gluten-free foods based on the nitrogen content of raw ingredients. These early standards were crude by today’s measures, but they marked the first time any governing body tried to put a formal definition behind the term.
Better Tests Changed Everything
A quiet revolution in the 1990s and 2000s reshaped understanding of how common celiac disease actually is. Older screening methods, based on anti-gliadin antibody testing, were eventually replaced by more sensitive and specific blood tests for tissue transglutaminase (tTG) autoantibodies. These newer tests made screening faster and more accurate, which meant more people got diagnosed. Studies using these tools revealed that celiac disease affected roughly 1 in 100 people in many populations, far more than previously believed. Suddenly, gluten-free wasn’t a fringe concern. It was relevant to millions of people who had gone undiagnosed for years or even decades.
The Early 2000s: Grocery Stores Take Notice
By the mid-2000s, retailers were beginning to respond. An NBC News report from that era noted that Whole Foods Market listed more than 800 gluten-free items, up from about 250 seven years earlier. Walmart began stocking gluten-free products on its shelves. Whole Foods even operated a dedicated Gluten-Free Bakehouse in North Carolina. These were early signs of a shift: gluten-free was moving from specialty health food stores into mainstream retail.
The products themselves were changing, too. Early gluten-free bread and pasta had a reputation for tasting like cardboard. As demand grew, food manufacturers invested in better formulations, and the quality gap between gluten-free and conventional products started to narrow.
The 2010s Boom
The real explosion happened between 2010 and 2015. Google search data tells the story clearly: in 2010, only about 14% of countries worldwide had “gluten-free diet” among their top three dietary search terms. By 2014, that figure had tripled to nearly 42%. By 2018, half of all countries tracked showed gluten-free diet as a top search interest. American spending on gluten-free foods surpassed $15.5 billion in 2016 alone.
Several forces converged to fuel this surge. Books like “Wheat Belly” (2011) and “Grain Brain” (2013) made bestseller lists by arguing that wheat and gluten were harmful for everyone, not just celiac patients. Celebrity endorsements followed. Tennis star Novak Djokovic credited a gluten-free diet with improving his performance. Gwyneth Paltrow promoted it through her lifestyle brand. Media coverage snowballed, and gluten-free shifted from a medical diet into a wellness trend.
In 2011, a panel of international experts met in London to formally recognize a condition called non-celiac gluten sensitivity. Their consensus, published in 2012, acknowledged that some people experienced real symptoms from gluten without having celiac disease or a wheat allergy. This gave scientific weight to what many people were reporting anecdotally, though the experts noted the definition was provisional and needed refinement. The recognition of this new category expanded the potential audience for gluten-free products well beyond celiac patients.
Official Labeling Rules Arrive
On August 2, 2013, the U.S. Food and Drug Administration issued a final rule defining “gluten-free” for food labeling. For the first time, any product carrying a “gluten-free” label in the United States had to meet a specific standard: less than 20 parts per million (ppm) of gluten. This threshold was based on research suggesting it was safe for most people with celiac disease. Before this rule, “gluten-free” on a package was essentially an honor system. The FDA regulation gave consumers, especially those with celiac disease, confidence that the label actually meant something.
The Codex Alimentarius had revised its own international standard in 2008, adopting the same 20 ppm threshold and endorsing a specific lab test (the R5-ELISA) as the method for verifying gluten content. Countries across Europe, Australia, and South America adopted similar thresholds, creating a more consistent global framework.
Where the Market Stands Now
The global gluten-free food market was valued at $6.3 billion in 2025, with projections reaching $13.2 billion by 2036 at a growth rate of about 7% per year. That growth is notable because it persists long after the initial trend peaked. The market has settled into something more durable than a fad, sustained by a combination of diagnosed celiac patients, people with non-celiac gluten sensitivity, and a substantial group who choose gluten-free eating as a lifestyle preference.
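For readers who want to check the projection, the implied growth rate follows from the standard compound-growth formula. A quick sketch, using only the dollar figures quoted above:

```python
# Sanity-check the market figures: what annual growth rate takes
# $6.3 billion (2025) to $13.2 billion by 2036?
start_value = 6.3   # global market value in 2025, in $ billions
end_value = 13.2    # projected value in 2036, in $ billions
years = 2036 - 2025

# Compound annual growth rate (CAGR): (end / start)^(1/years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"{cagr:.1%}")  # → 7.0%, consistent with "about 7% per year"
```

Doubling in roughly eleven years at 7% a year is exactly what the rule-of-70 approximation (70 ÷ 7 ≈ 10 years to double) would predict, so the two projected figures are internally consistent.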
That last group is worth noting. Research has found no evidence that a gluten-free diet benefits people who don’t have a gluten-related medical condition. For those without celiac disease or gluten sensitivity, avoiding gluten can mean missing out on whole grains, fiber, and certain vitamins that are abundant in wheat-based foods. The trend has been a genuine gift for people who medically need these products, making their daily lives vastly easier. For everyone else, it’s a dietary choice with trade-offs rather than clear benefits.