On this page
- The Influence of the Seventh-Day Adventist Church
- 1898
- The 1950s: The Rise of Ancel Keys and the “Seven Countries Study”
- The “Seven Countries Study”: A Landmark, But Flawed
- But Here’s the Catch: Methodological Mayhem
- The Sucrose Connection
- The Foundation of Dietary Guidelines (Despite the Flaws)
- 1960s: Sugar Steps into the Spotlight
- Early 1970s: Economic Shocks & the Rise of Cheap Food
- Nixon, Gold, and Food Prices
- Enter Earl Butz and Subsidies Galore
- The Consequences
- Early Seeds of Dietary Change
- The Mid-1970s: The “Great Nutritional Experiment” Begins
- HFCS Comes to America (and Changes Everything)
- The McGovern Report and the 1977 Dietary Goals
- The Low-Fat Craze Takes Off
- The Framingham Heart Study: A Missed Opportunity
- The Minnesota Coronary Survey: A Suppressed Counter-Narrative?
- Beyond the 1970s: Cementing the Low-Fat Dogma
- Challenging the Hypothesis (and Failing): The MRFIT Study
- The Lipid Research Clinic Coronary Primary Prevention Trial (LRC-CPPT): “Proof” at a Price
- The NIH Consensus Conference: “Sign It or Stay In”
- The Food Guide Pyramid: Reinforcing the Message
- Conclusion: Unraveling the Low-Fat Legacy
The Influence of the Seventh-Day Adventist Church
Founded by Ellen G. White, who, following a traumatic brain injury, professed to believe in an impending apocalypse and a need to purify both body and soul, the church held a strong conviction that the food supply needed to be altered to reduce meat consumption. The church sincerely believed that abstaining from meat was essential for both spiritual well-being and physical health, seeing meat as a cause of “carnal desires” and various health problems.
1898
Enter John Harvey Kellogg. Hired by Ellen White and tasked with developing a meat substitute for breakfast, Kellogg, a man of… unconventional ideas, created Kellogg’s Corn Flakes. But this was no ordinary breakfast cereal; Kellogg viewed it as an “anti-masturbatory” food, designed to suppress the sex drive, particularly in young men. (Yes, you read that right.)
Kellogg wasn’t just a cereal inventor, though. He was a celebrity doctor in his day, a prominent speaker, and the author of numerous books. He advocated extreme (and frankly, horrifying) measures to “purify” people of sexual urges. He even promoted applying carbolic acid to the clitorises of young girls and performed surgeries without anesthesia.
It sounds bizarre, and well, it is. But here’s where it connects back to our larger story: the Seventh-Day Adventist Church, with its emphasis on a plant-based diet, contributed to a shift away from traditional, meat-heavy meals. And according to Lysiak, some research even suggests that eating more grains and less meat can lower one’s sex drive. (Whether that’s a good thing or not is a topic for another blog post!) And Loma Linda University, an arm of the Seventh-Day Adventist Church, has been a source of observational studies critical of red meat.
After Kellogg, Lenna Cooper stepped in and co-founded the American Dietetic Association, which became the institutionalized arm of nutrition in government.
The 1950s: The Rise of Ancel Keys and the “Seven Countries Study”
If the Seventh-Day Adventist Church laid some of the groundwork for a shift away from meat-heavy diets, it was Ancel Keys who truly ignited the firestorm that would become the saturated fat debate. The 1950s marked the beginning of Keys’ rise to prominence, and his research would profoundly shape dietary recommendations for decades to come – for better or for worse (and, as we’re exploring, possibly for the worse). With President Eisenhower’s heart attack in 1955, public fear drove attention and money towards finding scientific explanations.
The year was 1952, and the setting was a World Health Organization meeting in Amsterdam. It was there that Ancel Keys presented his central hypothesis: “fatty diet raised serum cholesterol caused atherosclerosis and myocardial infarction.” In simpler terms, Keys proposed that eating a diet high in fat (particularly saturated fat) raised cholesterol levels in the blood, which in turn led to the build-up of plaque in the arteries (atherosclerosis), ultimately causing heart attacks (myocardial infarction). It sounded straightforward, even logical. But the evidence behind this hypothesis, as we’ll see, was far from rock solid.
The “Seven Countries Study”: A Landmark, But Flawed
To support his hypothesis, Keys embarked on what would become his most famous (or infamous, depending on who you ask) project: the “Seven Countries Study.” Launched in 1958 and continuing into the 1970s, this ambitious study aimed to examine the relationship between diet, lifestyle, and heart disease in different populations across the globe. The countries included were: the United States, Japan, Italy, Greece, Yugoslavia, the Netherlands, and Finland.
Keys and his team collected data on dietary habits, cholesterol levels, and rates of heart disease in these different countries. The team performed the first multivariate regression analysis without the help of computers. The initial results seemed to support his hypothesis: countries with higher intakes of saturated fat, like Finland and the United States, also had higher rates of heart disease. Conversely, countries with lower saturated fat intakes, like Japan and Greece, had lower rates of heart disease.
The study’s impact cannot be overstated. It became a cornerstone of the saturated fat hypothesis, providing seemingly compelling evidence that saturated fat was a major culprit in heart disease. This study and Keys’ ideas rapidly gained traction, influencing doctors, public health officials, and eventually, dietary guidelines.
But Here’s the Catch: Methodological Mayhem
Despite its widespread influence, the Seven Countries Study has been heavily criticized for its methodological flaws. In fact, some critics have stated the study methodology wouldn’t “stand today even amongst an 8th grader trying to do a science fair project”. Let’s delve into some of the most significant issues:
- The 22 Countries Study: Keys had actually been analyzing every country in the world that had data for two things: mortality (how people died) and food availability (how much and what kind of food was produced). 22 countries met those criteria, and yet he cherry-picked the 7 countries that neatly fit his desired trend to support his hypothesis. This is a clear example of selection bias: including only the data that supports a pre-existing conclusion.
- Food Availability Data: The study didn’t actually measure what people were eating. It only used data about how much food was produced in each country as a proxy, which failed to account for imports, exports, and food waste – factors that would have painted a much more accurate picture of the population’s diet.
- Mortality Data: Determining someone’s cause of death is a process of reducing a complicated story into a neat bucket. For example, someone with diabetes might end up in the hospital with sepsis, need an amputation, and then die from blood-clot complications – and this person’s death certificate would simply say “Heart Disease.” If that was the quality of data in the US, other countries, reporting without autopsies, were doing an even worse job. Furthermore, the study showed that increased fat consumption was correlated with a decreased rate of mortality from every other disease. But that never gets talked about.
- Ecological Study Design: The study was an “ecological” study, meaning it looked at data at the population level rather than at the individual level. This makes it difficult to establish direct cause-and-effect relationships. Just because a country consumes more saturated fat and has higher rates of heart disease doesn’t prove that saturated fat causes heart disease in individuals within that country.
- Confounding Factors: The study failed to adequately account for other factors that could have influenced heart disease rates. Things like physical activity, smoking, stress levels, and access to healthcare were not properly controlled for.
- Data Manipulation/Destruction: Keys manipulated or selectively presented the data to strengthen his argument. Furthermore, the analysis was done without computers, and with the original raw data from the study having been lost or destroyed, no one has been able to independently verify his findings.
- Conflicts of Interest: Keys had received funding from the sugar industry.
The Sucrose Connection
“The fact that the incidence rate of coronary heart disease was significantly correlated with the average percentage of calories from sucrose in the diets is explained by the intercorrelation of sucrose with saturated fat.”
This is a direct quote from Keys’ own published work. He acknowledged that the same correlation was observed between increased sugar consumption and heart disease. Yet, he decided to publish the version that agreed with his theory.
The Foundation of Dietary Guidelines (Despite the Flaws)
Despite all these criticisms, Keys’ theory that saturated fat causes heart disease, though lacking solid evidence, became the foundation for dietary guidelines. It’s a remarkable example of how a flawed study can have a lasting and profound impact on public health. The stage was set for a decades-long battle against saturated fat, a battle that continues to this day.
1960s: Sugar Steps into the Spotlight
As Ancel Keys’ influence grew, the message to limit saturated fat began to permeate the medical and public health communities. But while saturated fat was being demonized, another player was quietly positioning itself to take center stage: sugar. The 1960s witnessed some key developments that would profoundly affect the trajectory of dietary advice, not least of which was the advent of High Fructose Corn Syrup (HFCS).
The sugar industry, feeling threatened by the growing focus on saturated fat, decided to take matters into its own hands.
- Paying for “Science”: In 1965, the Sugar Research Foundation (SRF) approached Dr. Frederick Stare, then chair of Harvard’s School of Public Health Nutrition Department, and paid Harvard researchers the equivalent of $48,900 in today’s dollars to conduct a literature review on sugar, fat, and coronary heart disease. It’s important to note that the SRF was funded by the Sugar Association, a powerful trade group representing the interests of the sugar industry.
- A “Sweet” Conclusion: The funded review was published in 1967 in a top medical journal. The conclusion? Reducing cholesterol and saturated fat was the only dietary intervention needed to prevent heart disease. In other words, sugar was essentially given a free pass.
- The Catch? No Disclosure: Critically, the funding and industry influence were not disclosed at the time of publication. Readers of the review would have had no idea that the research was essentially bought and paid for by the sugar industry.
In 1966, Japan successfully developed high fructose corn syrup. While it might not seem significant at first glance, this seemingly minor technological advancement would have major ramifications in the coming decades. HFCS offered a much cheaper alternative to sucrose (table sugar), making it an attractive option for food manufacturers looking to cut costs and sweeten their products. Little did anyone know, this cheap sweetener would soon become ubiquitous in the American food supply, contributing to a surge in sugar consumption and, potentially, a host of health problems.
Early 1970s: Economic Shocks & the Rise of Cheap Food
The 1960s ended with a growing debate over sugar versus fat, and the 1970s opened with a series of economic shocks that would reshape the American food landscape in ways few could have predicted. This was an era of government intervention, economic upheaval, and the beginning of a long-term shift toward cheaper, more processed foods. The key players? President Richard Nixon and his Secretary of Agriculture, Earl Butz.
Nixon, Gold, and Food Prices
In the early 1970s, governments were acutely aware of the link between political instability and rising food prices. The price of food was becoming a serious political concern. To understand what happened next, we need to grasp a key point about economic policy. Arthur Burns, Nixon’s chairman of the Federal Reserve, recognized that rising food prices were a politically sensitive issue. He believed there were essentially two ways to manage those prices (or at least the perception of those prices):
- Suppress Information: Remove or downplay elements in the Consumer Price Index (CPI) that accurately reflected the true cost of essential goods, particularly food. This is a way of masking the real effects of inflation.
- Intervene Directly: Subsidize food to make it artificially cheaper. This, of course, requires government spending and can lead to unintended consequences, as we’ll see.
The first domino to fall was the decoupling of the dollar from gold in 1971. For years, foreign banks (though not the American public) had been able to redeem their dollars for gold, which provided a restraint on the government’s ability to simply print more money. This kept inflation relatively in check.
However, on August 15th, 1971, in a speech now known as the “Nixon Shock,” Nixon unilaterally severed the dollar’s direct convertibility to gold. This effectively ended the Bretton Woods system and ushered in an era of fiat currency – money that is not backed by a physical commodity like gold. Government leaders were very aware that food prices were going to rise extensively.
Enter Earl Butz and Subsidies Galore
With the dollar no longer tied to gold, the government faced the challenge of managing rising food prices. Volatility of food prices affected presidential approval so Nixon tasked Earl Butz, the Secretary of Agriculture, with lowering and stabilizing food prices. Butz’s solution was to make food artificially cheaper at the expense of quality.
Butz directed billions of dollars in subsidies to the corn, soy, and sugar industries, with the goal of consolidating American farmland. The idea was that by subsidizing these key agricultural commodities, the government could lower the cost of the American food supply for the consumer, thereby offsetting the inflationary pressures of the new fiat currency system. It essentially amounted to gaslighting: instead of revealing to Americans the real consequences of decoupling and monetary inflation, the food supply was shifted so that Americans wouldn’t really notice the rising cost of food.
The Consequences
- Overabundance of Cheap Crops: Subsidies led to a massive overproduction of corn, soy, and sugar.
- Rise of Processed Foods: This surplus of cheap ingredients fueled the rise of processed foods, which rely heavily on corn syrup, soybean oil, and sugar.
- Relative Increase in Whole Food Costs: As processed foods became cheaper and more readily available, the cost of nutrient-dense whole foods, particularly meat, increased relative to processed alternatives. This made it more difficult for people to afford healthy, unprocessed foods.
- A Gaslighting Campaign: Fiat money was also directed toward institutions such as Loma Linda University to produce studies validating religious convictions rather than conducting actual science.
Early Seeds of Dietary Change
Interestingly, even amidst this economic upheaval, dietary recommendations were beginning to shift, foreshadowing the low-fat era to come. In 1970, the USDA, AMA, and AHA all called for dietary fat reduction, with recommendations to limit saturated fats to less than 10% of calories.
The underlying (and ultimately flawed) logic went something like this: dietary fat raises LDL cholesterol (A leads to B), and elevated LDL cholesterol is correlated with cardiovascular disease (B is associated with C). Therefore, dietary fat must cause cardiovascular disease (A leads to C). But the chain breaks at its second link: B was only ever shown to be correlated with C, not to cause it. A correlation between B and C can arise from a shared confounder, so establishing that A raises B and that B is associated with C does not establish that A causes C.
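The fallacy can be made concrete with a toy simulation (purely illustrative numbers, nothing to do with real nutrition data): variable A genuinely raises B, but B correlates with C only through a hidden confounder U, so A and C end up essentially unrelated.

```python
import random

random.seed(0)

# Illustrative toy model: A really does raise B, but C is driven
# entirely by a hidden confounder U, which also feeds into B.
n = 10_000
data = []
for _ in range(n):
    a = random.random()             # "A" (e.g., the exposure under study)
    u = random.random()             # hidden confounder "U"
    b = 0.5 * a + 0.5 * u           # "B": A genuinely raises B
    c = u + 0.1 * random.random()   # "C": driven by U alone, not by A
    data.append((a, b, c))

def corr(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

a_s, b_s, c_s = zip(*data)
print(corr(a_s, b_s))   # strong: A raises B
print(corr(b_s, c_s))   # strong: B and C correlate (via U)
print(corr(a_s, c_s))   # near zero: A does not cause C
```

Both of the first two correlations come out strongly positive while the third hovers near zero: exactly the situation in which chaining "A raises B" with "B correlates with C" leads to a wrong conclusion about A and C.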
Even more disconcertingly, there’s evidence that the Fiat money printer was directed toward Loma Linda University, an arm of the Seventh Day Adventist Church, to produce studies validating religious convictions (like the belief that red meat is harmful) rather than conducting actual science. It’s important to note that these studies are often observational, meaning they can show correlations but cannot prove causation.
The events set the stage for an all-out assault on fat and laid the foundation for dietary guidelines that would prioritize carbohydrates over fats. But the question remains: were these guidelines based on sound science, or were they influenced by the economic and political pressures of the time, combined with the influence of specific interest groups like the Seventh-Day Adventist Church?
Despite this shift, in 1971, most Americans still ate meat for breakfast, lunch, and dinner. This disconnect between the emerging dietary advice and actual eating habits highlights why Arthur Burns felt the need to “manage” the public’s perception of rising food costs by tinkering with the Consumer Price Index.
The Mid-1970s: The “Great Nutritional Experiment” Begins
The mid-1970s marked a watershed moment in the history of American dietary advice. The economic and political pressures of the early part of the decade, combined with the lingering influence of Ancel Keys and the sugar-versus-fat debate, culminated in the first set of official U.S. Dietary Guidelines. In essence, this was the beginning of a vast, uncontrolled experiment on the American population, with potentially disastrous consequences.
HFCS Comes to America (and Changes Everything)
Before we delve into the guidelines themselves, it’s crucial to acknowledge a key ingredient that entered the American food supply in 1975: High Fructose Corn Syrup. As we discussed earlier, this cheaper alternative to sugar was a boon to food manufacturers. It allowed them to create sweeter, more palatable products at a lower cost, further fueling the rise of processed foods.
The McGovern Report and the 1977 Dietary Goals
The official name was the Dietary Goals for the United States, but most people know it as the McGovern Report. This report, driven by the Senate Select Committee on Nutrition and Human Needs (chaired by Senator George McGovern), was a landmark document that sought to provide dietary recommendations for the entire nation. This was a pivotal moment.
In 1977, the committee came under pressure from Congress to give the public an answer. Mark Hegsted, a nutrition professor at Harvard, led a group of scientists in producing the report. The two primary recommendations were: increase carbohydrate consumption to 55-60% of total calories, and reduce fat consumption to 30% of total calories.
However, the path to these recommendations was far from smooth. Scientists pushed back, arguing there was insufficient evidence to support such sweeping dietary changes. George McGovern, the chair of the panel, countered that senators don’t have the luxury that research scientists do, which is to wait until every last shred of evidence is in. They needed to act, he argued, even if the science wasn’t fully settled.
Philip Handler, the president of the National Academy of Science, vehemently opposed this view, asking, “What right has the federal government to propose that the American people conduct a vast nutritional experiment with themselves as subjects on the strength of so little evidence?” The debate was fierce, highlighting a fundamental tension between the need for public health guidance and the limitations of scientific knowledge.
Hegsted presented what he called the “fallback position,” essentially arguing that the question should not be “why should we change our diet?” but “why not?”. He believed that there were no identifiable risks associated with the proposed changes and that important benefits could be expected. This argument leaned heavily on the precautionary principle.
Another key aspect of the report was the unintended consequences of these dietary recommendations. The recommendation to increase carbohydrates was made because something had to replace the fat that was being removed. However, a staff member who helped write the report later lamented that this seemingly simple recommendation inadvertently “opened a Pandora’s Box for an entirely new culture of engineered food products.” A recommendation to limit the consumption of sugar was buried due to lobbying forces. The intention had been to encourage the consumption of fruits and vegetables, but instead, it paved the way for a flood of processed foods high in refined carbohydrates and sugar.
The Low-Fat Craze Takes Off
Despite these concerns, the die was cast. The McGovern Report and the subsequent Dietary Guidelines officially ushered in the era of the “low-fat” diet. The food industry jumped right on board following the government’s lead and fueled the low-fat craze, creating everything from low-fat salad dressing to fat-free yogurt and low-fat desserts.
However, removing fat from processed food made it taste bad, so the fat was replaced with either high fructose corn syrup (55% fructose) or sucrose (50% fructose).
As good citizens, we listened wholeheartedly, which might explain our current obesity problem. Quite simply, this turned out to be the largest uncontrolled experiment ever done on human beings and it failed miserably.
The Framingham Heart Study: A Missed Opportunity
Intriguingly, even as the McGovern Committee was crafting its low-fat recommendations, the long-running Framingham Heart Study was yielding data that painted a more nuanced and, in some ways, contradictory picture of heart disease risk. While the Framingham study had been instrumental in identifying cholesterol as a risk factor, newer research was drilling down into specific “fractions” or subclasses of cholesterol to improve the prediction of heart disease.
In the early 1970s, the NIH funded a large cohort study across five different locations: Framingham, Albany, Honolulu, Puerto Rico, and San Francisco. The researchers analyzed data from 900 patients who had experienced a coronary event and 900 matched controls from their respective cohorts.
At the time, directly measuring LDL cholesterol was impossible; they used a formula based on total cholesterol, HDL cholesterol, and triglyceride levels to estimate LDL. What they discovered was significant:
- LDL: A Marginal Risk Factor? In the prospective Framingham cohort of roughly 1,800 to 2,000 people, an elevated level of LDL cholesterol was found to be only a marginal risk factor. This was a significant departure from the prevailing wisdom.
- LDL Patterns: Pattern A LDL particles are large and buoyant and are not correlated with heart disease. Pattern B LDL particles are small and dense and are known to be correlated with heart disease. It was also shown that fat consumption increases Pattern A while carbohydrate consumption increases Pattern B.
- HDL and Triglycerides: The Real Story? More importantly, the study found that low HDL (high-density lipoprotein) was the single greatest predictor of cardiovascular risk in both men and women, across all age groups from 30 to 80 years old. Elevated triglyceride levels were almost equally strong in predicting cardiovascular risk in all but one subgroup (women under 50).
- Cholesterol in Older Women: In women over 50, cholesterol actually had no predictive value regarding heart disease.
- Diet’s Limited Impact: The authors of the study even noted that diet, as measured in the study, did not fully explain the range of serum cholesterol levels within the Framingham study group. In other words, what people ate didn’t necessarily correlate with their cholesterol levels.
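The LDL estimation described above was, in all likelihood, a formula like the Friedewald equation (published in 1972), which backs LDL out of the three measured quantities. A minimal sketch, assuming the standard mg/dL form and using hypothetical example values:

```python
def estimate_ldl(total_chol: float, hdl: float, triglycerides: float) -> float:
    """Estimate LDL cholesterol (mg/dL) via the Friedewald formula:
    LDL ≈ total cholesterol − HDL − triglycerides / 5,
    where TG/5 approximates the cholesterol carried in VLDL particles.
    """
    if triglycerides > 400:
        # The approximation is known to break down at high triglyceride levels.
        raise ValueError("Friedewald estimate unreliable for TG > 400 mg/dL")
    return total_chol - hdl - triglycerides / 5

# Hypothetical example values, all in mg/dL:
print(estimate_ldl(200, 50, 150))  # → 120.0
```

This is why the study’s “LDL” numbers were themselves derived values rather than direct measurements, one more layer of approximation in the data.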
Given that it is well-known that saturated fat raises HDL, while carbohydrates lower it, these findings should have prompted a major re-evaluation of the saturated fat hypothesis. However, the presenter in this clip says that this story went effectively ignored and unnoticed until the mid-1990s.
After the publication of the Framingham study, the lead author, Gordon, reportedly presented the findings to the NIH, but they were viewed as completely irrelevant and uninteresting. The presenter even notes that the Framingham risk calculator over one year actually showed a decrease in risk for the subjects on the high-fat, low-carb diet.
Why was this groundbreaking research seemingly dismissed? The dominance of the saturated fat hypothesis, combined with the political and economic pressures to release the Dietary Goals, likely contributed to the oversight. The Framingham study’s findings, had they been heeded, could have altered the course of dietary advice and potentially prevented the “great nutritional experiment” that was about to begin.
The Minnesota Coronary Survey: A Suppressed Counter-Narrative?
This study, completed in 1973, had the potential to challenge the prevailing narrative about saturated fat, but its results were suppressed for years, preventing them from influencing the ongoing dietary debate. Ancel Keys was not directly involved in conducting the study, which was led by Ivan D. Frantz Jr. and colleagues who had connections with Keys at the University of Minnesota at the time.
The Minnesota Coronary Survey was a landmark research effort in that it differed from previous studies: it was a double-blind, randomized, controlled trial, the gold-standard design for establishing a causal relationship between saturated fat and heart health.
The study involved roughly 9,000 men and women in Minnesota state institutions, giving researchers an unusual degree of control over the participants’ diets. For four and a half years, one group consumed a diet with 9% saturated fat, while the control group consumed 18% saturated fat. Both groups had roughly the same total fat intake, but the treatment group had less saturated fat and more mono- and polyunsaturated fats. Because all of the patients’ food intake could be controlled, “you could really do good science”.
What did the study find?
- Cholesterol Reduction: The intervention diet did lower cholesterol, from 207 to 175 mg/dL, whereas the control group saw a smaller change, from 207 to 203 mg/dL.
- No Difference in Heart Disease or Mortality: Critically, after four and a half years, there was no statistically significant difference in heart disease or overall mortality between the two groups. In fact, there were 269 deaths in the intervention group and 206 deaths in the control group, though this difference was not statistically significant.
- Delayed Publication: The study was completed in 1973, but it wasn’t published until 1989 – a full sixteen years later! According to the presenter, the principal investigator delayed publication because he was “disappointed in the way the results turned out.” This delay is a tragedy because the presenter argues the study could have significantly influenced the debate around dietary fat during those crucial years.
As one source put it, the findings were neither released nor discussed; despite the absence of supporting evidence from the study, the American dietary guidelines moved forward. The main results were first published in 1989, 16 years after the study’s completion, in the journal Arteriosclerosis.
Years later, in 2014, an investigator from The New York Times looked into the study after remembering hearing about it as a child. The investigator was able to get in touch with the son of one of the researchers, who had all of the study’s boxes in his basement. The contents of the boxes revealed that the study did not validate Keys’ diet-heart hypothesis. The study indicated that lower cholesterol was associated with poor health outcomes and higher cholesterol was associated with better outcomes and lower mortality rates. When asked why the data wasn’t released earlier, one of the researchers said that they were “so disappointed in the results”. Additional raw data and previously unpublished records, including a 1981 master’s thesis by S K Broste, were recovered and re-analyzed in a 2016 publication in The BMJ.
The Minnesota Coronary Survey raises serious questions about scientific integrity and the selective presentation of data. Its suppression prevented a crucial counter-narrative from entering the public discourse, potentially contributing to the widespread adoption of low-fat diets based on incomplete and biased evidence.
Beyond the 1970s: Cementing the Low-Fat Dogma
Despite the mixed messages coming from research like the Framingham Heart Study and the suppressed Minnesota Coronary Survey, the low-fat message continued to gain momentum throughout the 1980s and 1990s. This period was marked by large-scale clinical trials, a highly influential NIH consensus conference, and the introduction of the Food Guide Pyramid, all of which reinforced the idea that saturated fat was a primary driver of heart disease.
Challenging the Hypothesis (and Failing): The MRFIT Study
One of the first major trials to test the low-fat hypothesis was the Multiple Risk Factor Intervention Trial (MRFIT). This $15 million study (approximately a quarter of a billion dollars in today’s dollars) involved 13,000 high-risk men aged 35 to 57.
Participants were randomized into two groups: a control group and a treatment arm. The treatment arm received instructions to stop smoking, medication to control blood pressure, and advice to consume a low-fat, low-cholesterol diet.
After seven years, the results were surprisingly underwhelming. There was no statistically significant difference in deaths between the intervention group (41.2 deaths per thousand) and the control group (40.4 deaths per thousand).
The MRFIT study, despite its size and expense, was essentially a negative study that challenged the hypothesis linking dietary fat and heart disease. It suggested that lowering cholesterol and reducing other risk factors might not necessarily translate into reduced mortality.
The Lipid Research Clinic Coronary Primary Prevention Trial (LRC-CPPT): “Proof” at a Price
The Lipid Research Clinic Coronary Primary Prevention Trial (LRC-CPPT) was an even more expensive undertaking than MRFIT. This study looked at 3,800 men with elevated cholesterol who were asymptomatic.
The intervention was twofold: a low-fat, low-cholesterol diet and a new class of medication called a bile acid sequestrant. Participants were randomized into two groups: one group received the low-fat, low-cholesterol diet and a bile acid sequestrant, while the other group received a placebo control.
After 10 years, mortality was 35.8 deaths per 1,000 in the intervention group versus 37.3 per 1,000 in the control group. While this amounted to a near-zero absolute risk reduction, it was statistically significant, albeit not necessarily clinically significant.
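The gap between “statistically significant” and “clinically significant” comes down to absolute versus relative risk reduction. A quick back-of-the-envelope calculation on the mortality figures quoted above (a sketch only; the trial’s actual statistics were of course more involved):

```python
def risk_reduction(control_per_1000: float, treated_per_1000: float):
    """Compute absolute and relative risk reduction from event rates per 1,000."""
    control_risk = control_per_1000 / 1000
    treated_risk = treated_per_1000 / 1000
    arr = control_risk - treated_risk   # absolute risk reduction
    rrr = arr / control_risk            # relative risk reduction
    return arr, rrr

# LRC-CPPT mortality rates quoted above: 37.3 vs 35.8 deaths per 1,000
arr, rrr = risk_reduction(37.3, 35.8)
print(f"Absolute risk reduction: {arr * 100:.2f} percentage points")  # ~0.15
print(f"Relative risk reduction: {rrr:.1%}")                          # ~4%
```

A roughly 4% relative reduction sounds meaningful in a headline, but the absolute difference is about 1.5 deaths per 1,000 people over a decade, which is why critics called the result statistically, but not clinically, significant.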
Despite the minimal clinical benefit, the LRC-CPPT study was hailed as definitive proof that “cholesterol is a killer.” Newspaper headlines blared the message, and the study had a profound impact on public perception. One article notes that this study resulted in the NYT front page publishing images of bacon and eggs with a sad face. The authors of the paper prudently noted that caution should be exercised before extrapolating the LRC findings to cholesterol-lowering drugs other than bile acid sequestrants. They added that the LRC was not designed to directly assess whether cholesterol lowering by diet prevents coronary heart disease.
The NIH Consensus Conference: “Sign It or Stay In”
The culmination of these efforts came in 1984 with the NIH Consensus Conference in Bethesda, Maryland, headed by Basil Rifkind, the NIH director of the LRC. Rifkind stated that it was now indisputable that lowering cholesterol with diet and drugs can cut the risk of developing heart disease and having a heart attack, and that the more you lower cholesterol and fat in your diet, the more you reduce that risk. Given the trial results we've just reviewed, one has to wonder where he was getting this information.
However, the process of reaching that consensus was highly questionable. According to one member of the panel, the members were essentially brought into a room in Bethesda and told, “This is the consensus, sign it.” They fought and disagreed for a day and a half before finally agreeing and signing.
According to Dan Steinberg, the co-investigator of the LRC study, if there had been a real consensus, there wouldn’t have been a need to have a consensus conference.
The conclusion of the NIH consensus statement was that there is no doubt that a low-fat diet will afford significant protection against coronary heart disease to every American over the age of two. So much for consensus: by that panel member's account, the members were not allowed to leave the room until they signed.
The Food Guide Pyramid: Reinforcing the Message
The message was clear: fat was the enemy. And in 1992, that message was visually reinforced when the first flawed guidelines were replaced by even worse recommendations: the Food Guide Pyramid. At the base of the pyramid sat carbohydrates, particularly refined carbohydrates like breads, pasta, rice, and cereals, of which we were told to eat six to 11 servings a day. Meanwhile, fats and oils were relegated to the very tip of the pyramid, to be consumed “sparingly.”
These carbohydrates break down into sugar, which gets stored in your body as fat. In addition to the 152 pounds of sugar we eat every year, we’re getting 146 pounds of flour that also breaks down into sugar. Altogether, that’s nearly a pound of sugar and flour combined for every American, every day! That’s a pharmacologic dose of sugar.
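The "nearly a pound a day" claim is easy to verify from the per-capita figures quoted above (152 lb of sugar and 146 lb of flour per year, as stated in this article; the figures are not independently sourced here):

```python
# Back-of-the-envelope check of the "nearly a pound a day" claim,
# using the annual per-capita figures quoted in this article.
sugar_lb_per_year = 152
flour_lb_per_year = 146

total_per_day = (sugar_lb_per_year + flour_lb_per_year) / 365
print(f"{total_per_day:.2f} lb of sugar and flour per person per day")
```

At roughly 0.8 pounds per person per day, "nearly a pound" checks out.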
In 2011, MyPlate—our government’s new, “improved” food icon—replaced the outdated Food Pyramid dating back to 1992. The idea was to simplify dietary recommendations and make them easier to understand at a glance. While MyPlate was arguably a slight improvement over the pyramid, it still perpetuated some of the core flaws of the low-fat paradigm.
Conclusion: Unraveling the Low-Fat Legacy
From the unusual influence of the Seventh-Day Adventist Church to the rise of Ancel Keys, the economic pressures of the 1970s, and the flawed science that underpinned the McGovern Report, we’ve journeyed through a complex and often contradictory history of dietary advice. We’ve seen how a confluence of factors – including questionable research, industry influence, and well-intentioned but ultimately misguided policies – led to the widespread adoption of low-fat diets.
Despite data from the Framingham Heart Study and the suppressed Minnesota Coronary Survey pointing to a more nuanced understanding of heart disease risk, the message that saturated fat was the enemy became firmly entrenched. The MRFIT and LRC-CPPT trials, despite their mixed results, were used to further reinforce this message, culminating in the influential NIH consensus conference and the introduction of the Food Guide Pyramid. MyPlate was arguably a slight improvement, but the low-fat paradigm still dominated.
But perhaps we must ask: was the low-fat diet helpful? If it were, we should find that people are healthier today than in the 1950s. Instead, we see ever-rising rates of obesity, type 2 diabetes, and heart disease.
The consequences of this “great nutritional experiment” are now becoming increasingly clear. It’s time to critically re-evaluate the low-fat dogma and consider the possibility that we’ve been focusing on the wrong dietary villain all along.
This is just the beginning of the story. We encourage you to explore our other articles on sugar, healthy fats, and the latest research in nutrition to gain a deeper understanding of these complex topics. Knowledge is power, and by educating ourselves, we can make more informed choices about our diets and our health.