John P. A. Ioannidis, MD, DSc, Director of the Stanford Prevention Research Center and Professor of Statistics, has been a champion for exposing bias, financial influence, and sloppy data interpretation in medical studies for decades. I remember reading one of his analyses 25 years ago in which he exposed the inaccuracies and bias in hypertension research that changed the course of many authorities’ thinking, as the data, delivered as authoritative fact by researchers and media, incorporated into clinical practice guidelines, proved completely unreliable.
In particular, Dr. Ioannidis has been a vocal critic of nutritional epidemiology research as unreliable and, more often than not, simply untrue. In a biting editorial in a recent issue of the Journal of the American Medical Association, he stated (emphasis mine):
“Some nutrition scientists and much of the public often consider epidemiologic associations of nutritional factors to represent causal effects that can inform public health policy and guidelines. However, the emerging picture of nutritional epidemiology is difficult to reconcile with good scientific principles. The field needs radical reform.
“In recent updated meta-analyses of prospective cohort studies, almost all foods revealed statistically significant associations with mortality risk. Substantial deficiencies of key nutrients (e.g., vitamins), extreme overconsumption of food, and obesity from excessive calories may indeed increase mortality risk. However, can small intake differences of specific nutrients, foods, or diet patterns with similar calories causally, markedly, and almost ubiquitously affect survival?
“Assuming the meta-analyzed evidence from cohort studies represents life span-long causal associations, for a baseline life expectancy of 80 years, nonexperts presented with only relative risks may falsely infer that eating 12 hazelnuts daily (1 oz) would prolong life by 12 years (i.e., 1 year per hazelnut), drinking 3 cups of coffee daily would achieve a similar gain of 12 extra years, and eating a single mandarin orange daily (80 g) would add 5 years of life. Conversely, consuming 1 egg daily would reduce life expectancy by 6 years, and eating 2 slices of bacon (30 g) daily would shorten life by a decade, an effect worse than smoking. Could these results possibly be true?”
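To see how a modest relative risk balloons into an absurd life-expectancy claim, here is a minimal sketch of the naive translation Dr. Ioannidis is criticizing. It assumes a constant-hazard (exponential) survival model and an illustrative hazard ratio of 0.87; neither figure comes from any specific cohort study.

```python
# Naive translation of a mortality hazard ratio (HR) into extra years of
# life under a constant-hazard (exponential) survival model, where life
# expectancy is the reciprocal of the hazard. All numbers are illustrative.

BASELINE_LIFE_EXPECTANCY = 80.0  # years

def naive_years_gained(hazard_ratio: float) -> float:
    """Scaling the hazard by HR scales life expectancy by 1/HR.
    Returns the implied gain in years over the baseline."""
    return BASELINE_LIFE_EXPECTANCY / hazard_ratio - BASELINE_LIFE_EXPECTANCY

# An HR of ~0.87 for daily nut eaters (a made-up but plausible-looking
# figure), read this way, implies roughly a 12-year longer life:
print(round(naive_years_gained(0.87), 1))  # -> 12.0
```

The absurdity is the point: if a handful of nuts really bought 12 extra years, the effect would dwarf every known medical intervention, which is exactly why such associations cannot plausibly be causal at face value.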
In other words, associations of food with some outcome—saturated fat and heart disease, red meat and colorectal cancer, coffee and diabetes, etc.—identified via observational data in nutritional epidemiology are almost always due to chance or bias, or are at least wildly exaggerated by researchers. In fact, virtually every food has been associated with cancer risk, for example. As has happened over and over again, findings from nutritional epidemiology crumble when scrutinized via randomized studies, i.e., studies in which participants are randomly assigned to eat or behave a certain way. Dr. Ioannidis notes that “Observational epidemiology doesn’t seem to square with randomized trials in clinical nutrition. A number of years ago I looked at the highly cited claims across all medicine, and when it came to observational studies five out of six of the most cited claims were refuted within a decade, typically by very large randomized trials.” Nutritional epidemiology is so utterly clouded by confounding factors, i.e., associations of, say, one food or nutrient with other foods, nutrients, and behaviors, that cause-effect connections are almost impossible to establish. (See my recent discussion about this including my favorite example of observational-data-gone-wrong: Premarin.)
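Confounding is easy to demonstrate with a toy simulation. In this sketch, a hidden “health-consciousness” trait drives both coffee drinking and lower mortality; coffee itself has zero causal effect, yet the crude comparison makes it look protective. Every parameter here is invented for illustration.

```python
import random

random.seed(0)

# Simulate 100,000 people. A hidden trait (health-consciousness) makes
# someone both more likely to drink coffee and less likely to die.
# Coffee has NO causal effect on mortality in this model.
n = 100_000
deaths_coffee, n_coffee = 0, 0
deaths_none, n_none = 0, 0

for _ in range(n):
    health_conscious = random.random() < 0.5
    # The hidden trait raises the chance of drinking coffee (0.7 vs 0.3).
    drinks_coffee = random.random() < (0.7 if health_conscious else 0.3)
    # Mortality depends ONLY on the hidden trait (0.05 vs 0.15).
    died = random.random() < (0.05 if health_conscious else 0.15)
    if drinks_coffee:
        n_coffee += 1
        deaths_coffee += died
    else:
        n_none += 1
        deaths_none += died

# Crude relative risk: coffee drinkers look protected, purely by confounding.
rr = (deaths_coffee / n_coffee) / (deaths_none / n_none)
print(f"apparent relative risk for coffee drinkers: {rr:.2f}")  # well below 1
```

An observational study of this population would report that coffee “reduces” mortality risk by roughly a third, and a headline writer would take it from there, even though the true causal effect is exactly zero.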
With these remarks, Dr. Ioannidis is taking aim (without naming names) squarely at the tons of nutritional epidemiological data that come from Drs. Frank Sacks, Walter Willett, Frank Hu, and others at the Harvard School of Public Health, who unabashedly publish findings from their dietary questionnaires, then state the results as cause-effect fact: “Eating red meat causes cancer.” “Healthy whole grains help you lose weight.” “Saturated fat causes heart disease.” Widely publicized findings from the Nurses’ Health Study and Physicians’ Health Study are reported as health facts by media when they are nothing of the sort, little more than speculation.
This is a big problem, as the USDA, the U.S. Department of Health and Human Services, the American Heart Association, and other agencies embrace findings of what is essentially fiction. Just over the past few weeks, headlines reading “Low-carb diets linked to dying young” and “Coconut oil deadly,” conclusions all drawn from observations originating with nutritional epidemiology, have dominated public consciousness—even though they are likely to be completely untrue. Also, the wild swings in these speculative reports—one day fat causes heart disease and the next day it does not—have created a confused, now cynical, public.
Bottom line: NEVER take health headlines at face value. Look at how and what a study is reporting before making any judgments. More often than not, the headlines are just plain wrong.
More from Dr. Ioannidis on YouTube:
The Role of Bias in Nutritional Research