Wednesday 14 February 2018

Bad Science

If you received all your scientific information through the medium of our (er) media, you'd be forgiven for giving up. A few months of sensationalist headlines over vague stories that poorly interpret an already poor study, and then contradict or nullify the headline in paragraph 22 anyway, will wear you down. Check out Kill or Cure [1]. The media misreport and exaggerate every claim, and when their own misrepresentations contradict each other they start with "why oh why can't these so-called experts make their minds up".

It would be easy to conclude that "studies can prove anything". This is a big win for the industries and organisations who want to keep the truth obscured. "Doubt is our product", as they used to say in the tobacco industry. In reality, though, studies can't prove just anything. It's very hard for a study to prove the opposite of the truth without flat-out lying. There's no study in opposition to Esselstyn showing that the SAD (Standard American Diet) can reverse heart disease, because it can't. What people can do is produce studies that give a null result, that show there's no correlation at all.

This is quite easy to do. Suppose I wanted to show that smoking cigars is not unhealthy. I could take a group of people who all smoke 30 cigarettes a day, and give half of the group a cigar a day as well. After 3 months, I'd find that there was no significant health difference between the two groups. Bang, cigars aren't bad for your health. You'd think no one would fall for this, but it's common practice. The 2007 Qureshi study on egg consumption did exactly this. “To make [the] analysis representative of the US population” they chose subjects who were overweight (BMI >25) and had cholesterol levels of 220 mg/dl or higher (10% higher than the recognised safe level and 50% higher than the vegan average). They took those people, added one egg a day to their diet and found that this did not significantly raise their risk of heart disease - because they already had a high risk of heart disease due to their diet.
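
To make the arithmetic concrete, here's a minimal sketch in Python of why this kind of design is almost guaranteed to return "no significant difference". Every number in it is invented for illustration (none comes from the Qureshi study or anywhere else): a small extra risk stacked on top of a large baseline risk, in a modest group over a short follow-up, is indistinguishable from noise.

```python
# Illustrative only: every number here is a made-up assumption.
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(0)

n = 200                 # people per arm (hypothetical)
baseline_rate = 0.30    # event rate already caused by the 30-a-day habit (hypothetical)
extra_rate = 0.02       # small additional risk from the added cigar (hypothetical)

control = rng.random(n) < baseline_rate
treated = rng.random(n) < baseline_rate + extra_rate

table = [[int(treated.sum()), int(n - treated.sum())],
         [int(control.sum()), int(n - control.sum())]]
_, p_value = fisher_exact(table)

print(f"events: treated={treated.sum()}, control={control.sum()}, p={p_value:.2f}")
# p is almost always well above 0.05, so the trial "shows" the cigar is harmless -
# not because it is, but because its effect is swamped by the baseline risk.
```

Run it with different seeds and the p-value bounces around but rarely dips below 0.05. The only way to detect the cigar in this setup would be a far larger, longer study against a genuinely low-risk control group - which is exactly what this kind of design avoids.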

Another example is the 2010 Siri-Tarino study, which claimed that consumption of saturated fat did not cause heart disease. The study, trumpeted in the press and much loved by the meat industry, did this by comparing a high-fat, high-animal-protein, high-cholesterol diet against a low-fat... high-animal-protein, high-cholesterol diet. Essentially comparing steak and pork versus chicken and salmon, finding that each diet was as bad as the other, and “concluding” that saturated fat didn't cause heart disease [2].

Observational studies track the health of large populations and try to correlate the differences to diet, among many other factors. Interventional studies take a smaller group of people suffering from, or at risk of, a particular disease and see whether dietary changes have an effect. Both can be expensive, difficult to attract funding for, and complicated by the difficulty of unravelling correlation from causation (plus additional statistical pitfalls, detailed below, which can be deliberately exploited).

Reductionist studies, however, are much easier to run. They try to isolate variables by concentrating on biomarkers (such as blood pressure or cholesterol level). The problem is that while we know, for example, that higher cholesterol levels are associated with increased risk of heart disease (and other chronic illnesses), it is not valid to assume that changing the cholesterol level is the same thing as changing the risk of heart disease. It's just a marker. As Dr Campbell points out, it's like noticing that your lawn is going brown and trying to fix the problem by painting it green. It's a gross over-simplification of a very complex system.

The problem is that, even with the best of intentions, biomarkers are easy to measure while disease is much harder. It's much easier to show that a drug reduces blood pressure by 5 points than it is to show that it reduces the risk of (or successfully treats) the underlying problem(s) that caused the high blood pressure in the first place. This reductionist approach is the path of least resistance. The experiments are easier to perform and your results are concrete. They just might not be relevant. There is endless research to be done along the lines of "does chemical X affect biomarker Y?", and considerable profit to be made marketing the drug with the apparent benefit. Statins are a great example: they reduce cholesterol levels by a few points, but there's little or no evidence of long-term reduced risk. These studies aren't performed badly, or even ill-intentioned, but they're almost always pointless without accompanying research into the bigger picture.
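
As a toy illustration of the biomarker trap - not a model of any real drug, and both the numbers and the assumption of zero effect on outcomes are invented purely to mirror the argument above - here's a sketch where diet drives both cholesterol and heart-disease events, and a hypothetical pill lowers the measured cholesterol without touching the diet:

```python
# Toy surrogate-endpoint sketch: all values are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

diet_badness = rng.uniform(0, 1, n)                              # hidden driver of everything
cholesterol = 160 + 100 * diet_badness + rng.normal(0, 10, n)    # mg/dl
event_prob = 0.02 + 0.10 * diet_badness                          # long-term event probability

on_pill = rng.random(n) < 0.5
measured_chol = np.where(on_pill, cholesterol - 30, cholesterol)  # the pill moves the marker...
events = rng.random(n) < event_prob                               # ...but the outcome depends on diet only

for label, grp in [("pill", on_pill), ("no pill", ~on_pill)]:
    print(f"{label:8s} mean cholesterol = {measured_chol[grp].mean():5.0f} mg/dl, "
          f"event rate = {events[grp].mean():.3f}")
# The pill arm reports ~30 mg/dl lower cholesterol and an identical event rate:
# moving the marker is not the same as moving the disease.
```

A trial measuring only the biomarker would declare this pill a success; a trial measuring the outcome would find nothing, which is the whole point of the lawn-painting analogy.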

Now, it's also true that broader observational and interventional studies are open to the valid question, more often phrased as a statement, “correlation is not causation”. While I understand that this is colloquially brief, it would be much better with an extra word - “correlation is not necessarily causation”. It could be, though. Ideally you'd combine broad studies with more reductionist attempts to zero in on the root cause. However, the balance has shifted much too far towards the reductionist, to the point where reductionism and the study of complexity for its own sake dominate. Just look at the tens of billions spent on genetic research, the ultimate attempt to find out exactly how many angels can dance on the head of a pin [3].
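
For completeness, here's an equally minimal sketch (synthetic data, made-up variable names) of the confounding that "correlation is not necessarily causation" warns about: one hidden factor drives both the exposure and the outcome, so they correlate strongly even though changing one would do nothing to the other.

```python
# Synthetic data: a hidden "lifestyle" factor creates the whole correlation.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

lifestyle = rng.normal(size=n)                         # unobserved confounder
exposure = lifestyle + rng.normal(scale=0.5, size=n)   # e.g. some food, driven by lifestyle
outcome = lifestyle + rng.normal(scale=0.5, size=n)    # health measure, NOT affected by exposure

print(f"corr(exposure, outcome) = {np.corrcoef(exposure, outcome)[0, 1]:.2f}")
# Prints roughly 0.8 - a strong correlation with zero causation. Intervening
# on the exposure alone would leave the outcome exactly where it was.
```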

Going back to what you read in the papers, you need to do your own homework when evaluating a study. First of all, ignore the media report and go straight to the source. Secondly, check for conflicts of interest in the funding section. If you see an Egg Council or anything like that in there, be very suspicious: studies that contradict the interests of the people paying for them tend to be quietly buried rather than published. After that, there's nothing for it but to dig in and make your own evaluation of the sample sizes and the methodology. Read the conclusion and check that it matches up with what the article or headline said (all too often it flat-out doesn't). You can't rely on whoever's telling you about the study - and that includes me! Go and check out all the studies I reference and see what you think.

I saw a specific example of this a few weeks ago: a story claiming that “one third of vegans admit to eating meat when drunk”. It was a stupid story based on a “study” that turned out to be a telephone survey run by a coupon website. A better example popped up only a couple of days ago. You may have seen the headlines about how eating cheese reduces the risk of heart disease. It was a scam. This was a meta-analysis with at least these notable problems: 1) the studies were mostly dairy-industry funded; 2) people reported what they ate themselves, a notoriously unreliable method; 3) as above, the comparison was made with people already on a high-saturated-fat diet, so they were in effect replacing meat with cheese; and 4) it was a cross-sectional observational study, a study type which cannot establish this kind of relationship, for technical reasons explained in the sources below. You can bet that the people running this study knew these problems very well, but the studies were run out of a Dutch [4] university funded by the Chinese dairy company Yili, so you do the math.

A lot of people love to read good news about their bad habits; our right-wing media is particularly keen to bash veganism, given its (perceived) left-wing connotations; and animal agriculture is more than happy to provide puffed-up PR masquerading as research for the media to parrot. Doubt, remember, is our product.

[1] Kill or Cure, "Help to make sense of the Daily Mail’s ongoing effort to classify every inanimate object into those that cause cancer and those that prevent it."

[2] The source for both of these is Meatonomics by David Robinson Simon. Properly interpreted, the Siri-Tarino study actually confounds one of the bigger meat industry myths, that “lean meats” like chicken and salmon are healthier options. If that were true, their study really would have shown that higher fat consumption caused more heart disease.

[3] Particle physics might run it close. I hope that the penny might be dropping there though, as people face the realisation that it might well be turtles all the way down, and all the time and effort devoted to finding new kinds of spuons might be better focused on something more practical, like fusion energy.

[4] The Netherlands is the third-largest milk-exporting country in the world, quite remarkable given its size.

Further Reading :

Whole: Rethinking the Science of Nutrition by T. Colin Campbell

Meatonomics : How The Rigged Economics of Meat and Dairy Make You Consume Too Much and How To Eat Better by David Robinson Simon

Viewing :

Exposed : Media Claims Cheese is Heart-Healthy (Part 1)

Exposed : Media Claims Cheese is Heart-Healthy (Part 2)

The Saturated Fat Studies: Set Up to Fail

