
Fake weight-loss study example of wider problem


Another interesting article in The Conversation – Trolling our confirmation bias: one bite and we’re easily sucked in by Will Grant. It underlines a point I have often made – that the sensible reader must approach the scientific literature intelligently and critically.
Grant describes a “scientific” prank which fooled many news outlets, which reported the “scientific finding” and thereby fooled many of their readers.

“Last week science journalist John Bohannon revealed that the whole study was an elaborate prank, a piece of terrible science he and documentary film makers Peter Onneken and Diana Löbl – with general practitioner Gunter Frank and financial analyst Alex Droste-Haars – had set up to reveal the corruption at the heart of the “diet research-media complex”.”

The first trick

This was more than just planting a fictitious “science” story:

“To begin the study they recruited a tiny sample of 15 people willing to go on a diet for three weeks. They divided the sample into three groups: one followed a low carbohydrate diet; another followed that diet but also got a 42 gram bar of chocolate every day; and finally the control group were asked to make no changes to their regular diet.

Throughout the experiment the researchers measured the participants in 18 different ways, including their weight, cholesterol, sodium, blood protein levels, their sleep quality and their general well being.”

So – that was the first trick. “Measuring such a tiny sample in so many ways means you’re almost bound to find something vaguely reportable.” As Bohannon explained:

“Think of the measurements as lottery tickets. Each one has a small chance of paying off in the form of a “significant” result that we can spin a story around and sell to the media. The more tickets you buy, the more likely you are to win. We didn’t know exactly what would pan out — the headline could have been that chocolate improves sleep or lowers blood pressure — but we knew our chances of getting at least one “statistically significant” result were pretty good.”
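Bohannon’s lottery-ticket analogy is just the multiple-comparisons problem in statistics. A minimal sketch in Python illustrates it – the 18 measurements and the conventional 5% significance threshold come from the story above; everything else (seed, trial count) is purely illustrative. Under the null hypothesis (chocolate has no effect on anything), each test’s p-value is uniformly distributed, so the chance of at least one “significant” hit can be computed directly and checked by simulation:

```python
import random

random.seed(42)

ALPHA = 0.05        # conventional significance threshold
N_MEASURES = 18     # outcomes measured in the prank study
TRIALS = 100_000    # Monte Carlo repetitions (illustrative)

# Analytic: probability of at least one false positive among
# 18 independent tests at the 5% level.
analytic = 1 - (1 - ALPHA) ** N_MEASURES

# Monte Carlo check: under the null, each p-value is uniform
# on [0, 1]; draw 18 and see whether any falls below alpha.
hits = sum(
    any(random.random() < ALPHA for _ in range(N_MEASURES))
    for _ in range(TRIALS)
)
simulated = hits / TRIALS

print(f"analytic  P(at least one 'significant' result) = {analytic:.3f}")
print(f"simulated P(at least one 'significant' result) = {simulated:.3f}")
```

Both come out at roughly 60% – so even with no real effect at all, a study like this is more likely than not to hand its authors a “statistically significant” headline. (The tests in the real study were on the same 15 people and so not independent, but the point stands.)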

Publication

Now to get credibility they needed to publish in a scientific journal:

“But again, Bohannon chose the path that led away from truth, picking a journal from his extensive list of open access academic journals (more on this below). Although the journal, (International Archives of Medicine), looks somewhat like a real academic journal, there was no peer review. It was accepted within 24 hours, and published two weeks later.”

Now for the publicity

Bohannon then whipped up a press release to bait the media:

“The key, Bohannon stated, was to “exploit journalists’ incredible laziness” – to write the press release so that reporters had the story laid out on a plate for them, as it were. As he later wrote, he “felt a queazy mixture of pride and disgust as our lure zinged out into the world”. And a great many swallowed it whole.

Headlines around the world screamed Has the world gone coco? Eating chocolate can help you LOSE weight, Need a ‘sweeter’ way to lose weight? Eat chocolates! and, perhaps more boringly, Study: Chocolate helps weight loss.”

We should be concerned at the way the news media and reporters handle such matters:

“None did the due diligence — such as looking at the journal, looking for details about the number of study participants, or even looking for the institute Bohannon claimed to work for (which exists only as a website) — that was necessary to find out if the study was legitimate.”

This criticism, unfortunately, applies to almost anything in our news media. It really is a matter of “reader beware.”

Grant summarises the process that leads to such devious “science” stories in the media:

  • we’ve got researchers around the world who have taken to heart the dictum that the quantity of research outputs is more important than the quality
  • we’ve got journal publishers at the high quality end that care about media impact more than facts
  • we’ve got journal publishers at the no-quality end who exploit the desperation of researchers by offering the semblance of publication for a modest sum
  • we’ve got media outlets pushing their journalists ever harder to fill our eyeballs with clickbaity and sharebaity content, regardless of truth
  • and we’ve got us: simple creatures prone to click, read and share the things that appeal to our already existing biases and baser selves.

Problem wider than the diet industry

Bohannon gives his prank as an example of a “diet research-media complex … that’s almost rotten to the core.” I agree readers should be far more sceptical of such diet-related science stories. But the problem is far wider than that industry. I think it is particularly relevant to any area where people are ideologically motivated, or where their feelings of inadequacy or danger can be manipulated.

Take, for example, the anti-fluoride movement. I have given many examples on this blog of science being misrepresented, or poor-quality science being published and promoted by this movement. There are examples of anti-fluoride scientists doing poor-quality research, often relying on “statistical fairy tales”; of shonky journals being used to get poor-quality work published; and of such work making its way through inadequate journal peer-review processes.

These anti-fluoride researchers, and their allied activist groups, commonly use press releases to promote their shonky findings. Social media like Facebook and Twitter are roped in to spread the message even more widely.

There is also a link with big business interests – in this case an active anti-fluoride “natural” health business-research-media complex.

So readers beware – there are people, businesses and ideological interests out there attempting to fool you. And they are not averse to using shonky or false science, biased press releases and lazy journalists to do this.

See also: A rough guide to spotting bad science from Compound Interest.



Spotting Bad Science

Compound Interest has produced another great infographic in their series. This one helps us to spot bad science.

You can download the current version as a PDF here.

It is worth thinking about each of the suggested 12 criteria.

I particularly liked that it advises one to carefully evaluate scientific papers even when they are published in a reputable journal. A good journal and peer review is not a guarantee that the paper is faultless or that its findings can be accepted without proper consideration.

Dishonesty of intelligent design “research”

In my recent post Creationists prefer numerology to real scientific research I discussed the “research” approach used by those few scientists who are proponents of intelligent design. And I concluded:

“they ignore the normal honest research approach. They never advance a structured hypothesis, one that is consistent with intelligent design. They therefore never submit such hypothesis to any testing or validation.”

Behe

Michael Behe is Professor of Biological Sciences at Lehigh University in Pennsylvania. He works as a senior fellow with the Discovery Institute’s Center for Science & Culture.

Recently I noticed another blatant example of this lack of scientific honesty – the refusal to propose and test their own hypotheses of intelligent design. It’s a quote that seems to be going around the religious apologist blogs at the moment. For example, have a look at True Paradigm: Monday quote, The Big Bad Wolf, Theism and the Foundations of Intelligent Design – Page 13, or Still Speculating After All These Years at Contra Celsum.

It’s a quote from Michael J. Behe‘s book Darwin’s Black Box: The Biochemical Challenge to Evolution – this is the short form.

“The overwhelming appearance of design strongly affects the burden of proof: in the presence of manifest design, the onus of proof is on the one who denies the plain evidence of his eyes.”

Michael J. Behe, Darwin’s Black Box: The Biochemical Challenge to Evolution p 265.

Notice the problem?

Behe is asserting that he has no need to produce any evidence, outline a structured hypothesis, or do anything to test or validate his claim.

He simply has to make an assertion – based on nothing more than his claim of an “overwhelming appearance” (to him). Then it is up to those with different hypotheses to do all the work: to test his assertion (note – a vague assertion, not a structured hypothesis) and prove him wrong.

Or else he declares his assertion correct by default!
