Nature reducing its irreproducibility

#MSResearch  #EAE Time to improve your experiments

It is well known that many findings published in the top journals turn out to be not quite right.

Some people say that over 50% of these papers may be crud.
Why is this the case?

(a) An honest error: that really was what the experimental result told you. If you aim to be first, you don't know what is right until people try to replicate it; if they can't, it takes a load of studies to break the dogma created by the first one.

(b) You are racing to get the result before you are scooped; you do an experiment that goes the way you want and that's QED, end of story. Shame it was the fluke that would turn out to be the odd one out.

(c) Being liberal with the truth: you spin the story to up the importance of the work and get that big hit.

(d) You don't know what's round the corner that will catch you out for being liberal with the truth.

(e) I am sure you can think of more ... and, God forbid,

(f) In a small fraction of cases it is simply made up.
If your career depended on getting the right result, would you make it up? You and the team need integrity.

Having just spent three years generating the tools to do the killer experiment that could give the Nature/Science paper (you hope), only to find it doesn't work the way you want, I can see the temptation.

(g) However, one thing we have been banging on about is that the reporting of experiments is so vague that no-one has a clue what was actually done, so it is impossible to replicate the work properly.



We showed that Nature was doing a bad job when it came to MS work and setting a bad example.

It is therefore interesting that Nature has just issued reporting guidelines as part of its publishing process. This will no doubt begin the change that we have been talking about. In Nature's own words:


"From next month, Nature and the Nature research journals will introduce editorial measures to address the problem by improving the consistency and quality of reporting in life-sciences articles. To ease the interpretation and improve the reliability of published results we will more systematically ensure that key methodological details are reported, and we will give more space to methods sections. We will examine statistics more closely and encourage authors to be transparent, for example by including their raw data.


Central to this initiative is a checklist intended to prompt authors to disclose technical and statistical information in their submissions, and to encourage referees to consider aspects important for research reproducibility (go.nature.com/oloeip)."

Whilst this process will not remove crud, it should mean that experimental designs are changed so that cruddy results are avoided in the first place. However, it will be important that Nature monitors and enforces this.

The neuroimmunology/EAE field needs to embrace this because, as we have shown, people need to up their game.
