Nature Neuroscience is currently undertaking an initiative to improve statistics and methods reporting.
Following an initiative of the NIH/NINDS (Landis et al.), Nature Neuroscience is testing a new scheme to improve reporting and thereby reduce bias and faulty statistics in the work it publishes. Authors must fill out a very detailed checklist, which is then sent to the reviewers. Other journals have used checklists before, but they were pro forma and mostly ignored. This checklist looks very bureaucratic, and authors will hate it, but it covers all the relevant questions: statistics, including power; bias, including blinding and randomization; ethics; detailed reporting of strains, animal husbandry, and materials; design of functional imaging studies; and so on. It forces authors to think about these issues, and if they have not done their homework before designing the experiments, they must either withdraw the submission or fake entries on the sheet. Faking entries would put them in clear violation of good scientific practice, compared to simply not reporting that they used an underpowered design without blinding and randomization (which is the current practice)…
Nature Neuroscience should be commended for this initiative. Hopefully other journals will adopt the policy, and reviewers will take the time to study the checklists so they can request clarification from the authors or pull the plug on publication of flawed papers.