There is a lot of thinking going on today about how research can be made more efficient, more robust, and more reproducible. At the top of the list are measures for improving internal validity (for example, randomization and blinding, or prespecified inclusion and exclusion criteria), measures for increasing sample sizes and thus statistical power, an end to the fetishization of the p-value, and open access to original data (open science). Funders and journals are raising the bar for applicants and authors by demanding measures to safeguard the validity of the research submitted to them.
Students and young researchers have taken note, too. I teach, among other things, statistics, good scientific practice, and experimental design, and I am impressed every time by the enthusiasm of the students and young postdocs, and by how they leap into the adventure of their scientific projects with an unbending will to “do it right”. They soak up suggestions for improving the reproducibility and robustness of their research projects like a dry sponge soaks up water. Often, however, the discussion ends unsatisfyingly, especially when we discuss the students’ own experiments and approaches to research. I often hear: “That’s all well and good, but it won’t fly with my group leader.” Group leaders tell them: “That is the way we have always done it, and it got us published in Nature and Science”; “If we do it the way you suggest, it won’t get through the review process”; or “We could then only get it published in PLOS One (or PeerJ, F1000Research, etc.), and the paper will contaminate your CV.”
I often wish that not only the students were sitting in the seminar room, but their supervisors alongside them!
It strikes me that the professional status of scientists stands conspicuously apart from that of other professions such as doctor, lawyer, caregiver, and yes, even national football referee. Scientists have no regulatory body, no code of professional ethics, and no obligatory continuing professional development! Real estate brokers and national football referees have all of that! Scientists, however, are and remain scientists simply by carrying out their activity, which almost always follows a specialized degree (e.g. diploma, doctorate). Pilots, by contrast, must prove every year that they have flown so-and-so many flying hours, some of them under the scrutiny of an instructor. Doctors, whether they run a practice, hold a license, or are employees, must prove to their medical association that they have earned 250 continuing medical education credits within five years. The idea behind all this is to ensure that the practitioner is up to date with the current standards of his or her profession. Why don’t scientists have to do that too? Are they not important enough? Does it not matter if something goes wrong because scientists did not realize that their field had already moved on? Does a scientist receive continuing education just by doing his or her job, i.e. automatically? Or does it have something to do with the freedom of science — a fear that any form of regulation would restrict creativity in the ivory tower?
Am I calling for another authority or regulatory office? Continuing professional education at conventions? Annual examinations for PhDs and professors, on pain of having their titles withdrawn and their scientific publications retracted? Of course not. And yet I find it disturbing when students have to educate their supervisors — that cannot work. It is also sobering when funders have to start training their reviewers and panelists. The most important funders in the UK — the Wellcome Trust, the MRC, and the British Cancer Foundation, for example — are already doing so. So are journals that take care of the continuing education of their reviewers, as does the renowned British Medical Journal.
Certainly, most scientists are absolute experts on the immediate object of their research. They read (or themselves write) the most current papers, hear the relevant lectures at conventions, and lead discussions at poster sessions. But only a small minority has qualifications in statistics, experimental design, or peer review — the central skills of their trade. Over the years they have established themselves as scientists in the system essentially through simple assimilation of current practice. Success as a supervisor, department head, or professor confers on them the right to do so. They have never really had to know what the p-value actually means (or does not mean); that the prior probability of their hypotheses decisively influences whether their results are false positives; what pseudo-replication and nesting are; or what regression to the mean is. And that is because their colleagues, including the reviewers of the funders and journals, did not know either! Now that this has come to light as an important cause of deficient reproducibility, robustness, and predictiveness, the newcomers are often better informed than the established scientists. And when something completely new arrives, such as open data (with common data elements, metadata, repositories, etc.) or electronic laboratory notebooks (protocols/registries), things get problematic.
So: no “registered scientist” (https://sciencecouncil.org/), but rather periodic continuing education on current topics of high general relevance — together with the students and postdocs! How could this be made real? Through attractive formats that demand little time, with the participants’ own ongoing research projects as the material. Attractiveness could be enhanced if participation were linked, for example, to the competitive distribution of funds. But have no fear: this will never happen! It would evoke fears of bureaucracy and tedium, of sitting out obligatory seminars, and it would be countered with the knockout argument of “scientific freedom”. Which is a pity, really, because it could lead to the professionalization of science and a better understanding of the robustness of its results.
A German version of this post has been published as part of my monthly column in the Laborjournal: http://www.laborjournal-archiv.de/epaper/LJ_18_06/28/index.html
See also my short piece ‘Health tips for researchers’ in Nature.