The standard of peer review has recently come under scrutiny after several investigations uncovered suboptimal procedures and, in some cases, fraudulent behaviour. In an article in PLoS ONE, Jelte Wicherts (from the Department of Methodology and Statistics, Tilburg University, The Netherlands) highlights a study in which a seriously flawed paper was submitted to 304 open access (OA) journals. Worryingly, the paper was accepted by 157 of them. The analysis suggested that only 106 journals performed some sort of review, and a majority of these (70%) still went on to accept the article; the remaining acceptances occurred without any review at all. Jelte also highlights the recent attention focused on the problem of faked peer reviews. For example, last year, a number of journals retracted papers after an investigation by the publishers.
To try to address these issues and provide authors with some guidance on the quality of peer review, Jelte has created a 14-item questionnaire to measure transparency. The questions centre on the clarity of the peer review process, the selection of reviewers, publication ethics, governance, and overall transparency, with the requirement that all of this information be available via the journal’s website. Jelte tested the validity and consistency of his tool using three different approaches involving authors, publishing experts, and librarians. He used the results from these studies to revise and refine the form, and the end result was a tool that could effectively assess peer review transparency and give an indication of quality. Jelte accepts that further validation is needed and stresses that transparent peer review and high-quality peer review do not necessarily go hand in hand. Read Jelte’s article in full here.