SCIENCE

Blatant Fraud Discredits 43 Major Scientific Studies

Mar 27, 2015 at 4:29 PM ET

Some scientific researchers are rigging the peer review process to speed up publication and defraud the scientific community. This week, BioMed Central officially retracted 43 papers published in its family of nearly 300 journals, covering topics from prostate cancer diagnostics to repairing broken ankles.

Major journals have retracted roughly 170 papers since 2013, according to Retraction Watch, a blog that monitors research integrity. In the most egregious cases, publishers believe, third-party organizations are fabricating both papers and peer reviews on behalf of their clients, under the guise of helping foreign scientists translate their work into English.

China appears to be the biggest offender: the majority of the papers BioMed Central retracted on Friday came from major Chinese universities. Occasionally the plagiarism is so sloppy that the borrowed phrases don’t even make sense in context. A Scientific American investigation found dozens of papers featuring the phrase “Begger’s funnel plot,” all from China.

“There is no such thing as a ‘Begger’s funnel plot’…A statistician named Colin Begg and another statistician named Matthias Egger each invented tests and tools to look for biases that creep into meta-analyses. ‘Begger’s funnel plot’ appears to be an accidental hybrid of the two names.”
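For readers curious what the garbled phrase was reaching for: Egger’s test checks a meta-analysis for the funnel-plot asymmetry that small-study bias produces. A minimal sketch of that idea, in Python with NumPy (the function name and the synthetic data below are illustrative, not drawn from any of the retracted papers):

```python
import numpy as np

def egger_test(effects, std_errors):
    """Egger's regression test for funnel-plot asymmetry.

    Regresses each study's standardized effect (effect / SE) on its
    precision (1 / SE). An intercept far from zero suggests the
    small-study bias that a funnel plot is meant to reveal.
    Returns (intercept, t_statistic).
    """
    effects = np.asarray(effects, dtype=float)
    se = np.asarray(std_errors, dtype=float)
    y = effects / se              # standardized effects
    x = 1.0 / se                  # precisions
    n = len(y)
    # Ordinary least-squares fit: y = a + b * x
    X = np.column_stack([np.ones(n), x])
    coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    s2 = resid @ resid / (n - 2)  # residual variance
    var_intercept = s2 * np.linalg.inv(X.T @ X)[0, 0]
    a = coef[0]
    return a, a / np.sqrt(var_intercept)

# Illustrative use on made-up study results:
intercept, t_stat = egger_test(
    [0.10, 0.22, 0.15, 0.31, 0.25],   # effect sizes
    [0.05, 0.10, 0.08, 0.20, 0.15],   # standard errors
)
```

Begg’s test probes the same question with a rank correlation between effects and their variances; the two are distinct methods by distinct statisticians, which is why their fused name is such a clear fingerprint of copying.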

Thousands of Americans are already suspicious of scientific research, and intentional deception doesn’t help. When it comes to clinical trials there is a deeper concern: junk science can allow dangerous prescription drugs to reach the market.

In response to these concerns, BioMed Central announced Friday that it will attempt to close the loopholes in its submission and review system that fraudsters exploited.

“A sad reality is that this problem is sourced at a higher level than publishers alone can tackle,” writes Elizabeth Moylan, a senior editor at BioMed Central. “Science is set up with perverse incentives that reward scientists for impact and productivity, rather than for the quality of their research.”