Scientific integrity

Are scholarly publications any more reliable or trustworthy than the general press? Does the publishing process build in sufficient checks and balances, or is it deeply flawed? Reports of fraudulent or retracted articles in the scholarly literature might suggest a crisis in scholarly publishing. A number of sting operations have tested the publishing process and exposed publishers willing to publish bogus articles dressed up as scholarly work. Retraction Watch, founded by Ivan Oransky and Adam Marcus, tracks journal articles that have been retracted for any reason—some legitimate, such as an author admitting that mistakes were made in the original submission, and some more serious, such as forced retractions because the underlying science or its reporting was indeed fraudulent.

Airing this dirty laundry keeps the publishing industry in check, but it also captures the attention of the popular press, with the predictable fallout of sensationalism. Last October’s cover story in The Economist, “Unreliable Research: Trouble at the Lab,” is one visible, highly cited example. Stories on so-called predatory publishers and the rising number of retracted articles support the not-so-veiled accusation that the scholarly publishing business has serious quality-control problems, if not unbridled issues with integrity.

How can researchers and publishers effectively respond? A session entitled “Ethics and Trust in Journal Publishing: How Sound Is the System?,” presented at the 2014 Spring Conference of the International Association of Scientific, Technical and Medical Publishers, addressed this concern. The presenters included editorial directors Ivan Oransky and Chris Graf, science journalist John Bohannon, and independent researcher Phil Davis.

Retraction Watch has grown into a valuable auditing service for the journal business. Without question, a fully retracted article is the most egregious error that can occur in this form of communication. Reputable journals post and archive an article’s official version of record and append any errata or notices of retraction that subsequently occur. Retraction Watch adds another layer of transparency to that error correction. And because it covers all fields of scholarly publication, it provides some measure of industry-wide statistics. The numbers are telling: they are very small in comparison to publication totals. Retraction Watch posted approximately 500 newly retracted articles in 2013. Compare that with the nearly 2 million articles published in more than 28,000 scholarly publications last year; fully retracted articles amount to roughly 0.025% of the annual publication volume. Also of note, the large majority of the retractions listed on Retraction Watch are in biomedical or clinical fields. Those research areas face considerably harder problems in establishing reproducible starting conditions (cell lines, animal cohorts, well-characterized reagents, etc.) than other areas within the physical sciences. Practitioners in medical fields are aware of these problems and are taking steps to improve the reproducibility of experimentation by redoubling certification and testing procedures.
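The retraction-rate figure above follows from simple arithmetic on the approximate counts cited in this editorial (500 retractions against roughly 2 million published articles); a quick sketch:

```python
# Back-of-the-envelope retraction rate, using the approximate figures
# cited above (both are rough estimates, not exact counts).
retractions_2013 = 500          # new retractions posted on Retraction Watch in 2013
articles_2013 = 2_000_000       # scholarly articles published in 2013

rate_percent = 100 * retractions_2013 / articles_2013
print(f"Retraction rate: {rate_percent:.3f}% of annual output")
# prints "Retraction rate: 0.025% of annual output"
```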

Last fall, Science reporter John Bohannon wrote about his sting operation, which uncovered a cohort of largely “pay-to-play” publishers willing to publish almost anything vaguely scientific as long as the author forked over a publishing fee [“Who’s Afraid of Peer Review?,” Science 342, 6154 (2013)]. Bohannon submitted a paper on completely fabricated research to more than 300 open-access journals around the world; 255 responded, and an astounding 157 accepted the manuscript for publication. Most of those publishers were new, hailing from developing countries, but a few were established and respected in the industry. Bohannon followed his faux article’s path from submission, to author payment, to acceptance by the publisher. He also traced facade addresses in Western locations and the circuitous routes through which payments were funneled to hide the identities of certain publishers. By publicizing the existence of those shadowy enterprises and exposing their methods, he did the industry, our authors, and readers a great service. But the success of the venture may have biased his view as an investigative reporter; he expressed his own personal distrust of the industry.

I stress that it is important to take a step back and see these deficiencies in context. Quantifying retractions, as Oransky does, gives good definition to the problem. And categorizing the publishers exposed by Bohannon and by other sting operations, summarized by panelist and independent researcher Phil Davis, shows that the great majority are new to scholarly publishing and based in developing countries where a tradition of industry integrity is not yet ingrained.

The system is not perfect, but errata and retraction statistics point to a minuscule error rate compared with any other communications medium. More importantly, the system is self-correcting. It may take time, but bad or fraudulent science will eventually be smoked out—from the early-20th-century Piltdown Man to the Hendrik Schön affair at Bell Labs a decade ago. As a whole, scholarly publishers and the academic community practice due diligence to maintain the integrity of published works.

Several large publishers have begun conducting annual ethics audits to ensure that their policies and procedures are followed and are effective in ferreting out misconduct and substandard manuscripts. Chris Graf spoke about his experience reviewing such audits for Wiley. Graf also serves on the Committee on Publication Ethics (COPE), a forum in which editors and publishers of peer-reviewed journals discuss and advise on publication ethics. AIP Publishing journals belong to COPE, as do some 8,500 other scholarly journals from around the world. By belonging, editors and publishers acknowledge that the system is imperfect, but we nevertheless have faith in it and are committed to continually improving it. Science depends on our commitment.