Apparently, the star of the hit 2006 mockumentary “Borat” has a second career as an esteemed professor of philosophy — at least if you believe the references of “Evaluation of transformative hermeneutic heuristics for processing random data,” a paper recently published by the Romanian journal Metalurgica International. This same B. Sagdiyev is listed as an author of “The epistemologies of the dice, a practical considerations (sic)”; that paper, of course, does not exist, nor does the organization that purportedly published it. But the paper containing the reference is very real: an intentional “gotcha!” engineered by a group of Serbian professors to expose the serious issue of insufficient peer review in scientific journals.
Journals, by necessity, deal with very specialized articles whose vocabulary and methods may be fully understood only by a small group of experts in a particular field. As the editors of a journal often lack this expertise, they forward papers to the peers of the authors, who are tasked with evaluating the rigor and merit of the science. A paper is published only if it meets the standards of these reviewers; when the reviewers themselves submit work, it may in turn be evaluated by the scientists they just reviewed. This reciprocity is meant to ensure scientific honesty and a basic standard of quality for the publication.
Yet some journals, catering to a high-pressure academic world of “publish or perish” — in which professors are expected to author a consistent stream of research or be fired — have relaxed their standards, giving substandard work a stamp of approval. The Serbian scientists were concerned by the procedures of such publications and decided to construct a nonsensical paper full of academic jargon and nonexistent citations, submitting it in the hope that a work so ludicrously bad would be rejected. Not only did Metalurgica International accept the paper, it did so without requesting a single correction.
Less blatantly obvious hoaxes can also slip through the cracks of peer review. In a recent “sting operation” organized by Science magazine, an error-ridden paper on an anticancer drug discovered in lichen was submitted to over 300 journals. Of the 255 publications that responded, only 106 performed any peer review at all, and 70 percent of the peer-reviewed journals accepted the paper. Many of these journals seemed more concerned with collecting a publishing fee than with checking the content of the paper; at one publication, the cost of getting the work into print was $3,100.
Even Science itself has issued a number of high-profile retractions due to the failure of peer review, including the 2010 withdrawal of a paper on human embryo cloning. While some retractions are inevitable given the progress of research, those due to the failure of peer review undermine the authority of scientific publications (and ultimately, that of science as a whole). As the rate of scientific research, especially in the developing world, continues to increase rapidly, it is of the utmost importance that this research be adequately scrutinized before being released to the public.