Monday, June 15, 2009

Another "Hoax Paper" Accepted For Publication

As much as we whine about the peer-review process and the prestige of certain journals that try to uphold very rigorous acceptance standards, reading something like this drives home the fact that, more often than not, quality beats quantity on any given day.

A hoax paper was accepted for publication in an open-access journal that I had never heard of until now {link available for free only for a limited time}.

The fake, computer-generated manuscript was submitted to The Open Information Science Journal by Philip Davis, a graduate student in communication sciences at Cornell University in Ithaca, New York, and Kent Anderson, executive director of international business and product development at The New England Journal of Medicine. They produced the paper using software that generates grammatically correct but nonsensical text, and submitted the manuscript under pseudonyms in late January.

Davis says he decided to submit the fake manuscript after receiving several unsolicited invitations by e-mail to submit papers to open-access journals published by Bentham under the author-pays-for-publication model. He wanted to test if the publisher would "accept a completely nonsensical manuscript if the authors were willing to pay".

Davis was informed by Bentham on 3 June that his manuscript was accepted for publication. The publisher requested that Davis pay US$800 to its subscriptions department, based in the United Arab Emirates, before the article was published. At this point, Davis retracted the article.
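
As an aside, the "software that generates grammatically correct but nonsensical text" mentioned above is reportedly of the SCIgen variety, which works by recursively expanding a hand-written context-free grammar. As a rough illustration of that idea only (the grammar rules and names below are made-up placeholders, not the actual generator's), a minimal sketch in Python might look like this:

import random

# Toy context-free grammar in the spirit of such generators: every rule expands
# into grammatically plausible but meaningless academic-sounding prose.
# These rules are invented for illustration; they are not SCIgen's grammar.
GRAMMAR = {
    "SENTENCE": [
        ["In this paper we", "VERB_PHRASE", "that", "NOUN_PHRASE", "is", "ADJECTIVE", "."],
        ["NOUN_PHRASE", "must", "VERB", "NOUN_PHRASE", "."],
    ],
    "NOUN_PHRASE": [["the", "ADJECTIVE", "NOUN"], ["our", "NOUN"]],
    "VERB_PHRASE": [["demonstrate"], ["argue"], ["confirm"]],
    "VERB": [["synthesize"], ["refine"], ["emulate"]],
    "ADJECTIVE": [["stochastic"], ["metamorphic"], ["extensible"]],
    "NOUN": [["framework"], ["methodology"], ["algorithm"]],
}

def expand(symbol: str) -> str:
    """Recursively expand a grammar symbol into a string of terminal words."""
    if symbol not in GRAMMAR:  # terminal: return the word(s) as-is
        return symbol
    production = random.choice(GRAMMAR[symbol])
    return " ".join(expand(s) for s in production)

if __name__ == "__main__":
    # Print a short "abstract" of five nonsense sentences.
    for _ in range(5):
        print(expand("SENTENCE").replace(" .", "."))

The point, of course, is that the output parses as English but says nothing, which is exactly why any referee who actually read the manuscript should have caught it.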

The editor of the journal is resigning in a dispute with the publisher. I find it very strange that the publisher claims the paper was reviewed by more than one referee, and yet the editor-in-chief is unable to access or verify that any such review took place. Typically the editor, of all people, is the one who assigns the referees and has full access to all the reviews. So something is definitely fishy here.

This is reminiscent of the Alan Sokal hoax in "Social Text", even though that hoax was aimed at an entirely different target than this one. Still, in both cases the review process failed, and something that is essentially garbage made it through and got published. This is a clear example of what can happen without proper scrutiny and review.

Of course, someone might ask, "But ZapperZ, even prestigious journals like Nature and Science are not immune to publishing such 'hoaxes', as illustrated by the Hendrik Schön debacle."

Ah, but there's a difference here. When someone forges data or behaves unethically, it is very difficult to spot, as opposed to the nonsensical papers written by Sokal and in this latest case. Any reviewer who knows the subject matter can immediately spot the kind of "word salad" contained in those papers. In Schön's case, it wasn't easy to spot because the deception was in THE DATA itself. It requires others to try to reproduce the experiment as a means of verifying what was obtained, i.e. it requires the wheel of independent scientific verification to kick in. You'll never see such a word-salad manuscript getting published in Nature, Science, PRL, etc.

Zz.
