Study finds false data top reason for journal retractions

Should Stephen Glass have penned medical journal articles instead? A look into the integrity of medical publications indicates that many researchers are guilty of fabrication. The study, published Monday in the Proceedings of the National Academy of Sciences, found that 43.4% of retracted articles were pulled over fraud allegations: false or fabricated data, or suspicion of fraud. Errors, by contrast, accounted for only 21.3% of the retractions the authors found among the 2,047 retracted articles listed in PubMed's index as of March 3. The study noted that PubMed references 25 million articles dating back to the 1940s; the earliest retracted article appeared in 1973 and was retracted in 1977. "Hence, retraction is a relatively recent development in the biomedical scientific literature, although retractable offenses are not necessarily new," they wrote.

The report noted that publications are not always open with readers, finding that even when journals acknowledge false data, they sometimes do so in code. Case in point: the journal Biochemical and Biophysical Research Communications discovered that a 1991 article it published included false and fabricated information, yet its retraction was worded as follows: “results were derived from experiments that were found to have flaws in methodological execution and data analysis.” The PNAS study's authors discovered the real reason, fraud, only because Harvard University reported it to the Office of Research Integrity and the PNAS trio read Harvard's report. The Office of Research Integrity, part of the Department of Health and Human Services, investigates allegations of research misconduct and evolved from a 1985 Congressional mandate.

This year's ORI tally includes a case in which an accused researcher “neither admits nor denies committing research misconduct but accepts ORI has found evidence of research misconduct,” as well as a researcher who “knowingly and intentionally fabricated and falsified data in portions of figures.” ORI closed 13 investigations last year, eight in 2010 and seven in 2009. Punishment varies and can include exclusion from working as a public health service adviser, bans from government-contracted research jobs and having research vetted by a third party for a specified period, among other measures.

Yet the suspect research can still appear fresh and uncontroversial to readers. The PNAS study noted that there is no uniform retraction policy, either in its execution or its announcement, and pointed out that the Journal of Biological Chemistry “routinely declines to provide any explanation for retraction.” The authors also found that a published retraction is often not enough to wipe a discredited study from the books. Two cited examples: a 2001 Nature article in which the authors disputed the conclusions carries no indication that it has been retracted, and a 2005 Science article “continues to be cited even though both the HTML and PDF versions are clearly marked as retracted and the PDF version includes a copy of the retraction notice.”

The Office of Research Integrity has also taken up this topic and hit some of the very same points in its quarterly newsletter in September 2011 (2012 has been particularly plagiarism-heavy; plagiarism accounted for 9.8% of the retractions the authors studied). The newsletter, which also serves as a printed record of case findings, acknowledged that journals may hold off immediately publishing a retraction in the interests of being fair, yet also noted that even if an institution weighs in and an investigation closes, “a retraction may never be published, because of other factors, such as an editor's decision, a journal ceasing to exist,” or because the data popped up in supplements or review articles that are considered “non-retractable,” so the information remains available without being anchored by a notice of concern or retraction. Like the PNAS researchers, the Office of Research Integrity also noted that “the text of the retractions, corrections or errata associated with falsified papers rarely explicates the details on which components of a study are false, and/or why.”

The agency tries to loop everyone in through Federal Register notices and tacks a comment onto PubMed search results, but this precaution does little for readers: finding it is a multi-link process, and “few readers are aware this information is available.”

The Committee on Publication Ethics has a set of retraction guidelines that recommend clearly identifying articles as retracted. The notice should appear on all electronic searches for the research and “should usually be reserved for publications that are so seriously flawed (for whatever reason) that their findings or conclusions should not be relied upon.” The committee's guidelines also tie the power of retraction to the situation: if editors have proof that retraction is needed, they should retract with or without the author's consent, but if the evidence is shaky and the university or institution isn't going to look into the allegations, editors should “issue an expression of concern” and state the reason for concern.