The Growing Big Three Boycott: Understanding why scientists are publicly boycotting the Cell, Nature, and Science journals

By: Anna Badner

A publication in the prestigious Nature, Cell, or Science is considered the golden ticket to a successful science career. From superior job opportunities to more successful grants, a paper in these journals is a definite boost to any scientist’s reputation, a token of true scientific ability. Exceedingly competitive, these top-tier journals reject more than 90% of the manuscripts they receive, yet their submission rates continue to grow.1 With approximately 10 000 submissions per year for Nature and more than 12 000 for Science,1 the competition and selectivity to publish are clearly increasing. Nevertheless, despite the continuous growth in submissions, established scientists have begun to publicly criticize these elite journals. One notable example is biologist Dr. Randy Schekman, 2013 Nobel Laureate in Physiology or Medicine. Upon receiving the Nobel Prize, Schekman wrote a brief commentary in The Guardian,2 describing the damaging effects of “glamour journals.” In the article, he compares Nature, Cell, and Science to high-end fashion brands, arguing that their main aim is to sell subscriptions rather than to showcase the most important research. Marketing their product with the ploy of the Impact Factor, Dr. Schekman criticizes these journals for publishing only eye-catching, provocative science on “sexy subjects.”

Although Dr. Schekman’s piece has received its fair share of criticism, including accusations of evident hypocrisy, it does bring attention to certain emerging problems within the scientific community. One major concern, which Schekman touched upon, is the widespread use of the Impact Factor as a measure of research quality and productivity. Derived from citations to all articles in a given journal, the Impact Factor was originally established by the Thomson Reuters Corporation as a tool to help librarians identify journals to purchase.3 However, when applied by funding agencies, academic institutions, and other parties, the Impact Factor is often used as the primary parameter to evaluate an individual’s or an institution’s scientific contributions. Meanwhile, scientists broadly agree that the Impact Factor neither appropriately measures the quality of a scientific article nor reflects how influential the work is in its field.4 Thus, recognizing the misleading influence of the Impact Factor, scientists across various disciplines are voicing a call for change.

At the 2012 Annual Meeting of the American Society for Cell Biology in San Francisco, an extensive group of editors and publishers of scholarly journals developed the San Francisco Declaration on Research Assessment (DORA), a set of recommendations for improving the way the quality of research output is evaluated. With approximately 10 668 signatories to date, many of whom hold prestigious academic positions, DORA aims to have research assessed on its own merits. The document recommends that institutions and funding agencies weigh the “scientific content of a paper as much more important than publication metrics or the identity of the journal in which it was published.”5 Furthermore, it reminds researchers to cite the primary literature in which observations were first reported, giving credit to the appropriate scientists rather than yielding to the temptation of citing reviews. In addition to the positive reactions in academia, Thomson Reuters, Science,6 and Elsevier have all responded to DORA with relative optimism. There is a definite consensus for change, but the leaders of the scientific enterprise must take responsibility for transforming the evaluation process.

Another prominent critic of the system is theoretical physicist Dr. Peter Higgs, one of the key researchers to predict the existence of the Higgs boson. Despite his indisputably fundamental contributions to science, Dr. Higgs has openly stated that he himself would not have been able to survive the extremely competitive environment of modern academia.7 With growing productivity standards, scientists are under continuous pressure to publish frequently, especially in top-tier journals. Of even greater concern, these increasingly demanding expectations may actually drive scientists to publish unreliable or faulty results. A 2011 paper exploring journal retraction rates found a direct correlation between retraction frequency and journal Impact Factor,8 suggesting that the disproportionately high payoff of glamour-journal publications may encourage scientists to inappropriately interpret and/or present their data. Further complicating matters, researchers in India and China receive financial incentives to publish in journals like Nature, Cell, or Science.1 Significant salary increases and cash rewards are offered to those whose work is accepted by famous journals, as a way of showcasing their country’s world-class standards.9 While financial incentives raise ethical considerations in any context, the major concern is how such motivators may adversely affect the quality of science. After all, the scientific process relies heavily on trust between colleagues and collaborators, and rests on an unspoken commitment to research integrity. Academia must play its part and begin to praise quality over quantity and scientific merit over Impact Factor.

Lastly, no commentary on scientific publication and its big players would be complete without a discussion of the persistent controversy around open access (OA). While the cause is honorable, OA has been blamed for creating a “wild west” in science publishing. For those unfamiliar with this phenomenon, an investigation by Science magazine (a subscription-based journal) showed that a completely fake cancer-treatment paper, with meaningless data and significant errors, was accepted by approximately half (~150) of the OA scientific journals to which it was submitted, highlighting a defective peer-review process in online OA publishing.10 However, supporters and advocates of OA have countered that, despite the growing number of online journals (some of them unreliable), the peer-review process is broken in subscription-based as well as OA publications.11 One especially vocal advocate of OA publications, and critic of the aforementioned Science investigation, is UC Berkeley biologist Dr. Michael Eisen. A co-founder of the Public Library of Science (PLOS), Eisen is undeniably invested in OA and has repeatedly refuted the illegitimacy claims against online OA journals with accusations of hypocrisy. Dr. Eisen has easily identified some truly appalling and erroneous papers that cleared the supposedly rigorous Nature and Science peer-review processes. Furthermore, if historically wrong Cell, Science, and Nature papers are not enough to highlight this spectacle, the recent fiasco involving allegations of fraudulent data in a published Nature paper (concerning stimulus-triggered acquisition of pluripotency, i.e., STAP cells) may do the trick.12

Although the future of science is uncertain, it is clear that in this new digital age the classical models of the Impact Factor, journal subscriptions, and peer review are growing outdated. It is time for scientists at all stages of their careers to recognize these flaws and work towards solutions.

References:

1. Reich ES. Science publishing: The golden club. Nature. 2013; 502(7471): 291-293.

2. Schekman R. How journals like Nature, Cell and Science are damaging science. The Guardian. 2013 Dec 9 [Cited 2014 April 20]. Available from: http://www.theguardian.com/commentisfree/2013/dec/09/how-journals-nature-science-cell-damage-science

3. Vanclay JK. Impact factor: Outdated artefact or stepping-stone to journal certification? Scientometrics. 2012; 92: 211-238.

4. The PLoS Medicine Editors. The impact factor game. PLoS Med. 2006; 3(6): e291.

5. Schmid SL. Beyond CVs and Impact Factors: An Employer’s Manifesto. Science Careers. 2013 Sep 3 [Cited 2014 April 20]. Available from: http://dx.doi.org/10.1126/science.caredit.a1300186

6. Alberts B. Impact factor distortions. Science. 2013; 340(6134): 787.

7. Aitkenhead D. Peter Higgs: I wouldn’t be productive enough for today’s academic system. The Guardian. 2013 Dec 6 [Cited 2014 April 20]. Available from: http://www.theguardian.com/science/2013/dec/06/peter-higgs-boson-academic-system

8. Fang FC, Casadevall A. Retracted science and the retraction index. Infect Immun. 2011; 79(10): 3855-9.

9. Wang NX. China’s chemists should avoid the Vanity Fair. Nature. 2011; 476(7360): 253.

10. Bohannon J. Who’s afraid of peer review? Science. 2013; 342(6154): 60-5.

11. Basken P. Critics Say Sting on Open-Access Journals Misses Larger Point. The Chronicle of Higher Education. 2013 Oct 4 [Cited 2014 April 20]. Available from: http://chronicle.com/blogs/percolator/critics-say-sting-on-open-access-journals-misses-larger-point/33559

12. Normile D. Stem cell research: RIKEN panel finds misconduct in controversial paper. Science. 2014; 344(6179): 23.