
Academic Competition Fueling Citation Manipulation

In academia, your reputation as a scientist depends on the quality and quantity of your publications, as well as the number of citations they receive. Citations give recognition to the scientist who discovered a finding and was the first to publish it. Therefore, the number of researchers who cite your work in their publications can help assess your academic performance. It shows that fellow researchers were able to build on your findings and that your work was useful to the research field. Reports such as the Highly Cited Researchers List, published by the Web of Science, fuel the competitiveness of academics. However, every coin has two sides, and so does the competition for citations in academia.

Citation Ranking

The Highly Cited Researchers List is compiled annually by the Web of Science. This citation database indexes scholarly research content to form a powerful search engine of the world's scientific information. With this information, it publishes annual reports that rank journals and authors. The 2019 Highly Cited Researchers List was recently released. This prestigious list recognizes the world's top researchers, those whose citation records ranked in the top 1% of their field in the Web of Science database. Being on the Highly Cited Researchers List means you are an influential and successful researcher, and that is quite an honor.

Are Researchers Manipulating Citations?

Unfortunately, this need for citations has led to an increasing number of cases of citation manipulation. This is because research is intensely competitive. Researchers compete to have their work published in the most prestigious journals so that they have a better chance of being awarded research funding. Ultimately, your citation ranking is taken as an indication of your academic performance: the higher your citation count, the more successful your academic career.

An extreme example of citation misconduct resulted in a highly cited researcher being banned from a journal's editorial board. Kuo-Chen Chou, an editor at the Journal of Theoretical Biology (published by Elsevier), shockingly manipulated the peer review process. He asked authors of papers he was reviewing to cite his work and to add an algorithm he had developed to their papers. Chou also reviewed papers under a pseudonym, and as a result, several papers listed this pseudonym as a co-author during the final stages of the review process. In fact, three other journals picked up on Chou’s “coercive citation.”

This citation manipulation earned Chou a short-lived spot on the Highly Cited Researchers List from 2014 to 2018. However, Clarivate Analytics (Web of Science) detected the misconduct and removed scientists with unusually high levels of self-citation from the list.

Elsevier is looking into ways of detecting citation manipulation to prevent this type of misconduct from occurring in the future.

Citation Misconduct – Effect on the Academic Community

This type of misconduct diminishes the credibility of the citation ranking method. The scientific community has come to rely on the citation count as a reliable way to rank journals and authors. Review boards now need to come up with other ways to assess researchers for funding grants. They need to focus on the quality of the research, something that is much more difficult to do than analyzing citation data. With the increasing number of PhDs graduating each year, and more scientists competing for funding, assessing performance is going to become even more challenging in the future.

How to Cite Correctly

If you have ever written a scientific report, you will be aware of the large number of citations you need to support your research. As a researcher, keep the following rules in mind to avoid citing incorrectly:

  1. Cite the original author who first published the finding.
  2. Ensure you read and understand the entire article that you are citing to prevent misinterpretations. A misinterpretation of the literature is often repeated in citations by other authors, and before you know it, there is confusion in the field about the topic, broadening the incorrect-citation problem and contributing to scientific myths.
  3. Avoid excessive self-citation. One research project may result in several publications that build on each other over time, so citing your own previous work is ethical. However, when self-citations become excessive, it is bad science and unethical.
  4. Senior authors should not insist that junior authors list them as co-authors in a publication if they did not contribute substantially to the paper. This is a common, yet unethical, practice in academia.

Honest Ways to Boost Your Citations

Scientists are creative problem solvers and have found legitimate ways of increasing the visibility of their research papers. By increasing the exposure your article gets, you increase the likelihood of your research being cited. You can promote your research through the following channels:

  1. Open-Access Journals: These journals are free for readers and therefore more accessible than expensive subscription-based journals. As a result, an article published in an Open-Access (OA) journal is likely to be read, and therefore cited, more often than one published behind a subscription paywall. OA journals are growing in popularity.
  2. Social Media: Research that is actively promoted on social media platforms attracts a higher number of citations. These platforms include Facebook, LinkedIn, Instagram, Twitter, blogs, Mendeley, and ResearchGate. Again, the more visible and accessible your research is, the more researchers will read it, and the better the chances of your work being cited. There are also drawbacks to using social media to promote research. One of these is that members of the general public may comment on your research with very little understanding of the science. However, with a bit of effort, you can manage your social media presence and increase the exposure your research receives.

Do you think scientific databases will be able to find methods to detect citation manipulation? This could enable the use of citations as an accurate measure of journal and author ranking going forward. Let us know in the comments section below.
