U.S. and South Korea Contributing Massively to Novel Biomedical Publications

Research in South Korea is progressing fast. According to a new study of over 20 million research papers, South Korea now ranks second only to the United States in novel biomedical research publications. The ranking adds another boost to South Korea’s already impressive record of scientific publication and research investment. Despite its small size, South Korea has surpassed its larger and more traditionally recognized neighbors, China and Japan, in the field of science. A combination of substantial financial investment and government support for scientific endeavors and institutes has laid the foundation for this success. The study also found that Singapore, Taiwan, and Ireland have made great advances in novel research over time.

The US Faces Increasingly Stiff Competition Abroad

Other nations are increasing their investment in and funding for scientific research, novel and otherwise, making it harder for the US to maintain its position in the field. The challenge is compounded by the future of scientific funding becoming more uncertain under the Trump administration. The domestic environment for scientists has also become uneasy as the administration reduces scientific advisory boards, and the travel ban affecting researchers has only worsened the situation. Although the US secured the first position, it finished just one percentage point ahead of South Korea, a lead so slim that it could easily disappear in the coming years.

A New Way to Evaluate “Novelty” in Research

How were the rankings calculated, and why the focus on novelty? Professor Mikko Packalen of the University of Waterloo, the study’s author, argues that nations that tend to produce novel research and work with new ideas can be considered “close to the edge of the science frontier.” In his study, he uses the term “edge factor” to describe this measure of novel research. Packalen used a text analysis tool to examine over 20 million paper abstracts and titles published between 2015 and 2016 in MEDLINE, the popular biomedical and life sciences database. The tool searched these abstracts and titles for newer phrases and terms from the Unified Medical Language System and categorized them by the affiliation of the first listed author. In this way, he created a new measurement of which countries engage most with new ideas.
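To make the idea concrete, here is a minimal sketch of an “edge factor”-style calculation. Everything in it is illustrative: the term list and paper records are invented, and Packalen’s actual pipeline works on full MEDLINE records against the UMLS vocabulary rather than a hand-made set of strings. The sketch only shows the core logic of scoring each country by the share of its papers that mention at least one recently introduced term.

```python
from collections import defaultdict

# Terms treated as "new" for this toy example. In the real study,
# novelty comes from UMLS concepts that first appear in the
# literature within a recent window; this set is an assumption.
NEW_TERMS = {"crispr-cas9", "organoid", "optogenetics"}

# Each record: (first-author country, abstract text). Invented data.
papers = [
    ("US", "We apply CRISPR-Cas9 editing to liver organoid models."),
    ("US", "A retrospective cohort study of statin use."),
    ("KR", "Optogenetics reveals circuit dynamics in the cortex."),
    ("KR", "CRISPR-Cas9 screening identifies novel drug targets."),
    ("JP", "A survey of hospital readmission rates."),
]

def edge_factor(papers, new_terms):
    """Fraction of each country's papers using at least one new term."""
    totals = defaultdict(int)   # papers per country
    novel = defaultdict(int)    # papers per country mentioning a new term
    for country, abstract in papers:
        totals[country] += 1
        text = abstract.lower()
        if any(term in text for term in new_terms):
            novel[country] += 1
    return {c: novel[c] / totals[c] for c in totals}

scores = edge_factor(papers, NEW_TERMS)
```

On this toy data, KR scores 1.0 (both papers use a new term), the US 0.5, and JP 0.0; ranking countries by this fraction is the gist of the metric, though the published study normalizes and categorizes far more carefully.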

Measuring scientific progress is a difficult task at best. Traditional metrics measure impact by the number of citations a publication receives. Publication in a prestigious journal increases one’s chances of being cited, since others are more likely to encounter the work. Using the impact factor to evaluate research quality is problematic for a number of reasons, and as the scientific community turns to alternative metrics, many prestigious journals like Science and Nature are now emphasizing novelty. Professor Packalen proposes that his new tool will make it possible to “reward scientists who are willing to try out new idea inputs and thus promote healthier science.”

Is Packalen’s Methodology Lacking?

The study results, however, reveal some surprising trends. Some countries, such as Saudi Arabia, scored higher under Professor Packalen’s methodology than under other metrics. In addition, Packalen notes demographic and other explanations for the high scores of countries like China: younger scientists engage more with newer work, and China has a disproportionately large number of young scientists. Still, the study breaks down engagement with novel ideas by category grouping, allowing readers to see where individual countries are more likely to focus their energy and where they are lacking.

However, Professor Packalen’s research has drawn some criticism from the community. Caroline Wagner, a public policy scholar at Ohio State University in Columbus, doubted a methodology that defines novelty by the use of newer concepts, questioning the correlation between the date a term appears and genuine novelty. She also suggested that the results could be influenced by the fact that investment in science correlates with total publications: both South Korea and the US put significant funds into R&D, so their high ranks may reflect funding rather than ‘novelty’.

Ton van Raan, an emeritus professor of quantitative studies of science at Leiden University in the Netherlands, has also criticized the study. He stated that the ‘edge factor’ should be measured together with impact, so that scientists can more easily judge whether the ‘high edge factor’ countries are truly contributing to the frontiers of science.

Should We Focus More on Novelty?

Although Professor Packalen’s new tool undoubtedly offers an interesting way to evaluate national contributions to the global pool of scientific research, novelty may not be the ideal replacement for impact as a metric, because the long-term impact of novel work is difficult to judge. Moreover, the ongoing replication crisis in science has led some to argue that researchers and journals should not focus on novelty at all. When journals and funders emphasize novelty, researchers may be tempted to overstate the importance of their findings, and there is little incentive to replicate the work of others. Going forward, it will be critical for the scientific community to incorporate a variety of metrics in its evaluation of research. It remains to be seen how useful this new approach to evaluating novelty will prove.
