Why Is Quality Data Lost with the Advancement of Technology?

A Double-Edged Sword

The days of broadcast television, landline telephones, punch-card computers, and library card catalogs may be decades behind us, but the rapid advance of technology has brought both convenience and stress.

Incessant emails, social media updates, RSS feeds, and instant messages bring a constant stream of interruptions to your day. If you need information, the new knowledge economy delivers it in near-infinite quantities, to whichever electronic device you are using, in fractions of a second.

However, the quality of that information is dictated by the provider, whether it is 24-hour cable news or Google search rankings manipulated through search engine optimization (SEO) techniques to push particular pages higher in your results.

Unfiltered Information

Despite numerous and frequent changes to its search algorithm, Google's search results continue to suffer from so-called black hat practices, in which SEO experts artificially boost rankings by manipulating traffic and backlinks. The outcome for earnest seekers of information is that the results presented do not truly reflect the quality of information on those highly ranked sites.

For academic researchers, Google Scholar has made some progress in moving from the “unfiltered” world of general Google search results in the millions to more “authoritative” search results in the tens of thousands, but the quality of those results, especially in the realm of open access publication in predatory journals, remains a challenge.

The Need to Stay Informed

The inherent expectation of new information in academic research carries with it the burden of staying current in your field. In past decades, that meant reading everything written on the topic, but in this new world of high-volume publication feeding the insatiable demand of hundreds of research journals, such a goal is unattainable.

Online access on a global scale and aggregated databases have given researchers access to data beyond their wildest dreams, but this democratization of data has led to serious issues of quantity over quality. Abstracting services, annual industry reviews, and other attempts to summarize and aggregate each year's output of research can help, but many researchers still see themselves as “drinking from a firehose” in their attempts to stay current in their respective fields.

Information Management

The key to managing massive volumes of information, even when it is topic-specific, lies in the development of increasingly sophisticated search and data mining tools.
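To make that idea concrete, here is a minimal sketch in Python of one of the simplest building blocks behind such tools: ranking a handful of paper abstracts against a query by TF-IDF cosine similarity. The abstracts, the query, and the use of scikit-learn are all illustrative assumptions, not a description of any particular search service.

```python
# A minimal sketch: score hypothetical abstracts against a query
# using TF-IDF term weighting and cosine similarity (scikit-learn assumed).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

abstracts = [
    "Deep learning methods for protein structure prediction.",
    "A survey of search engine optimization techniques.",
    "Text mining approaches to literature-based discovery.",
]
query = "data mining tools for research literature"

# Fit TF-IDF on the corpus plus the query so both share one vocabulary.
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(abstracts + [query])

# Score each abstract by cosine similarity to the query and print a ranking.
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
for rank, idx in enumerate(scores.argsort()[::-1], start=1):
    print(f"{rank}. ({scores[idx]:.2f}) {abstracts[idx]}")
```

Real literature-search systems layer far more on top of this (citation graphs, learned ranking models, deduplication), but term-weighted relevance scoring of this kind is typically where the work of filtering quantity down to quality begins.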

If open access, instant distribution, and post-publication peer review are going to continue to fill academic “mega-databases,” we can no longer count on the contribution side of the equation to monitor either quality or content availability. That leaves the consumption side of the equation blatantly exposed.

Without adequate search capabilities, the pressure to stay informed will only increase as researchers are left to plow through ever-larger quantities of sub-par material to find quality research of practical use.
