
Using Big Data for Research

What is Big Data?

Growing awareness of how much data is available to researchers, and of what we might do with all of that data, is arguably one of the hottest topics in research today. “Big Data,” it is argued, will drive new employment and research trends and open up completely new business models and opportunities as we struggle to manage what many consider a deluge of data now available to researchers.

To get some sense of scale as to what “Big Data” means, consider the answer given by Eric T. Meyer of Oxford University, a panelist at the 2014 conference of the Association of Learned and Professional Society Publishers (ALPSP). When asked what constituted “big” in Big Data, he explained: “if it’s easily handled on your laptop, it’s not big data!” Since laptop hard drives now hold 1 terabyte (TB) or more, Big Data is clearly taking us on the road to petabytes (1,024 TB) and higher.

Size isn’t the Biggest Problem

The availability of data currently far exceeds our ability to manage, analyze, and learn from it. Terabytes of data require a sophisticated and expensive infrastructure for storage and ongoing maintenance, and the cost of such services may place that data out of reach for some of the institutions that could benefit from accessing it.

In addition, the longer it takes to develop the analytical tools and common coding standards for analysis, the greater the backlog of data will be. Researchers are already working under a data deluge, and there is no way to hold that deluge at bay while we figure out what to do with it.

A Global Solution For A Global Windfall

The technological advances that have driven the development of Big Data may have originated in more economically advanced nations, but the data now being collected comes from every nation and should, by rights, be made available to every nation. For that goal to be achieved, there must be standardization of data collection and data analysis methodologies.

Given that the USA and the rest of the world have been trying to reach convergence on accounting standards for almost 15 years now [Generally Accepted Accounting Principles (GAAP) in the USA, and International Financial Reporting Standards (IFRS) everywhere else], that may be easier said than done. However, when the World Economic Forum declared data to be an economic asset in 2012, it sent a very clear message about the need to treat this issue with respect and urgency.

Lack of Comfort

Being afraid of Big Data assumes that there will always be malicious intent behind the use of all that information. There will no doubt be instances where nefarious uses are found for it, but that concern should be balanced against the tremendous positive potential of information about ourselves. Advocates argue that science is built on the pursuit of knowledge, and that the greater availability of information through sophisticated technology is simply a logical extension of that pursuit.

However, there is still a great deal of discomfort to be found in Big Data—not only in how much information can be gathered, but also in what can be done with that information, and what it may reveal.

 
