When Does Research Become “Non-traditional”?

  Oct 20, 2016   Enago Academy

In academia, when we consider the term “non-traditional,” the concept of a non-traditional student (NTS) comes to mind. A great deal of attention and resources has been devoted to winning over these students, since they have a unique set of learning needs and are willing to pay the institution that can meet them. They are typically older and need flexible course options that can be balanced with work and family obligations. Their research skills are often rusty in the early days, but they are committed to academic success.

Traditional Research

Applying the term “traditional” to scientific research simply denotes a degree of general acceptance. We assume that the design, methodology, and subsequent data analysis have been conducted in a manner that conforms to the foundational scientific method, producing results that can withstand scrutiny from equally qualified and experienced peers. Those results then form the foundation of further research that validates or extrapolates them. Just as businesses report financial performance in accordance with Generally Accepted Accounting Principles (GAAP), research is conducted and reported according to a commonly accepted language. Different methodologies have been developed over the years to accommodate different research scenarios, but the debate between qualitative and quantitative approaches continues to this day. The first research study that did not include a statistical analysis was probably labeled, and initially derided, as “non-traditional.”

The Danger in Applying Labels

By definition, “non-traditional” implies being different, outside the norm or the accepted comfort zone. This can be exciting and interesting, depending on your perspective, but for researchers who live in a world of processes and experiments that must be followed to the letter, stepping outside the norm sets off alarm bells and prompts many questions. The first and most cynical is: “Why was it necessary to step outside the norm?” The expectation, of course, is that there must be a suitable justification for that choice, without which the entire study will be dismissed as being of questionable value.

Why Non-traditional Research?

Science must balance the consistency of proven methodologies, which allows research data to be validated, with a willingness to embrace new methodologies and concepts in pursuit of significant breakthroughs. If we are wary of “non-traditional” research because it is disruptive, we should remember that disruption can be a good thing. Whether a non-traditional study is offered in response to an idiosyncratic research scenario or as a deliberate choice to try something dramatically different that challenges established paradigms, the underlying goal is the same: to move science forward into new territory.

We are entering an era of “Big Data,” in which researchers will be expected to develop completely new methods for analyzing previously unimagined volumes of data. Research proposals, IRB documentation, grant applications, and report templates may all have to change in drastic ways. “Non-traditional” may well be the “square peg in a round hole,” but why should it be the peg that has to change shape?
