What Is Net Neutrality?
The foundation of the Internet was laid by the Advanced Research Projects Agency (ARPA) in the late 1960s, when a communications network was created between four host computers. But since those days of the ARPANET, data has always moved through packet switches with no constraints on the type of data and no assignment of priority to it.
However, new technologies are capable of changing that democracy. Thanks to deep packet inspection (DPI), network operators can now identify what kind of traffic is passing through their networks, and they can also control the relative speed of different packets of data.
The potential to leverage this technology in the sale of premium fast lanes to commercial users was quickly realized, and the battle for net neutrality was born!
Advocates argued that the Internet should be formally recognized as a utility and should remain equally available to all. Network providers countered that growing utilization rates were creating bottlenecks and degrading the user experience at peak traffic times. To fund the construction of bigger data pipes, they argued, they should be allowed to charge content providers such as Netflix for the privilege of having their data prioritized along those pipes.
In February 2015, the Federal Communications Commission voted to reclassify broadband service as a public utility, thereby shutting down all plans to create fast lanes or to "choke and throttle" existing service levels. Immediate court challenges from wireless carriers and Internet Service Providers (ISPs) were expected.
Neutrality in Two-Tier Academic Publishing
Academic researchers around the world have taken a keen interest in the net neutrality discourse, and it has opened a new debate altogether: does academic publishing need net neutrality too?
According to these researchers, the publishing industry faces a similar frustration with a perceived two-tier system. Traditional academic journals charge premium rates while dictating what content their subscribers are able to view, and when. The content presented is determined more by its likelihood of attracting high citation numbers than by its relevance to actual readers. For this reason, replication studies that would be of genuine interest to a journal's readership are often passed over in favor of newer "hot" topics that will generate wider attention, or counterintuitive results that appear to challenge existing schools of thought.
Journal editors argue that they provide a valuable service in screening submissions and subjecting viable candidates to allegedly extensive peer review. But as the Open Access model has already started to demonstrate, such "valuable services" can be delivered at a much lower price and in a much more user-friendly manner.
Is It the Need of the Hour?
The market strength of traditional academic journals has always rested on the rankings developed over decades of global citation of the research work that they publish. That has always provided a huge barrier to entry, as the newly minted Open Access journals have discovered to their chagrin.
Popularity based on rebellion against a detested model can only get you so far if the rest of the establishment continues to hold the output and perceived prestige of that model in high regard. Achieving parity, where publication in an Open Access journal is regarded as equal to publication in a traditional journal, may seem unlikely as long as those traditional journals hold the economic high ground. But as the justification for those higher subscription rates becomes ever harder to defend, the winds of change may finally be blowing with enough strength. So, to conclude: does academic publishing need net neutrality? Yes!