The Impact Of Policy And Politics On Net Neutrality

Net Neutrality is the principle that no company should be able to vary the level of service it provides based on the content that passes through its network. In other words, there should be a level playing field for everyone who uses the Internet and its content.

Companies like AT&T, Comcast, and Verizon that provide the underlying Internet “pipes” as cable companies and Internet Service Providers (ISPs) are now interested in producing content. From a corporate standpoint, this makes sense, since these companies are exploring new areas of revenue generation to maximize profits. At the same time, this new direction puts them in direct competition with content providers such as Netflix and HBO. While competition is good for the marketplace, this new direction gives the cable companies and ISPs an unfair advantage: they can deliver their own content faster than their competition because they own the “wire”.

For Internet technologies, the policy and political perspectives revolve around the issues of governance. These perspectives are discussed below:

Impact of Policy on Internet Technologies:

Cable companies and ISPs would be able to prioritize which content loads faster. This prioritization would typically entail Deep Packet Inspection (DPI), in which these corporations thoroughly read the content of packets. As we can imagine, DPI opens the door to privacy concerns, security issues, and a slower Internet.
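To make the mechanics concrete, here is a minimal sketch in Python of how payload-based prioritization could work in principle. The keywords and priority tiers are purely hypothetical and are not drawn from any real ISP’s policy:

```python
# Toy sketch of DPI-based prioritization. The payload keywords and
# priority tiers below are illustrative assumptions, not a real policy.
def classify_priority(payload: bytes) -> int:
    """Return a queue priority: 0 = fastest, 2 = slowest."""
    text = payload.lower()
    if b"own-video-service" in text:   # hypothetical first-party content
        return 0
    if b"competitor-video" in text:    # hypothetical rival content
        return 2
    return 1                           # everything else gets default service

packets = [
    b"GET /own-video-service/movie",
    b"GET /competitor-video/show",
    b"GET /news",
]
priorities = [classify_priority(p) for p in packets]
print(priorities)  # → [0, 2, 1]
```

The point of the sketch is that once the wire owner can read payloads, a few lines of policy code suffice to sort traffic into fast and slow lanes.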

If Net Neutrality is eliminated and the wire owners are given the ability to direct network traffic as they please, based on their own criteria, this would become a governance nightmare for the government. How would the government regulate this unfair competition? How would it even find out about it? How would it manage the processes of Net Neutrality? How would it know whether security policies have been violated and private information compromised? How would it even know whom to go after, since there could come a point where content providers blame the wire owners for slow traffic while the wire owners blame content providers for creating content that is not “optimized”? These are questions the government has to consider in order to have effective governance that everyone can adhere to.

For governments, the elimination of Net Neutrality would entail developing policies, regulations, and technologies that monitor cable companies and ISPs to reduce their unfair advantage. Where would this authority come from? Where would the budget come from to create effective monitoring tools? Eliminating Net Neutrality leaves us with more questions than answers, and a marketplace without oversight could become a Wild West where organizations create their own rules to eliminate the competition.

Impact of Politics on Internet Technologies:

The impact of politics on Internet technologies is noteworthy. The corporations pushing for the elimination of Net Neutrality are rich telecom organizations with big lobbying budgets and election donations. While I am not suggesting that lawmakers have been bought outright, something does not fit well. Why would lawmakers oppose an open Internet that has given us companies like Google and Yahoo!, which created numerous job opportunities for US citizens?

Governments responsible for overseeing wire owners can be affected by the political maneuvering of lawmakers, which creates challenges for effective governance. How can effective governance happen when a lawmaker wants more business for the state where these big corporations are located and where they provide campaign funding? For Internet technologies, this means they might not only have to comply with existing standards for packet delivery and flow but might also need to adhere to rules set by wire owners, who might play politics and set in motion events that would be difficult to recover from.

As we can see, the policy and political perspectives are highly related and cannot be considered separately. For governments, corporations, and individuals, the loss of neutrality would mean the end of an era of prosperity that the Internet revolution brought us.



As an Enterprise Architect, I help organizations transform through people, processes, and technologies, and I have had my fair share of dealing with technology infrastructure issues. However, in dealing with technology infrastructure, I have not paid much attention to the underlying networks; I have always assumed they would be there and always available. But after reading this article about ALOHAnet, I have come to realize that what we take for granted today is the result of many years of problem-solving involving universities, the military, and commercial organizations. Thus, I now have a greater appreciation for the importance of networks to individuals and organizations.

Typically, in conversations with others, I have often said that the Internet came from ARPANET, which was a military-funded project. While this is correct, it diminishes the role the University of Hawaii played in laying the foundations of the Internet before it was even funded by the military. Prior to this article, I was not aware of the University of Hawaii’s contributions. What is interesting is that the Internet had humble beginnings in the 1960s, when some people at the university were just trying to figure out how to share resources across university buildings spread across the Hawaiian Islands. To think that the foundations of the Internet came from islands created by volcanic activity in the middle of the ocean millions of years ago is truly awe-inspiring.

The author does a great job of beginning with a story and then getting into the technical details of network communications. There are a couple of interesting points the author makes, which I will relay below:

Firstly, the original goal of the ALOHA system was not to create the robust network of networks (i.e., the Internet) that every individual and organization can use today; it was simply to see whether radio communications could be used in place of conventional wire communications. Interestingly, this was uncharted territory even for the experts, who at the beginning did not realize the importance of radio broadcast channels, with their multiple-access capabilities, compared to conventional point-to-point wire channels. In hindsight, going with radio broadcast channels was the right choice, because a point-to-point wire channel would have cost too much from an infrastructure standpoint and would not have scaled as rapidly, given the time it would take to establish the various point-to-point channels. In my experience, technologies that scale quickly have three main ingredients: (1) appropriate funding, (2) a collaborative environment, and (3) technical sophistication that is hidden from the end-users. This is how I see the evolution of networks, from resource sharing to today’s Internet.
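The channel-sharing idea at the heart of ALOHA has a well-known theoretical signature (a standard textbook result, not something from the article): for an offered load G, pure (unslotted) ALOHA achieves throughput S = G·e^(−2G), while slotted ALOHA achieves S = G·e^(−G). A few lines of Python make the peak utilizations easy to check:

```python
import math

# Classic ALOHA throughput formulas for offered load g:
# pure (unslotted) ALOHA: S = g * e^(-2g)
# slotted ALOHA:          S = g * e^(-g)
def pure_aloha_throughput(g: float) -> float:
    return g * math.exp(-2 * g)

def slotted_aloha_throughput(g: float) -> float:
    return g * math.exp(-g)

# Peak utilization: pure ALOHA tops out near 18.4% (at G = 0.5),
# slotted ALOHA near 36.8% (at G = 1.0).
print(round(pure_aloha_throughput(0.5), 3))    # → 0.184
print(round(slotted_aloha_throughput(1.0), 3)) # → 0.368
```

Even at its theoretical peak, a shared broadcast channel wastes most of its capacity to collisions, which makes it all the more impressive that the approach still beat point-to-point wiring on cost and scalability.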

Secondly, the author refers to the “usual software delays” even when developing network protocols. To me, this indicates that software delays are nothing new; although we pay a lot of attention to them today, they have been the norm for a while. Through a broader lens, this comment also illustrates the reliance of networking on the underlying software that handles data packets. From this, we can see that the relationship between a network and its networking software is a very close one.

Thirdly, the international efforts involving research facilities and universities to show the potential of data networks are noteworthy. They show the combined resolve of humans to test and solve problems collaboratively. I am not sure this still happens today; rather than being protectionist about technologies, they were used by and for everyone. From a broader perspective, this also means that the military, research facilities, and universities were looking at the exchange of broadcast data packets beyond national boundaries.

Fourthly, the advent of the microprocessor and its incorporation into terminal control was an important achievement that opened the doors for commercial usage. One thing led to another: first a paper, then a book, then the exploration of various mediums for packet broadcasts, and then the tipping point when Motorola introduced its unslotted ALOHA channel in the personal computer. Interestingly, all of these events happened within a decade, opening up new possibilities not only for the people involved but for everyone else.

Lastly, I believe the alignment of strategy and theoretical realities was the key to all of what was going on. It seems the process of learning went both ways: strategy learned from execution, which fed back into strategy. In today’s world, this alignment is difficult to come by for many reasons. From a problem-solving perspective, such misalignment can result in delays, overruns, and frustrations. I am sure the data packet broadcast journey had its own issues as well, but that did not deter people from keeping their eyes on the big picture. Where would we be today if the misalignment had continued and there had been no resolution? I would argue that the Internet would still have been developed and networks would have been incrementally improved, but the Internet revolution would at least have been delayed.

In conclusion, this article showcases the human resolve to push through uncharted technical territory, figuring things out along the way, and the determination to accomplish the desired objectives. It also illustrates the happenstance of putting the ALOHA system on the list for Interface Message Processors (IMPs). There were numerous moving parts, but in the end, sending data packets through the broadcast channel was a success that paved the way for future innovations.


Is the Internet a Distributed System?

Tanenbaum and Steen describe a distributed system as “a collection of independent computers that appears to its users as a coherent system.” This means that even if multiple heterogeneous components within the distributed system are communicating with each other, from a user’s point of view it is a single system. An example of a distributed system would be the World Wide Web (WWW), where multiple components under the hood help browsers display content, but from a user’s point of view all they are doing is accessing the web via a medium (i.e., a browser). The following figure from Tanenbaum and Steen helps visualize their definition of a distributed system.

Distributed Systems Organized as Middleware

From the above figure, we can observe that the distributed systems layer sits between the various computer applications and the independent computers’ operating systems. What the authors are showing here is that a distributed system is a software layer that acts as the “glue”: it helps share resources across various independent components (i.e., computers) while appearing as a single system to end-users. The authors call this type of distributed system middleware. Additionally, we notice that these components are connected via a network. While it is not clear what kind of network this is, we can extrapolate that these independent computers are on the same network.
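The middleware idea can be illustrated with a toy sketch: the caller uses a single put/get interface while the layer underneath spreads data across several independent “nodes” (plain dictionaries standing in for machines). Everything here is illustrative rather than a real middleware API:

```python
import hashlib

# Illustrative middleware sketch: one coherent interface in front of
# several independent "nodes" (dicts standing in for separate machines).
class Middleware:
    def __init__(self, nodes):
        self.nodes = nodes

    def _pick(self, key: str) -> dict:
        # Deterministic hashing hides node placement from the user.
        idx = int(hashlib.sha256(key.encode()).hexdigest(), 16) % len(self.nodes)
        return self.nodes[idx]

    def put(self, key, value):
        self._pick(key)[key] = value

    def get(self, key):
        return self._pick(key).get(key)

system = Middleware([{}, {}, {}])
system.put("page", "<html>hello</html>")
print(system.get("page"))  # the user never sees which node stored it
```

The user of `system` sees a single coherent store, which is exactly the “appears as one system” property in Tanenbaum and Steen’s definition.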

As we can see, the importance of the network cannot be overstated: if there is no network, it becomes difficult for independent components to talk to each other and share resources, and hence there is no distributed system. The importance of the network is such that all eight of the fallacies of distributed computing formulated by Peter Deutsch concern the network. The eight fallacies are:

  1. The network is reliable
  2. The network is secure
  3. The network is homogeneous
  4. The topology does not change
  5. Latency is zero
  6. Bandwidth is infinite
  7. Transport cost is zero
  8. There is one administrator
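The first fallacy in particular has a standard defense that shows up in most distributed applications: retrying failed calls with exponential backoff. A minimal, self-contained Python sketch, where `flaky_call` simply simulates an unreliable network:

```python
import time

# Retry-with-backoff wrapper: a common defense against the fallacy
# that "the network is reliable".
def with_retries(fn, attempts=3, base_delay=0.01):
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of attempts, surface the failure
            time.sleep(base_delay * 2 ** attempt)  # exponential backoff

calls = {"count": 0}

def flaky_call():
    # Stand-in for a network operation that fails twice, then succeeds.
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(with_retries(flaky_call))  # → ok (succeeds on the third attempt)
```

Code written as though the fallacies were true tends to fail in exactly the ways the list predicts, which is why wrappers like this are so common in practice.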

Despite the importance of the network for distributed systems, can we truly claim that the Internet, which is a network of networks, is really a distributed system? I would say no: while a network provides the essential connectivity and communication channels for a distributed system, the network itself is not a distributed system. From an end-user perspective, the Internet might appear to be a single system (e.g., email), but in reality, email is not the Internet; it is a service provided on top of the Internet, utilizing the existing Internet infrastructure. Veríssimo and Rodrigues agree that there is a distinction between a network and a distributed system. They emphatically state that “a computer network is not a distributed system.”

Beyond the technological aspects, though, should we be looking at distributed systems through a broader lens? Should we be looking at distributed systems from security and privacy perspectives? The answer is of course yes. By definition, the components within a distributed system share resources; examples include memory allocation and computing power optimization, to name a few. But the sharing of resources opens up a Pandora’s box of issues related to security and privacy, because when sharing resources, certain information (e.g., computer IP addresses, open ports, etc.) needs to be shared as well. The exposure of this information can result in unintended consequences on one end or deliberate attacks on the other. We need to ask ourselves: How much information sharing is too much? What happens when information is compromised? Should the Internet become a distributed system? What happens if one computer is exposed and an intruder has gotten onto the network? How do we safeguard other computers on the same network that share resources?

In conclusion, in the 21st century the Internet has become a necessary tool for businesses and individuals to interact with each other and share information. Some examples of this information sharing include email, browsing the World Wide Web, conducting financial transactions, and sharing photos. As time progresses, the importance of the Internet will only increase, resulting in improvements and the creation of new services and business models. Thus, for businesses and individuals interested in leveraging the power of the Internet, it is useful to understand what the Internet is and what it is not. When we ask whether the Internet is a distributed system, the immediate reaction for some people is: of course it is. But if we dig a little deeper, we realize that the answer is not so simple, because the Internet itself is a network of networks and does not necessarily fall under the classic definition of a distributed system. Thus, in this paper, we have made the argument that the Internet is not a distributed system and raised some issues that go beyond the technological realm.


  1. Tanenbaum, Andrew S., and Maarten van Steen. Distributed Systems: Principles and Paradigms. Harlow: Prentice Hall, 2006. Print.
  2. Veríssimo, Paulo, and Luís Rodrigues. Distributed Systems for System Architects. Boston: Kluwer Academic, 2001. Print.


Net Neutrality and Quality of Service

Net Neutrality is the principle that no company should be able to vary the level of service it provides based on the content that passes through its network. In other words, there should be a level playing field for everyone who uses the Internet and its content.

Companies like AT&T, Comcast, and Verizon that provide the underlying Internet “wires” as cable companies and Internet Service Providers (ISPs) are now interested in producing content. From a corporate standpoint, this makes sense, since these companies are exploring new areas of revenue generation to maximize profits. At the same time, this new direction puts them in direct competition with content providers such as Netflix and HBO. While competition is good for the marketplace, this new direction gives the cable companies and ISPs an unfair advantage: they can deliver their own content faster than their competition because they own the wire.

From a network management perspective, once Net Neutrality is eliminated, cable companies and ISPs would turn to Quality of Service (QoS) mechanisms, specifically network traffic shaping and metrics.

Traffic Shaping – The main purpose of traffic shaping is to restrict network traffic entering the network at access points. This is done to prevent overloading the network and to assign queuing priorities based on complex algorithms. Depending on corporate preferences, these algorithms can filter out content deemed unnecessary and assign certain percentages of capacity to particular applications. From a Net Neutrality point of view, this means that cable companies and ISPs can completely filter out a competitor’s content, significantly slow it down, or increase the percentage of capacity allotted to their own applications.
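The standard mechanism behind this kind of shaping is the token bucket, which admits packets only while tokens, replenished at a fixed rate, are available. A toy Python sketch with illustrative rates:

```python
# Token-bucket shaper sketch: the classic mechanism used for traffic
# shaping at network access points. Rates and sizes are toy values.
class TokenBucket:
    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens (bytes) added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = 0.0

    def allow(self, packet_bytes: int, now: float) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True
        return False  # packet is delayed or dropped

bucket = TokenBucket(rate=100.0, capacity=200.0)
print(bucket.allow(150, now=0.0))  # → True: burst allowance covers it
print(bucket.allow(150, now=0.0))  # → False: bucket drained, must wait
print(bucket.allow(150, now=2.0))  # → True: two seconds refill 200 tokens
```

The same mechanism that legitimately protects a network from overload becomes a discrimination tool the moment different buckets are assigned to different content owners.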

The big question is not whether this will happen but what the government can do to monitor and prevent it. Given the current state of government budgets, I would argue that the government would not actively monitor the networks but would instead be reactive, waiting on complaints from the competitors of cable companies and ISPs. Even if these complaints are legitimate and indicate unfair competition, the loss of revenue during litigation might be insurmountable for these competitors. Thus, not only would existing competitors be slowly eliminated, but future competitors would also be deterred from entering the marketplace, since they would know that cable companies and ISPs hold the unfair advantage of being able to manipulate network traffic.

Metrics – In order to measure QoS, we need certain metrics. These metrics help us compare and contrast in order to understand and improve services. Additionally, they arm us with information that individuals and organizations can use to make decisions. From a Net Neutrality point of view, this means that cable companies and ISPs can increase or decrease QoS based simply on their own criteria. These criteria might include degrading the availability of a competitor’s network, increasing error rates due to retransmission, worsening latency and jitter until the competitor’s customer satisfaction declines, slowing the loading of applications, and creating Service Level Agreements (SLAs) that guarantee the best service only to a select few who can afford premium prices.
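Two of the metrics mentioned above, latency and jitter, are easy to illustrate. The sketch below uses made-up latency samples and computes jitter as the mean absolute change between consecutive samples, which is one common simplification (real protocols such as RTP use a smoothed estimator):

```python
# Simple QoS metrics over a list of latency samples (milliseconds).
# Sample values are made up for illustration.
def average_latency(samples_ms):
    return sum(samples_ms) / len(samples_ms)

def jitter(samples_ms):
    # Mean absolute change between consecutive latency samples.
    diffs = [abs(b - a) for a, b in zip(samples_ms, samples_ms[1:])]
    return sum(diffs) / len(diffs)

samples = [20.0, 22.0, 19.0, 30.0, 21.0]
print(average_latency(samples))  # → 22.4
print(jitter(samples))           # → 6.25
```

A wire owner degrading a competitor would show up in exactly these numbers: rising averages and, for real-time media especially, rising jitter.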

For larger and well-funded organizations, the manipulation of QoS metrics with help from cable companies and ISPs would guarantee their survival and drastically undercut the startup mentality that new companies embrace to fully utilize the power of the Internet as a fair playground in which everyone can compete. Regular consumers might not see how these mafia-style tactics would affect them, but in the long term, by the time they realized it, it would be too late, and they would be left with only a few choices of service provider. Eventually, this lack of choice would leave customers feeling helpless and questioning why there are no innovative companies out there to increase competition. The reason, unbeknownst to those customers, would be how corporate lobbying and individual self-interest twisted the arm of the government decades earlier to create an unfair advantage.

In conclusion, the concepts of freedom and openness, the very basis of the United States under its Constitution, are being threatened in the age of the Internet. Not only would the loss of Net Neutrality affect domestic competition, but it would also significantly affect global growth, since most of the Internet’s wires are owned by US corporations. Perhaps this is a way to stay competitive by being unfair to the rest of the world. At the end of the day, the power of the individual would be taken away while the power of a select few continued to increase. Perhaps it is time to have leaders who understand technology in a global context and are not afraid of healthy global competition.

