Wireless Frequency Spectrum and Consumer Choices

According to the Federal Communications Commission (FCC), the spectrum is the range of electromagnetic radio frequencies used to transmit sound, data, and video across the country. This range includes every possible frequency from 0 hertz upward, and these frequencies are divided into contiguous service bands that are further divided into channels. The use of different frequencies for wireless communication is not new. The organizations that use these frequencies include defense (e.g., drones, weapons), government (e.g., police, fire departments), and commercial entities (e.g., phone services by telecoms, TV stations). Individuals also use these frequencies for their own purposes (e.g., Bluetooth, 802.11 wireless routers at home). Given this widespread usage, it is important that we understand the discussions at the FCC about opening up different parts of the frequency spectrum.

Although the range of frequencies is in principle unbounded, the 500MHz to 10GHz range is the most practical for wireless communication and is therefore called the golden zone. Within the golden zone, the lower the frequency, the better wireless signals propagate, so less infrastructure (e.g., radio towers, signal boosters) is required. To avoid signal congestion, different organizations operate in different parts of the spectrum. For example, Verizon operates at 800MHz and 1900MHz, Sprint at 1900MHz, and TV broadcasters at 600MHz. From this, we can see that TV operates at a near-optimal frequency within the golden zone.
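The propagation advantage behind the golden zone can be quantified with the standard free-space path loss formula, FSPL(dB) = 20·log10(d) + 20·log10(f) + 32.44, with d in km and f in MHz. The sketch below compares the 600MHz TV band against a 1900MHz band; the 10 km distance is an arbitrary example, not a figure from the FCC proceeding:

```python
import math

def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: FSPL = 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare loss over 10 km for the 600 MHz TV band vs. a 1900 MHz PCS band.
loss_600 = free_space_path_loss_db(10, 600)
loss_1900 = free_space_path_loss_db(10, 1900)
print(f"600 MHz:  {loss_600:.1f} dB")
print(f"1900 MHz: {loss_1900:.1f} dB")
print(f"Extra loss at 1900 MHz: {loss_1900 - loss_600:.1f} dB")
```

The roughly 10 dB of extra loss at 1900MHz illustrates why carriers stuck on higher bands need denser, more expensive tower infrastructure than a 600MHz operator.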

Due to the explosion in mobile device usage, there is increasing demand for better wireless signal transmission. In light of this demand, the FCC is considering asking, or compelling, TV broadcasters to free up their 600MHz spectrum, which travels farther and penetrates buildings better, so that telecoms can use it. The disagreement in this FCC proceeding is over how the spectrum should be opened up. One option is an open auction in which telecoms can bid freely; the other is to cap how much spectrum the large telecoms may bid for. For the telecom industry, the choice matters: in an open auction, the telecom with the deepest pockets would take most if not all of the released spectrum, leaving the smaller telecoms with limited options.

So why should these “spectrum wars” matter to consumers? If the large telecoms acquired all the released TV frequencies, smaller telecoms would be forced onto higher frequencies. That means spending more on infrastructure and on maintaining expensive equipment, and those costs would be passed on to consumers as higher prices. Since most consumers are price conscious and want quality and reliability from their phone service, they would pick the large telecoms as their best option. Once consumers start making that choice, it is only a matter of time before the business model of the smaller telecoms becomes unsustainable and they have to close shop. After the smaller telecoms disappear, we are left with reduced competition, monopoly, and future headaches for consumers.

For consumers, those future headaches would include increased prices and eliminated services. Even without the new spectrum allocation, larger telecoms are already taking advantage of their size, for example by eliminating unlimited data plans. If this is happening now, imagine the chokehold the larger telecoms would have on consumers once they acquire the TV spectrum. What else would these telecoms be able to do then?

In conclusion, when consumers close their eyes to what is coming, concerned only with the immediate future rather than the distant one, they will be exploited by large organizations. The acquisition of TV frequencies would benefit telecoms, but it also raises a question: if TV broadcasters give up their frequencies, what will they do next? If they move to cable, large telecom companies (e.g., Verizon) would still benefit, providing not only phone service but also TV service and content creation. To me this looks like another attempt to monopolize every media medium, one that would most likely produce corporations too big to fail. For shareholders that may be a good thing, but for the average consumer it would mean a lack of choice and thus a lack of freedom.


  1. http://www.fcc.gov/spectrum
  2. http://www.fiercewireless.com/story/tv-broadcasters-remain-wary-600-mhz-incentive-auction/2014-03-26
  3. http://www.wilsoncellularbooster.com/page.html?chapter=1&id=9

Net Neutrality and Quality of Service

Net Neutrality is the principle that no company carrying Internet traffic should be able to vary the level of service it provides based on the content passing through it. In other words, there should be a level playing field for everyone who uses the Internet and its content.

Companies like AT&T, Comcast, and Verizon, which provide the underlying Internet “wires” as cable companies and Internet Service Providers (ISPs), are now interested in producing content. From a corporate standpoint this makes sense: these companies are exploring new areas of revenue to maximize profits. At the same time, it puts them in direct competition with content providers such as Netflix and HBO. Competition is good for the marketplace, but this new direction gives the cable companies and ISPs an unfair advantage, since they own the wire and can deliver their own content faster than their competitors’.

From a network management perspective, cable companies and ISPs seeking to undermine Net Neutrality would turn to Quality of Service (QoS), specifically network traffic shaping and metrics.

Traffic Shaping – The main purpose of traffic shaping is to regulate traffic entering the network at access points, delaying or queuing packets to prevent overload and assigning queuing priorities according to configurable policies. Depending on corporate preferences, these policies can filter out content deemed unnecessary and reserve a certain percentage of capacity for favored applications. From a Net Neutrality point of view, this means cable companies and ISPs can filter out a competitor’s content entirely, slow it down significantly, or increase the capacity reserved for their own applications.
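A common building block for this kind of shaping is the token bucket: packets are forwarded only while tokens remain, and bursts are queued so they drain at the configured rate. The minimal sketch below is illustrative, not any vendor’s implementation:

```python
from collections import deque

class TokenBucketShaper:
    """Simple token-bucket traffic shaper: packets are forwarded only while
    tokens are available; excess traffic is queued (shaped) rather than
    sent immediately."""

    def __init__(self, rate_tokens_per_tick: int, bucket_size: int):
        self.rate = rate_tokens_per_tick   # tokens refilled each time step
        self.capacity = bucket_size        # maximum burst the bucket allows
        self.tokens = bucket_size
        self.queue = deque()

    def enqueue(self, packet):
        self.queue.append(packet)

    def tick(self):
        """Advance one time step: refill tokens, then release queued packets."""
        self.tokens = min(self.capacity, self.tokens + self.rate)
        sent = []
        while self.queue and self.tokens > 0:
            sent.append(self.queue.popleft())
            self.tokens -= 1
        return sent

shaper = TokenBucketShaper(rate_tokens_per_tick=2, bucket_size=2)
for i in range(6):          # a burst of 6 packets arrives at once
    shaper.enqueue(f"pkt{i}")
print(shaper.tick())        # only 2 packets released per tick
print(shaper.tick())        # the rest drain at the shaped rate
```

A provider could apply a generous bucket to its own traffic class and a tight one to a competitor’s, producing exactly the selective slowdown described above.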

The big question is not whether this will happen but what the government can do to monitor and prevent it. Given the current state of government budgets, I would argue that the government will not actively monitor networks; instead it will be reactive, waiting for complaints from the competitors of cable companies and ISPs. Even if those complaints are legitimate and indicate unfair competition, the revenue lost during litigation might be insurmountable for the complainants. Thus, not only would these competitors be slowly eliminated, but future competitors would be deterred from entering the marketplace, knowing that cable companies and ISPs hold the unfair advantage of manipulating network traffic.

Metrics – To measure QoS, we need metrics. Metrics let us compare services in order to understand and improve them, and they arm us with information for making decisions as individuals and organizations. From a Net Neutrality point of view, this means cable companies and ISPs can raise or lower QoS based purely on their own criteria. Those criteria might include degrading the availability of a competitor’s network, increasing error rates through forced retransmission, worsening latency and jitter until a competitor’s customer satisfaction declines, slowing application load times, and creating Service Level Agreements (SLAs) that guarantee the best service only to the select few who can afford premium prices.
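Two of those metrics, average latency and jitter, can be computed directly from per-packet timing samples. The sketch below uses a simplified jitter definition (mean absolute difference of consecutive latencies) rather than the smoothed estimator in RFC 3550:

```python
def qos_metrics(latencies_ms):
    """Compute simple QoS metrics from per-packet one-way latencies (ms):
    average latency, and jitter as the mean absolute difference between
    consecutive latency samples (a simplification of the RFC 3550 estimator)."""
    avg = sum(latencies_ms) / len(latencies_ms)
    diffs = [abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:])]
    jitter = sum(diffs) / len(diffs) if diffs else 0.0
    return avg, jitter

# A steady path vs. a path with the same average latency but unstable timing.
steady = [30, 31, 30, 29, 30]
unstable = [10, 50, 10, 50, 30]
print(qos_metrics(steady))    # low jitter
print(qos_metrics(unstable))  # same average, far higher jitter
```

The two paths have identical average latency, so a provider could degrade only jitter, wrecking a competitor’s video or voice quality, while claiming latency numbers are unchanged.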

For larger, well-funded organizations, manipulating QoS metrics with help from cable companies and ISPs would guarantee their survival and drastically undermine the startup mentality new companies rely on to use the Internet as a fair playground where everyone can compete. A regular consumer might not see how these mafia-style tactics affect them, but by the time they realize it in the long term, it will be too late, and they will be left with only a few choices for their services. Eventually this lack of choice will leave customers feeling helpless, wondering why there are no innovative companies out there to increase competition. The reason, unbeknownst to them, will be how corporate lobbying and individual self-interest twisted the arms of the government decades earlier to create an unfair advantage.

In conclusion, the freedom and openness on which the United States was founded under its Constitution are being threatened in the age of the Internet. Not only would the loss of Net Neutrality affect domestic competition, it would also significantly affect global growth, since most of the Internet’s wires are owned by US corporations. Perhaps this is a way to stay competitive by being unfair to the rest of the world. At the end of the day, the power of the individual would be taken away while the power of the select few continues to grow. Perhaps it is time for leaders who understand technology in a global context and are not afraid of healthy global competition.


  1. http://www.ocf.berkeley.edu/~raylin/whatisnetneutrality.htm
  2. http://cyberlaw.stanford.edu/downloads/20120611-NetworkNeutrality.pdf

Wireless Communications, Privacy Concerns and Regulations in the Age of Mobile Tracking

The article, Attention shoppers: Retailers can now track you across the mall, describes how retailers can wirelessly capture the MAC address of your phone to understand your buying patterns, which they use to improve their marketing strategies and sell you more things. The article explains that routers capture this information and that data tracking companies use them to gather information to sell to retailers. These companies wirelessly obtain your phone’s MAC address, perform a one-way hash on it, and aggregate the data before selling it to retailers. The data can also be provided to law enforcement agencies with the proper paperwork. The tracking companies insist that they are not snooping on the phone to gather other information, such as contacts and web history.
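The weakness of hashing as an anonymization step is that the MAC address space is tiny by cryptographic standards, so a hashed MAC can simply be searched for by brute force. The sketch below assumes SHA-256 (the article does not name the actual hash) and, for speed, searches only the last byte of a made-up MAC:

```python
import hashlib

def hash_mac(mac: str) -> str:
    """One-way hash as the tracking firms describe it. The exact hash they
    use is not public; SHA-256 of the lowercased MAC string is an assumption."""
    return hashlib.sha256(mac.lower().encode()).hexdigest()

# A phone's MAC and its "anonymized" form.
secret_mac = "a4:5e:60:12:34:56"
leaked_hash = hash_mac(secret_mac)

# The full MAC space is only ~2^48 values, and far fewer once the vendor
# prefix (OUI) is known, so the hash can be inverted by exhaustive search.
# Here we brute-force only the final byte to keep the demo fast.
oui_and_prefix = "a4:5e:60:12:34:"
recovered = None
for last in range(256):
    candidate = f"{oui_and_prefix}{last:02x}"
    if hash_mac(candidate) == leaked_hash:
        recovered = candidate
        break
print(recovered)  # the "anonymized" identifier maps straight back to the MAC
```

This is the core finding behind the “MAC Address Hashing Not a Fix for Privacy Problems” research referenced below: a deterministic hash of a small identifier space is a pseudonym, not anonymization.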

From a wireless security perspective, there are various ways attackers can get at this data. The way the MAC address is captured suggests that, even if it is hashed afterward, its wireless transmission from the consumer’s phone to the retailer’s equipment is not encrypted. Once MAC addresses are compromised, hackers can use them to gather specific data on a consumer by creating a “base station clone,” similar to an “evil twin” attack. This is not something that might happen in the future; it is happening now, and consumers are mostly unaware. Consumers who do not know their movements are being tracked and recorded cannot react to it. Those who do know have the option to opt out on the data tracking companies’ websites, but even then, who knows how many such companies are out there collecting your data. Another option is to turn off the phone’s automatic wireless connectivity, but how many consumers would remember, or even know how, to do that?

Tracking consumers without their knowledge clearly raises privacy concerns. These concerns revolve around the “legitimate” capture of a consumer’s movements, contact lists, text messages, and visited websites, to name a few. There are also technologies in play that could compromise the phone’s camera and microphone, extending the risk beyond the consumer to anyone the consumer interacts with. All of this is advantageous to hackers looking to harm you or people you know, and since our phones are constantly connected to the Internet, we remain vulnerable. Tracking used to be a concern only in the online world; now it is happening in the real world and in real time. This bleeding of online tactics into the real world will only increase, further blurring the lines around privacy. Some consumers prefer to give up “some” of their privacy in exchange for better services and targeted marketing. My concern is that this is just a start, a nudge to see how far consumers will open up their private lives. What is striking is that companies do not actually need anyone’s permission to track you; they have already trampled on that boundary and will continue to “test” how much they can get away with.

Lastly, what is troublesome about all this tracking is that there are currently no strong laws or regulations governing how this information may be used or how it must be safeguarded against malicious attacks. No one appears to be monitoring how far these companies can go or whether special handling is required to protect consumer data. Are we supposed to take these companies at their word that they are doing the right thing? Do consumers really have a choice?

In conclusion, the use of phones to gather information about you as a consumer will continue to increase, and wireless technologies will be the norm. Seen through a broader lens, technology opens doors to new and innovative things, but it can also be exploited for unauthorized access to your information. We as a society have to weigh the pros and cons carefully and keep individuals in the loop, because in the end it comes down to trust and responsibility toward those individuals.




  1. Research Finds MAC Address Hashing Not a Fix for Privacy Problems




Target’s Network Breach

Target’s corporate network was breached between November 27, 2013 and December 15, 2013, resulting in the theft of 110 million credit card and personal records. The breach happened during the busiest days for retailers, between the hours of 10:00am and 6:00pm. When I first heard about it in the news, I assumed the attack was so sophisticated that Target could hardly have preempted it. But as I read articles on the breach, it became clear that although the attack combined social engineering, phishing, and malware techniques, it was not a sophisticated attack.

The articles indicate that the intruders obtained stolen login credentials from one of Target’s vendors, which accessed information on Target’s intranet through a portal set up for vendors. Once inside the network, the intruders reached Target’s Point-of-Sale (POS) payment systems and installed malware on them that captured consumer information.

From a technology perspective, a few gaps stand out:

  • The lack of segmentation of the network to avoid access to payment systems
  • The inability to identify data transfers to an unauthorized File Transfer Protocol (FTP) server
  • The failure to detect that intruders were testing their malware on a few POS systems before they launched a full attack
  • The deliberate ignoring of malware warnings from their internal systems
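The second gap, spotting transfers to an unauthorized FTP server, could have been caught with even a simple egress check against an allowlist. The subnets and addresses below are invented for illustration, not taken from Target’s actual network:

```python
import ipaddress

# Hypothetical allowlist: the only hosts POS terminals should ever reach.
ALLOWED_DESTINATIONS = {"10.0.5.10", "10.0.5.11"}  # payment processors (made up)
POS_SUBNET = ipaddress.ip_network("10.0.20.0/24")  # POS segment (made up)

def flag_suspicious(connections):
    """Return outbound connections from the POS segment to any host not on
    the allowlist -- e.g. an external FTP drop server used for exfiltration."""
    alerts = []
    for src, dst, port in connections:
        if ipaddress.ip_address(src) in POS_SUBNET and dst not in ALLOWED_DESTINATIONS:
            alerts.append((src, dst, port))
    return alerts

log = [
    ("10.0.20.4", "10.0.5.10", 443),    # normal payment traffic
    ("10.0.20.9", "203.0.113.77", 21),  # FTP to an unknown external host
]
print(flag_suspicious(log))
```

The point is not the specific rule but that POS terminals have a tiny, predictable set of legitimate destinations, which makes any deviation easy to flag if anyone is looking.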

From a business perspective, a few gaps stand out:

  • The data breach was not made public until Brian Krebs (a blogger) broke the news
  • Lack of adherence to security processes and practices
  • Too much reliance on just security certifications

The following figure shows the network and the route intruders took to access Target’s POS systems.

Intruders’ path used for Target’s breach

From the figure, we can see multiple issues with Target’s network. Although Target cannot be held directly accountable for vulnerabilities introduced by its vendors, it should have anticipated exploits originating from them. Since the underlying network was shared, it had to be secured and constantly monitored for system spikes and unusual network traffic. What is interesting about this attack is that although a large number of consumer records were stolen, it has not been reported how many corporate accounts were stolen as well. Since 54% of US businesses are small businesses, I suspect corporate accounts were compromised too.

Looking at what happened in this case, security is a management issue. Target’s management failed to see beyond their corporate environment and did not consider holistically how the business processes they put in place could be their downfall. Management also failed to inform consumers. How long would management have waited to inform them? My guess is that if the story had not been leaked, Target’s management would have conveniently forgotten about it.

In conclusion, three broader issues come to mind: (1) organizations are only as strong as their weakest link, especially when that link is the corporate network; (2) although organizations typically provide credit monitoring for a year, they implicitly assume intruders will only use the stolen information within that year; and (3) consumers are left to fend for themselves during and after that year as their credit scores decline. In short, these data breaches must be drastically reduced, since they cause not only loss of reputation but also loss of consumer confidence in an organization’s role as protector of our data. That requires not just credentialing but repeatedly probing the various ways systems can be exploited, from within and outside the organization, even when the underlying network is perceived to be secure.




  1. Target_Kill_Chain_Analysis_FINAL.pdf



5 Questions to Ask About Predictive Analytics

Predictive Analytics is a branch of data mining that uses a variety of statistical and analytical techniques to build models that predict future events and behaviors. It helps find patterns in recruitment, hiring, sales, customer attrition, optimization, business models, crime prevention, and supply chain management, to name a few areas. As we move toward self-learning organizations, it is imperative that we understand the value of Business Analytics in general and Predictive Analytics in particular.
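As a concrete illustration of the technique, the sketch below trains a tiny logistic regression model on synthetic customer-attrition data; the features, weights, and churn rule are all made up for the example:

```python
import math
import random

# Toy illustration (not any vendor's product): predict customer attrition
# from two invented features: months since last purchase, support tickets filed.
random.seed(0)
data = []
for _ in range(200):
    months = random.uniform(0, 12)
    tickets = random.uniform(0, 10)
    # Synthetic ground truth: long inactivity plus many tickets -> churn.
    churned = 1 if (0.5 * months + 0.3 * tickets + random.gauss(0, 1)) > 4 else 0
    data.append(((months, tickets), churned))

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Plain logistic regression trained by full-batch gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.05
for _ in range(2000):
    gw, gb = [0.0, 0.0], 0.0
    for (x1, x2), y in data:
        err = sigmoid(w[0] * x1 + w[1] * x2 + b) - y
        gw[0] += err * x1
        gw[1] += err * x2
        gb += err
    w[0] -= lr * gw[0] / len(data)
    w[1] -= lr * gw[1] / len(data)
    b -= lr * gb / len(data)

def churn_probability(months, tickets):
    return sigmoid(w[0] * months + w[1] * tickets + b)

print(f"Loyal, active customer:     {churn_probability(1, 0):.2f}")
print(f"Inactive, frustrated one:   {churn_probability(11, 9):.2f}")
```

The model itself is trivial; as the rest of this section argues, the hard part is the organizational context in which such predictions get made and used.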

It turns out that Predictive Analytics is really about Business Transformation. But for that transformation to take place, you have to take the organizational context into account in the following ways:

  1. Strategic Perspectives: Not all organizations are the same, so what works in one organization might not work in yours. Based on your organization’s maturity, decide whether Predictive Analytics should be a top-down, bottom-up, cross-functional, or hybrid effort. Also consider what should be measured and for how long, while staying flexible: insights may come from data that initially seems unrelated.
  2. Tactical Perspectives: One of the key factors in Business Transformation is change management. You need to understand how a change would affect your organization in terms of people, processes, and technologies, along with the practical implications of the change and the training your organization will need.
  3. Operational Perspectives: This is about how Predictive Analytics is executed within your organization. To fully integrate it, learn from best practices, understand the pros and cons of your technology infrastructure, and determine whether the available tools are intuitive enough for people to actually use them.

Now that you understand the different organizational perspectives, it is time to ask the following questions, each in both its descriptive and its normative form:

  1. Who uses Predictive Analytics to make decisions? Who should use it?
  2. What happens to decisions when Predictive Analytics is used? What would happen if it were used?
  3. Where does the data for Predictive Analytics come from? Where should it come from?
  4. When is Predictive Analytics relevant? When should it be relevant?
  5. Why is Predictive Analytics being used? Why should it be used?

As you ask these questions, keep in mind that the reliability of the information, and how it is used within the organization, is paramount. A pretty picture does not guarantee that the insights are correct, but you can reduce decision-making errors by having people who understand what the data actually means and what it does not.


