According to Gartner, “Internet of Things (IoT) is the network of physical objects that contain embedded technologies to communicate and sense or interact with their internal states or the external environment.” The communication of these physical devices with one another and/or with their environment (e.g., other physical devices, information systems, etc.) generates tremendous amounts of data. Depending upon the end goal, this data can be used for anything from determining foot traffic in retail stores to monitoring environmental effects on trees. Since there are so many different uses for the data generated by these IoT devices, it is difficult to predict how many of these devices will be in use in the future. However, Gartner has taken a stab at this and estimates that by 2020 there will be about 26 billion active IoT devices in use. With so many devices coming online in such a short period, it would be naïve of organizations to think that these IoT devices will have no effect on existing business models and operations.
In order to understand the challenges and opportunities, the following questions need to be asked about your current and future uses of IoT devices:
Who uses them?
Who should use them?
What business processes use them?
What business processes would use them?
Where is the data being captured?
Where should the data be captured?
When are they used?
When would they be used?
Why do they affect the bottom line?
Why would they affect the bottom line?
When you are asking the above questions, keep in mind that organizations that know how to increase the bottom line through the effective use of technology will get into your space before you even think of them as competition. This means that, as an organization, you have a choice: either ignore the IoT revolution or get ahead by fully immersing yourself in how IoT can provide competitive advantages across all business units.
In conclusion, while it may seem that IoT is only an “IT thing,” in reality IoT affects the business side more than it affects the IT side, and not leveraging it can mean the difference between staying relevant and quickly becoming irrelevant.
Service Oriented Architecture (SOA) is a framework that allows business processes to be highlighted to deliver interoperability and rapid delivery of functionality. The benefits of SOA include reuse of generalized services, reduced costs, and better business and IT alignment. If done correctly, it helps an organization respond to ever-changing business needs efficiently. If done incorrectly, it can create bureaucracy and silos. This article evaluates the decisions, assumptions, and conclusions made by the research paper, SOA Migration Case Studies and Lessons Learned.
In the research paper, two research and evaluation methods are used to assess different cases for SOA. The first method is the Case Study Method, where the researchers develop a theory and, based on that theory, develop criteria to select the case studies that will be assessed. This Case Study Method is shown below:
The second method uses a customized version of the Evolution Process Framework (EPF) to evaluate SOA. This customized version is called EPF4SOA and the phases involved in the evaluation are shown below:
With the research and evaluation methods in place, the research paper goes on to assess three multibillion-dollar organizations that have been around for at least 50 years. These organizations have legacy systems that have become archaic, leaving them unable to respond to rapidly changing business needs. Keeping these limitations in mind, the organizations set out to extract as much functionality from these legacy systems as possible by creating SOA services that could be used across the organization. Based on the EPF4SOA, the research paper goes on to claim that for effective SOA migration, organizations need a strong business case, services design, technology selection, SOA governance, and education and training.
As we read this report, it seems obvious that the researchers have done a good job of evaluating these large organizations from the finance and telecommunications sectors and of highlighting the lessons learned on SOA migrations. However, the researchers have made some decisions and assumptions that need to be understood. Firstly, in the Case Study Method, there seems to be an element of confirmation bias when the cases being selected are based on an initial theory. This bias can lead to selecting cases that fit what the researchers are looking for, rather than selecting cases first and then determining what theories can be derived from them. Secondly, the research looks at organizations that have been around for a long time and by their very nature are likely to have legacy system issues, and it assumes that other organizations would have the same issues. Lastly, the report suggests that SOA can help with business and technology alignment but does not take into account the strong leadership and organizational change management capabilities that are needed for SOA migrations.
Keeping the above in mind and carefully reading the case descriptions, we can extrapolate some potential challenges in the cases presented in this report. These challenges are explained below:
SOA is not only an IT concern: One of the lessons learned in this report indicates the need for a strong business case for SOA, developed by IT in order to get management support. The fundamental problem with this lesson is that it automatically puts the burden of implementing SOA across the entire organization on IT and takes away the business side’s responsibility and involvement with SOA. While IT is responsible for creating SOA services, the business has to work collaboratively with IT. The business has to understand how the organization reached a point where quick system changes became difficult, and how to avoid such situations in the future. Thus, SOA is not only an IT issue but an organizational endeavor that involves all parts of the organization.
Organizational processes need to be reevaluated: One of the cases mentions the presence of too many point-to-point integrations that reduce the organization’s ability to be agile. While this might be the case, there is a bigger perspective missing here: the organizational processes that led to this situation in the first place. These processes involve not only IT but also the business side. It seems that in this case IT would do whatever the business asked of it, but there has to be some mutual understanding that requests must be understood holistically. Even after an SOA migration, if these organizational processes are not optimized, ad-hoc requests from the business may still lead back to point-to-point integrations.
A long-term view on legacy systems is needed: The cases in the report indicate that replacing the legacy systems was not an option since it would be too costly. While in the short term this seems like a good idea, in the long term there are issues with this approach. These include the constant “patching” needed to upgrade underlying hardware and software, as well as overburdening legacy systems by adding new services on top of systems that should be replaced rather than having their lives continually extended. While it might not be possible for some organizations to replace legacy systems altogether, there should be a plan to eventually retire these systems in favor of new ones.
No measurements mean no ROI exists: While some organizations in the research report did measure SOA migration ROI, that was done after the fact. So, if the organizations were not measuring pre-SOA, how would they know whether what SOA migrations promised is what the organization was actually able to achieve? Herein lies the problem: quantification and justification are made to show SOA being a success without doing the due diligence before embarking on the SOA journey.
In addition to the problems identified above, the research report does not put enough emphasis on the importance of governance for SOA. Let’s explore what governance is and why it could be one of the differentiating factors in SOA migrations.
Governance: Governance is the policy of how things should be done; it provides a framework in which business processes can operate under regulatory, time, and other constraints. Thus, governance is an organizational responsibility, even for SOA, and not only an IT one. To accomplish this, a governance board should be set up consisting of a cross-functional team from both IT and business. Additionally, governance should include not only the overall organization and management of SOA activities but also the creation of success and failure measurements. These measurements should be used to determine the actual state of SOA within the organization, instead of people producing vaporware measurements that have no grounding in reality.
In conclusion, while the research report is interesting in its own right, it should not be taken as the definitive set of lessons learned for successful SOA migrations. Based on only a few cases, these lessons cannot simply be applied across other kinds of organizations, such as smaller organizations, governments, and nonprofits, and should be taken with a grain of salt. The lessons learned should be a start, but not the bible, for successful SOA implementations. A successful SOA implementation will depend upon context, processes, technologies, and people since, broadly speaking, SOA is an organizational change management journey.
According to the Federal Communications Commission (FCC), spectrum is the range of electromagnetic radio frequencies used to transmit sound, data, and video across the country. This range includes all possible frequencies from 0 hertz to infinity, and these frequencies are divided into contiguous service bands that are further divided into channels. The use of different frequencies for wireless communication is not new. The organizations that use these frequencies include defense (e.g., drones, weapons), government (e.g., police, fire departments), and commercial entities (e.g., phone services by telecoms, TV stations). Individuals also utilize these frequencies (e.g., Bluetooth, 802.11 wireless routers at home) for different purposes. Thus, due to the widespread usage of different frequencies, it becomes important that we understand the discussions at the FCC regarding opening up different parts of the spectrum.
Although the available frequencies extend to infinity, the range from 500 MHz to 10 GHz is the most useful, and it is thus called the golden zone. Within the golden zone, the lower the frequency, the better the wireless signals propagate, and thus less infrastructure (e.g., radio towers, signal boosters) is required. To avoid signal congestion, different organizations operate in different parts of the spectrum. For example, Verizon operates at 800 MHz and 1900 MHz, Sprint at 1900 MHz, and TV broadcasters at 600 MHz. From this, we can see that TV operates at an optimal frequency within the golden zone.
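The propagation advantage of lower frequencies can be made concrete with the standard free-space path loss formula, FSPL(dB) = 20·log10(d) + 20·log10(f) + 32.44, with distance in kilometers and frequency in megahertz. A minimal sketch comparing the TV band with a typical cellular band (the 10 km distance is an illustrative assumption, not a figure from the article):

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Over the same 10 km path, a 600 MHz signal loses roughly 10 dB less
# than a 1900 MHz signal, so it needs fewer towers and boosters.
loss_600 = fspl_db(10, 600)    # TV broadcast band
loss_1900 = fspl_db(10, 1900)  # typical cellular band
advantage_db = loss_1900 - loss_600
```

Because the frequency term is logarithmic, this roughly 10 dB advantage for 600 MHz over 1900 MHz holds at any distance, which is why the lower band "travels better" through long distances and buildings.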
Due to the explosion in mobile device usage, there is increasing demand for better wireless signal transmission. In light of this demand, the FCC is considering requesting/forcing TV broadcasters to free up their 600 MHz spectrum, which travels better over long distances and through buildings, so that telecoms can use it. The disagreement in this FCC consideration is over how this spectrum should be opened up. One option is an open auction where telecoms can bid freely; the other is that large telecoms should face an upper limit on how much spectrum they can bid for. In the telecom industry, the selection of either option matters. The reason is that in an open auction, the telecom with the deepest pockets would take most if not all of the released spectrum, leaving the smaller telecoms with limited options.
So, why should these “spectrum wars” matter to consumers? The reason is that if the large telecoms were able to get all the released TV frequencies, smaller telecoms would be forced to use higher frequencies. This means that smaller telecoms would have to spend more on infrastructure and on maintaining expensive equipment. They would then pass these expenses on to consumers in the form of increased prices. Since most consumers are price conscious and want better quality and reliability from their phone service, they would select the large telecoms as their best option. Once consumers start making this decision, it would be only a matter of time before the business model for smaller telecoms becomes unsustainable and they have to close shop. After the smaller telecoms have disappeared, we would see a lack of competition, monopoly, and future headaches for consumers.
For consumers, the future headaches would entail increased prices and eliminated services. Even without the actual allocation of frequencies, there is a trend among larger telecoms to take advantage of their size, and they have started doing this by eliminating unlimited data plans. If this is happening now, imagine the chokehold the larger telecoms would have on consumers once they acquire the TV spectrum. What else would these telecoms be able to do then?
In conclusion, when consumers close their eyes to what is coming, when they are most concerned with the immediate rather than the distant future, they will be exploited by large organizations. Beyond the benefit telecoms would gain from attaining TV frequencies, the question becomes: if TV broadcasters are giving up their frequencies, what are these broadcasters going to do? If the TV broadcasters move to cable, then large telecom companies (e.g., Verizon) would still benefit, since they would provide not only phone service but also TV service and content creation. This seems to me another attempt to monopolize every media medium, and it would most likely result in corporations that become too big to fail. Perhaps for shareholders this would be a good thing, but for the average consumer it would mean lack of choice and thus lack of freedom.
In his article, Nick Carr argues that in the current business environment, Information Technology (IT) does not provide any strategic advantage but is merely an operational necessity. He equates IT to a commodity much like electricity and mainly talks about IT infrastructure becoming a commodity. Let’s explore this in the context of the Internet and Internet technologies:
The Internet is a network of networks that connects varied computers via switches to allow transmission of data across multiple networks using Internet protocols. Some of the popular uses of the Internet include email, instant messaging, and browsing the World Wide Web (WWW), to name a few. In today’s society, the Internet has become an important tool for individuals and organizations to conduct their business. Its use has become so ubiquitous that individuals and organizations don’t even think about it and assume it to be always available, but does that mean the Internet has become a commodity? In this context, I would agree with Nick that the Internet has now become very similar to a commodity, since we are all accessing the same Internet regardless of the medium by which we access it.
Internet technologies include browsers and search engines that help us navigate the WWW of the Internet. From Nick Carr’s perspective, these Internet Technologies are commodities and do not provide any strategic value. I disagree with this claim and here is why:
Browsers: Currently, browsers are used both to browse the WWW and internally by organizations to access corporate systems such as Enterprise Resource Planning (ERP) and Customer Relationship Management (CRM) systems via a web interface. Thus, the security and privacy capabilities of these browsers become paramount in safeguarding organizations against malicious attacks. While on the surface these browser issues may seem operational in nature, closer inspection reveals their strategic importance. For example, if an organization chooses a browser that has weaker security than another, the organization becomes vulnerable to exploits of that browser. These exploits can range from simple hacking attacks to the siphoning of organizational data. So, the selection of a browser is not just an operational activity; I believe it to be a strategic necessity.
Search: A McKinsey report, The Impact of Internet Technologies: Search, indicated that web search provides value that includes the creation of new business models. An example of this is price comparison, where users can compare prices of what they are buying (e.g., airline tickets, hotel rooms, etc.) from various vendors on one website. This price comparison is not only useful for users; corporations can also use it to determine whether they are competitively priced. Since making your organization competitive is a strategic consideration, search capabilities are important for the organization’s future.
In conclusion, the oversimplification of, and lack of understanding about, how the nuances of technology can affect organizations strategically is not only unsettling but also ill-informed. IT is not just one thing, and claiming that it is, while cherry-picking data to show it, can lead to unintended consequences.
Net Neutrality is the concept that no company should be able to vary the level of service it provides based on the content that passes through its network. In other words, there should be a level playing field for everyone to use the Internet and its content.
Companies like AT&T, Comcast, and Verizon, which provide the underlying Internet “wires” as cable companies and Internet Service Providers (ISPs), are now interested in producing content. From their corporate standpoint this makes sense, since these companies are exploring new areas of revenue generation to maximize profits. At the same time, this new direction puts them in direct competition with content providers such as Netflix and HBO. While competition is good in the marketplace, this new direction gives the cable companies and ISPs an unfair advantage: they can deliver their own content faster than their competition because they own the wire.
From a network management perspective, in order to undermine Net Neutrality, cable companies and ISPs would look to Quality of Service (QoS), specifically network traffic shaping and metrics.
Traffic Shaping – The main purpose of traffic shaping is to regulate network traffic entering the network at access points. This is done to prevent overloading of the network and to assign queuing priorities based on complex algorithms. Depending upon corporate preferences, these algorithms can filter out content deemed unnecessary and assign certain percentages of capacity to particular applications. From a Net Neutrality point of view, this means that cable companies and ISPs could completely filter out a competitor’s content, significantly slow that content, and increase the percentage of capacity given to their own applications.
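To make the mechanism concrete, here is a minimal sketch of one classic shaping primitive, the token bucket (the class name and rates are illustrative, not drawn from any particular vendor's implementation). A provider that assigns different buckets to different traffic classes is, in effect, assigning each class a percentage of capacity:

```python
import time

class TokenBucket:
    """Minimal token-bucket shaper: a packet is forwarded only while
    tokens remain; the bucket refills at `rate` tokens per second."""
    def __init__(self, rate, capacity):
        self.rate = rate            # tokens added per second (shaped rate)
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, packet_size):
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_size:
            self.tokens -= packet_size
            return True   # packet conforms: forward immediately
        return False      # packet exceeds the shaped rate: queue or drop
```

Giving your own video service a bucket with a high rate and a competitor's service one with a low rate is exactly the kind of discrimination the Net Neutrality debate is about; the mechanism itself is neutral, the policy attached to it is not.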
The big question is not whether this will happen but what the government can do to monitor and prevent it. Given the current state of government budgets, I would argue that the government would not actively monitor the networks but would instead be reactive, waiting on complaints from the competitors of cable companies and ISPs. Even if these complaints are legitimate and indicate unfair competition, the loss of revenue during litigation might be unsustainable for these competitors. Thus, not only would these competitors be slowly eliminated, but future competitors would also be deterred from entering the marketplace, since they would be aware that cable companies and ISPs have the unfair advantage of manipulating network traffic.
Metrics – In order to measure QoS, we need certain metrics. These metrics help us compare and contrast services so that we can understand and improve them. Additionally, they arm us with information that individuals and organizations can use to make decisions. From a Net Neutrality point of view, this means that cable companies and ISPs could increase or decrease QoS based simply on their own criteria. These criteria might include affecting the availability of a competitor’s network, increasing error rates through retransmission, degrading latency and jitter so that a competitor’s customer satisfaction declines, slowing the loading of applications, and creating Service Level Agreements (SLAs) that guarantee the best service only to a select few who can afford premium prices.
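Two of the metrics mentioned above, latency and jitter, are straightforward to compute from round-trip-time samples. A minimal sketch (the function name and sample values are illustrative; jitter here is estimated simply as the mean variation between consecutive samples):

```python
from statistics import mean

def latency_and_jitter(rtt_samples_ms):
    """Summarize two common QoS metrics from round-trip-time samples:
    average latency, and jitter as the mean absolute difference
    between consecutive samples."""
    latency = mean(rtt_samples_ms)
    diffs = [abs(b - a) for a, b in zip(rtt_samples_ms, rtt_samples_ms[1:])]
    jitter = mean(diffs) if diffs else 0.0
    return latency, jitter

# A single 35 ms spike among ~20 ms samples barely moves average
# latency but inflates jitter, which is what degrades real-time
# applications like voice and video.
latency, jitter = latency_and_jitter([20, 22, 21, 35, 20])
```

Numbers like these are what SLAs are written against, which is why control over how they are measured, and for whom, carries so much competitive weight.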
For larger and well-funded organizations, the manipulation of QoS metrics with help from cable companies and ISPs would guarantee survival, and it would drastically undermine the startup mentality that new companies embrace when they treat the Internet as a fair playground for everyone to compete in. Regular consumers might not see how these mafia-style tactics affect them, but by the time they realize it, it would be too late, and they would be left with only a few choices of service providers. Eventually, this lack of choice would leave customers feeling helpless and questioning why there are no innovative companies out there to increase competition. The reason, unbeknownst to the customers, would be how large corporate lobbying and individual self-interest twisted the arms of the government to create an unfair advantage decades down the road.
In conclusion, the concept of freedom and openness, a founding principle of the United States under its constitution, is being threatened in the age of the Internet. Not only would the loss of Net Neutrality affect domestic competition, but it would also significantly affect global growth, since most of the Internet’s wires are owned by US corporations. Perhaps this is a way to stay competitive by being unfair to the rest of the world. At the end of the day, the power of the individual would be taken away, and the power of the select few would continue to increase. Perhaps it is time to have leaders who understand technology in a global context and are not afraid of healthy global competition.