Like many industries, the IT industry has a habit of latching onto buzzwords and applying them everywhere. The term “cloud” is no exception; its use is varied and often inaccurate. As a starting point for our discussion, then, let us cite the definition of cloud computing published by the National Institute of Standards and Technology (NIST) in September 2011:
“Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”
At the time this definition was published, “cloud” was already part of industry parlance and was beginning to take root in the general lexicon. Global business spending on cloud-related infrastructure and services had already topped $78 billion. In 2014, enterprise spending on the cloud is expected to reach an estimated $174 billion, climbing to $235 billion by 2017. Inevitably, as both business IT and consumer mindsets evolve toward the cloud in the coming years, we will continue to witness dramatic growth in some areas of IT products and significant reductions in others, reshaping the industry as a whole.