September 16, 2016 | Virtual Strategy Magazine
Companies often leverage the cloud because of its promise of agility. The ability to flex an application up or down according to need by merely adding or subtracting server nodes lets companies benefit from economies of scale. By essentially “renting” computing power from someone who can provide it on demand, companies pay for exactly what they need, just like a utility. In this light, many reasonably assume that all cloud applications can “scale out” in this fashion: provisioning extra server nodes when capacity is needed and removing them when demand diminishes. However, this is not always the case.
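To make the scale-out idea concrete, here is a minimal, hypothetical sketch of the kind of proportional policy an autoscaler might apply: grow or shrink the node pool so that average load moves toward a target. The function name, thresholds, and bounds are all illustrative assumptions, not anything prescribed by the article.

```python
import math

def desired_nodes(current_nodes, avg_cpu_percent,
                  target_percent=60, min_nodes=1, max_nodes=10):
    """Return a node count that brings average CPU near the target,
    clamped to [min_nodes, max_nodes]. Purely illustrative."""
    if avg_cpu_percent <= 0:
        return min_nodes
    # Proportional rule: scale the pool by the ratio of observed
    # load to target load, rounding up so we never under-provision.
    needed = math.ceil(current_nodes * avg_cpu_percent / target_percent)
    return max(min_nodes, min(max_nodes, needed))

print(desired_nodes(4, 90))   # overloaded pool: grows to 6 nodes
print(desired_nodes(4, 30))   # underused pool: shrinks to 2 nodes
```

An application only benefits from a policy like this if adding a node actually adds usable capacity, which is exactly the assumption the rest of the article questions.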