Over the long run, is cloud computing a waste of money? Some startups and other “asset-lite” businesses seem to think so. However, for specific use cases, cloud computing makes a lot of sense—even over the long haul.
With the rise of peer-to-peer sharing, it also makes sense that cloud computing—essentially the “resource pooling” and renting of compute and storage—would gain traction. But just as there are risks in sharing property and other assets, there are also risks in sharing cloud computing infrastructure.
IT leaders often express dismay at the process involved in not only forecasting CAPEX needs, but then stepping through arduous internal CAPEX budget approvals. What’s all the fuss with CAPEX, and why is it so difficult to obtain?
In a highly competitive macroeconomic climate, companies seek to reduce costs and either drive those savings to the bottom line or repurpose them for innovative projects. Cloud computing is often seen as one such avenue toward cost reduction, since companies can ultimately cut capital expenditures and data center operating costs. However, as good as the concept of “cost savings” sounds, you might be surprised to discover that according to one analyst firm, cost reduction isn’t the primary driver for cloud computing.
Future NFL Hall of Fame quarterback Peyton Manning is tough to beat. What’s his secret? Is it accuracy, the ability to throw a “catchable ball,” or the capability to diagnose defenses quickly? The answer is probably all of the above, to some degree. Yet stated another way, Manning’s offensive excellence comes down to two things: simplicity and the ability to execute.