For the third year in a row, cloud computing is one of the top three technology investments for CIOs. However, misconceptions about “the cloud” abound. I have encountered five common myths, and it’s high time to debunk them.
Over the long run, is cloud computing a waste of money? Some startups and other “asset-light” businesses seem to think so. However, for specific use cases, cloud computing makes a lot of sense, even over the long haul.
With the rise of peer-to-peer sharing, it makes sense that cloud computing, which is essentially the pooling and renting of compute and storage resources, would gain traction as well. But just as there are risks in sharing property and other assets, there are risks in sharing cloud computing infrastructure.
IT leaders often express dismay at the process involved in not only forecasting CAPEX needs but also stepping through arduous internal budget approvals. What’s all the fuss with CAPEX, and why is it so difficult to obtain?
In a highly competitive macroeconomic climate, companies seek to reduce costs and either drive those savings to the bottom line or repurpose them for innovative projects. Cloud computing is often seen as one such avenue toward cost reduction, since companies can ultimately cut capital expenditures and data center operating costs. However, as good as the concept of “cost savings” sounds, you might be surprised to discover that, according to one analyst firm, cost reduction isn’t the primary driver for cloud computing.