While it’s certainly true that many things in this world are physically scarce, living in the information age requires us to retrain our minds: ditch scarcity thinking and instead embrace “sky’s the limit” abundance.
For the third year in a row, cloud computing is one of the top three technology investments for CIOs. Yet misconceptions about “the cloud” abound. I have encountered five common myths, and it’s high time to debunk them.
Is cloud computing a waste of money over the long run? Some startups and other “asset-light” businesses seem to think so. For specific use cases, however, cloud computing makes a lot of sense, even over the long haul.
With the rise of peer-to-peer sharing, it makes sense that cloud computing, which is essentially the pooling and renting of compute and storage resources, would also gain traction. But just as there are risks in sharing property and other assets, there are risks in sharing cloud infrastructure.
IT leaders often express dismay at the process involved in not only forecasting CAPEX needs, but then stepping through arduous internal CAPEX budget approvals. What’s all the fuss about CAPEX, and why is it so difficult to obtain?