Why Returning $1 Trillion to Shareholders is a Bad Idea

According to the Financial Times, companies are on a record pace to return over one trillion dollars to shareholders this year via share buybacks and dividends. With creaking IT infrastructures and under-investment in other areas such as plants, equipment, and employee training, this isn’t just a flawed strategy; it’s a dangerous one for the future health of companies across the globe.

Investors are tired of companies hoarding cash. While much of this cash is often locked away on international balance sheets, there is clamor to return a significant chunk of it to investors via dividends and share buybacks. And just about every company of significant size is either boarding or already riding the buyback gravy train. According to an FT article, “dividends have climbed on average 14 per cent annually over the past four years” and buybacks are expected “to rise at a double-digit rate this year.”

While $1T is expected to go back to shareholders, the strategy is not without critics. There is concern that returning such a large amount of cash to shareholders simply inflates stock prices and earnings per share growth, consequently leaving companies starved for investment.

“We haven’t seen much allocation of resources to capital,” says Bruce Kasman, head of economic research for JPMorgan. This is concerning, as such business investment arguably helps companies meet the needs of customers today and tomorrow. For example, this article shows the consequences of under-investment in keeping legacy systems performing in the banking industry. Even more troubling is the lack of investment in modern, cutting-edge technologies that might enable killer customer-facing applications and improve customer satisfaction scores.

While burgeoning balance sheets may require some share buybacks—or at least enough to offset stock-based compensation for directors and above—$1T does seem an excessive sum, especially while companies are spending 80% of their ever-so-flat IT budgets keeping antiquated legacy systems functional.

If not share buybacks and dividends, where else could those monies be spent? For starters, while there’s under-investment in IT on the whole, there are promising new technologies in play that could really make a difference for companies: mobile payments, embedded sensors for manufacturing systems, robotics, and even Hadoop and its related YARN applications.

Moreover, with practically a data breach or destructive IT attack in the news each day, there’s a woeful under-investment in security (physical and software) to safeguard customer data. Another idea: investing in skills and technical training so that employees can serve customers better; this ultimately helps increase customer satisfaction and may improve company revenues.

Too many share repurchases may end up hurting the future performance of companies, especially when so many company departments, divisions and systems are withering on the vine from lack of investment. Not to mention the innovation and creativity that’s kept at bay while investors and activist shareholders gorge themselves on the cash flooding into their accounts.

There’s no way to avoid returning some cash to appease activist investors. But for the majority of it, there must be investment opportunities for innovation that aren’t getting a fair shake. VC icon Peter Thiel famously said: “We wanted flying cars, instead we got 140 characters.” Let’s not sell ourselves short on innovation. We can do better than returning $1T to shareholders via buyback binges. Indeed, we must for our companies to have a fighting chance in the future.

Problems with the Language of Probability

The language of probability is clear to statisticians and most scientists—they understand terms such as “correlation”, “statistically significant”, “confidence” and more. However, using probabilistic terminology to communicate the “likelihood” of an event to those untrained in such terms can in some instances lead to the ruin of careers and companies, and in the worst cases, loss of life.

“The After Shocks” is a disturbing article on what some have called “science on trial”. In 2009, a swarm of small earthquakes hit L’Aquila, a small town in the mountains of central Italy. The area—much like the region along the San Andreas fault in California—is prone to continual earthquakes. In fact, over the centuries there have been tens of thousands of earthquakes around L’Aquila, some having little effect and others killing hundreds of people.

Citizens of L’Aquila constantly live with an underlying fear of “the big one”—an earthquake so big that it shakes buildings to their foundations. So when a series of tremors hit the area in 2009, citizens were keen to get answers to questions such as: “Is the big one coming soon?” and “If so, should I be leaving my home?” With most of the homes in the L’Aquila area unreinforced—and thus unable to withstand a sizeable earthquake—government authorities came to the rescue by convening a scientific panel to answer citizens’ questions.

What happened next is a travesty. According to the above article, one of the scientists, Enzo Boschi, examined the earthquake data and concluded: “a large earthquake along the lines of the 1703 event (the last one that killed 10,000) is improbable in the short term, but the possibility cannot definitively be excluded.”

Let’s dissect the use of the word “improbable”. Most statisticians would define “improbable” as a low probability, but definitely not zero. It appears this is what Boschi meant, and as further evidence, notice how he qualified his statement: “but the possibility cannot definitively be excluded”. However, the article notes that to the untrained ear—and worse, the media—improbable means “ain’t gonna happen”.

Long story short: the small shakes in L’Aquila eventually led to the big one six days later. On April 6, 2009, cumulative probabilities caught up with L’Aquila, and a 6.3 magnitude earthquake killed 308 people. After weeks of digging through the rubble, enraged citizens brought a lawsuit against the scientists, accusing them of negligence in not adequately sizing the risk of a large earthquake. In 2012, the scientists were convicted of manslaughter; in November 2014, however, they won their appeal and are now free from jail—but not free from the costs of their legal defense.
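The phrase “cumulative probabilities” deserves a quick illustration. An event that is genuinely improbable on any given day becomes quite probable over a long enough horizon. A minimal sketch, using an entirely made-up daily probability rather than any real seismic hazard figure:

```python
# Illustrative only: 0.2% is an assumed daily probability of a major
# quake, not a figure from any L'Aquila hazard model.
daily_p = 0.002

for days in (1, 30, 180, 365):
    # P(at least one event in n days) = 1 - (1 - p)^n
    cumulative = 1 - (1 - daily_p) ** days
    print(f"{days:>3} days: {cumulative:.1%}")
```

At 0.2% per day, “improbable” becomes roughly a 6% chance over a month and better than a coin flip over a year. That is exactly the nuance that “ain’t gonna happen” erases.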

There are challenges in the language of probability. What do terms such as “unlikely”, “serious possibility”, “likely” and others actually mean? The trained scientist might know in his or her mind how they are defined, but does your typical business associate, much less your CEO, understand?
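One practical hedge, loosely inspired by the “words of estimative probability” that intelligence analysts have long used, is to publish the numeric range each word is intended to carry. The ranges below are illustrative assumptions for the sketch, not any official standard:

```python
# Illustrative mapping of probability words to numeric ranges. These
# ranges are assumptions for demonstration; a team should agree on its
# own mapping before attaching such words to forecasts.
ESTIMATIVE_WORDS = {
    "almost certain":    (0.90, 0.99),
    "likely":            (0.60, 0.90),
    "roughly even odds": (0.40, 0.60),
    "unlikely":          (0.10, 0.40),
    "improbable":        (0.01, 0.10),  # low, but definitely not zero
}

def describe(term: str) -> str:
    low, high = ESTIMATIVE_WORDS[term]
    return f'"{term}" here means a {low:.0%} to {high:.0%} chance'

print(describe("improbable"))  # "improbable" here means a 1% to 10% chance
```

The particular numbers matter less than the discipline: everyone reading the forecast sees the same numbers behind the words.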

Surely, when we have data we can make calculations to estimate the probability of an event. But what happens when we do not? Subjective probability statements—where we’re trying to measure belief—can also get us in trouble if we don’t agree on definitions, especially for events that have never occurred.

We should not eliminate the language of probability. Even though we really don’t know everything that can happen, we still have to run our businesses and predict what’s coming next. However, we must also remember that what is “likely” to us may be deemed “unlikely” to another—especially if they have a preconceived notion in mind. We should also remember that sometimes the use of statistics and probability gives us the illusion of control, where in fact there is none.

As we communicate to those not trained in the language of probability, it is critical to couch our language with key qualifiers such as “estimate”, “educated guess”, “margin of error”, “rare does not mean impossible” and more. We should avoid generalizations and any language that could be misinterpreted as “a sure thing” or “no chance in heck”. Barring that, the best solution is to keep our expert opinions to ourselves.

Is Your IT Architecture Ready for Big Data?

Built in the 1950s, California’s aqueduct is an engineering marvel that transports water from Northern California mountain ranges to thirsty coastal communities. But faced with a potentially lasting drought, the aqueduct is running below capacity because there’s not enough water coming from its sources. In terms of big data, just the opposite is likely happening in your organization—too much data, overflowing the river banks and causing havoc. And it’s only going from bad to worse.

The California aqueduct is a thing of beauty. As described in an Atlantic magazine article:

“A network of rivers, tributaries, and canals deliver runoff from the Sierra Mountain Range’s snowpack to massive pumps at the southern end of the San Joaquin Delta.” From there, these hydraulic pumps push water to California cities via a forty-four-mile aqueduct that traverses the state and dumps into various local reservoirs.

You likely have something analogous to a big data aqueduct in your organization. For example, source systems throw off data in various formats, which probably go through some refining process and end up in relational format. Excess digital exhaust is conceivably kept in compressed storage onsite or at a remote location. It’s a continual process whereby data are ingested, stored, moved, processed, monitored and analyzed throughout your organization.

But with big data, there’s simply too much of it coming your way. Author James Gleick describes it this way: “The information produced and consumed by humankind used to vanish—that was the norm, the default. The sights, the sounds, the songs, the spoken word just melted away. Now expectations have inverted. Everything may be recorded and preserved, at least potentially: every musical performance; every crime in a shop, elevator, or city street; every volcano or tsunami on the remotest shore.” In short, everything that can be recorded is fair game, and likely sits on a server somewhere in the world.

The IT architecture that got us here isn’t going to be able to handle the immense data flood coming our way without a serious upgrade in capability and alignment.

IT architecture can essentially be thought of as a view from above, or a blueprint of various structures and components and how they function together. In this context, we’re concerned with what an overall blueprint of business, information, applications and systems looks like today and what it needs to look like to meet future business needs.

We need a rethink of our architectural approaches for big data. To be sure, some companies—maybe 10%—will never need to harness multi-structured data types. They may never need to dabble with or implement open source technologies. To recommend some sort of “big data” architecture for these types of companies is counter-productive.

However, the other 90% of companies are waking up and realizing that today’s IT architecture and infrastructure won’t meet their future needs. These companies desperately need to assess their current situation and future business needs, and then design an architecture that will deliver insights from all data types, not just those that fit neatly into relational rows and columns.

The big data onslaught will continue for the foreseeable future, and is only going to grow more intense with exponential data growth. But here’s the challenge: the human mind tends to think linearly—we simply don’t know how to plan for, much less capitalize on, exponential growth. As such, the business, information, application and systems infrastructures at most companies aren’t equipped to cope with, much less harness, the coming big data flood.
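To make the linear-versus-exponential gap concrete, here is a minimal sketch with hypothetical figures: 1 PB under management today and volume doubling every two years, against a linear capacity plan that adds a fixed half-petabyte per year.

```python
# Hypothetical figures: 1 PB under management today, data volume
# doubling every two years, versus a linear plan adding 0.5 PB/year.
start_pb = 1.0
for year in range(0, 11, 2):
    exponential_need = start_pb * 2 ** (year / 2)  # doubling every 2 years
    linear_plan = start_pb + 0.5 * year            # fixed yearly additions
    print(f"year {year:>2}: need {exponential_need:5.1f} PB, "
          f"planned {linear_plan:4.1f} PB")
```

By year ten the exponential curve demands 32 PB while the linear plan has provisioned 6. That gap is the architecture problem in miniature.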

Want to be prepared? It’s important to take a fresh look at your existing IT architecture—and make sure that your data management, data processing, development tools, integration and analytic systems are up to snuff. And whatever your future plans are, consider doubling down on them.

Until convincing proof shows otherwise, it’s simply too risky not to have a well-thought-out plan to cope with the stormy days of too much big data ahead.

Changing Your Mind About Big Data Isn’t Dumb

After all the hype about big data and its technological cousin Hadoop, some CIOs are getting skittish about investing additional money in a big data program without a clear business case. Indeed, in terms of big data it’s OK to step back and think critically about what you’re doing, pause your programs for a time if necessary, and—yes—even change your mind about big data.

Economist and former Federal Reserve Chairman Alan Greenspan has changed his mind many times. In a Financial Times article, columnist Gillian Tett chronicles Greenspan’s multiple positions on the value of gold. Tett says that in his formative years, Greenspan was fascinated with the idea of the gold standard (i.e., pegging the value of a currency to a given amount of gold), but he later became a staunch defender of fiat currencies. And now, in his sunset years, Greenspan has shifted again, saying: “Gold is a currency. It is still, by all evidence, a premier currency. No fiat currency, including the dollar, can match it.”

To me at least, Greenspan’s fluctuating positions on gold reflect a mind that continually adapts to new information. Some would view Greenspan as a “waffler”, someone who cannot make up his mind. I don’t see it that way. Changing your mind isn’t a sign of weakness; rather, it shows pragmatic and adaptive thinking that adapts as market or business conditions shift.

So what does any of this have to do with the concept of big data? While big data and associated big data technologies have enjoyed plenty of hype, there’s a new reality setting in regarding getting more value from big data investments.

Take for example a Barclays survey in which a large percentage of CIOs were “uncertain”—thus far—as to the value of Hadoop, because of the ongoing costs of support and training, hiring hard-to-find operations and development staff, and the work necessary to make Hadoop integrate with existing enterprise systems.

In another survey, of 111 U.S. data scientists, sponsored by Paradigm4, twenty-two percent of those surveyed said Hadoop and Spark were not well suited to their analytics. And in the same survey, thirty-five percent of data scientists who had tried Hadoop or Spark had stopped using them.

And earlier in the year, Gartner analyst Svetlana Sicular noted that big data has fallen into Gartner’s trough of disillusionment, commenting: “My most advanced with Hadoop clients are also getting disillusioned…these organizations have fascinating ideas, but they are disappointed with a difficulty of figuring out reliable solutions.”

With all this in mind, I think it makes sense to take a step back and assess your big data progress.  If you are one of those early Hadoop adopters, it’s a good time to examine your current program, report on results, and test against any return on investment (hard dollar or soft benefits) projections you’ve made. Or maybe you have never formalized a business case for big data? Here’s your chance to work up that business case, because future capital investments will likely depend on it.

In fact, now’s the perfect opportunity for deeper thinking on your big data investments. It’s time to go beyond the big data pilot and put effort into strategies for integrating these pilots with the rest of your enterprise systems.  And it’s also time to think long and hard about how to make your analytics “consumable by the masses”, or in other words, making your analytics accessible to many more business users than those currently using your systems.

And maybe you are in the camp of charting a different course for big data investments. Perhaps business conditions aren’t quite right at the moment, or there’s an executive shift that warrants a six-month reprieve to focus on other core items. If this is your situation, it might not be a bad idea to let an ever-changing big data technology and vendor landscape shake out a bit before jumping back in.

To be clear, there’s no suggestion—whatsoever—to abandon your plans to harness big data. Now that would be dumb. But much like Alan Greenspan’s shifting opinions on gold, it’s also perfectly OK to re-assess your current position, and chart a more pragmatic and flexible course towards big data results.

Storytelling with the Sounds of Big Data

Trying to internally “sell” the merits of a big data program to your executive team? Of course, you will need your handy Solution Architect by your side, and a hard-hitting financial analysis vetted by the CFO’s office. But numbers and facts probably won’t make the sales pitch complete. You’ll need to appeal to the emotional side of the sale, and one method to make that connection is to incorporate the sounds of big data.

There’s an interesting Financial Times review of “The Sonic Boom” by Joel Beckerman. In his book, Beckerman states that “sound is really the emotional engine for any story”—meaning if you’re going to create a powerful narrative, there needs to be an element of sound included.

Beckerman cites examples where sound is intentionally amplified to portray the benefits of a product or service, or even to associate a jingle with a brand promise: the sizzling fajitas a waiter brings to your table, the boot-up sound of an Apple Mac, or the four closing notes of AT&T’s commercials.

Of course, an analytics program pitch to senior management requires your customary facts and figures. For example, when pitching the merits of an analytics program you’ll need slides on use cases, a few diagrams of the technical architecture (on-premise, cloud-based or a combination thereof), prognostications of payback dates and return on investment calculations, and a plan to manage the program from an organizational perspective, among other things.
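For the payback-date and ROI slide, the arithmetic itself is simple. A minimal sketch, using entirely hypothetical pitch numbers:

```python
# Hypothetical pitch numbers: a $2.0M up-front analytics investment,
# $150K/month in expected benefits, $40K/month in ongoing run costs.
investment = 2_000_000
monthly_benefit = 150_000
monthly_run_cost = 40_000

net_monthly = monthly_benefit - monthly_run_cost  # $110K net per month
payback_months = investment / net_monthly         # ~18.2 months to payback

three_year_net = net_monthly * 36 - investment    # net gain over 3 years
roi = three_year_net / investment                 # ~98% simple 3-year ROI

print(f"Payback: {payback_months:.1f} months")
print(f"3-year ROI: {roi:.0%}")
```

It’s the story wrapped around numbers like these, not the numbers alone, that closes the sale, which is where sound comes in.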

But let’s not discount the value of telling senior management a good story that humanizes the impact of investing more deeply in an analytics program. And that good story can be delivered more successfully when sound is incorporated into the pitch.

So what are the sounds of big data? I can think of a few that, when experienced, can add a powerful dimension to your pitch. First, take your executives on a tour of your data center, or one you’re proposing to use, so they can hear the hum of a noisy server room where air conditioning ducts pipe in near ice-cold air, CPU fans whirl in perpetuity, and cable monkeys scurry back and forth stringing fiber optic lines between machines. Yes, your executive team will be able to see the servers and feel the biting cold of the data center air conditioning, but you also want them to hear the “sounds” (i.e. listen to this data center) of big data in action.

As another avenue to showcase the sound of big data, perhaps you can replay for your executive team the audio of a customer phone call where your call center agent struggles to accurately describe where a customer’s product is in transit, or worse, tries to upsell them a product they already own. I’m sure you can think of more “big data” sounds that can accurately depict either your daily investment in big data technologies…or lack thereof.

Too often, corporate business cases with a “big ask” for significant headcount, investment dollars and more give too much credence to the left side of our brain, which values logic, mathematics and facts. In the process we end up ignoring the emotional connection where feelings and intuition interplay.

Remember to incorporate the sounds of big data into your overall analytics investment pitch, because what we’re aiming for is a “yes”, “go”, “proceed”, or “what are you waiting for?” from the CFO, CEO or other line-of-business leader. Ultimately, in terms of our analytics pitch, these are the sounds of big data that really matter.

Three Steps to Becoming a Genius Forecaster

Both Ben Bernanke and Edward John Smith got it wrong. They made terrible forecasts that wrecked, in Bernanke’s case, the economy, and in Smith’s case, his ship—the Titanic. Forecasting is hard, and even those who sometimes get it right often fail repeatedly. But fear not: there are three steps you can take to drastically improve your forecast accuracy, though you’ll have to be willing to put in the work, and possibly put your ego aside, to get there.

Simply stated, a forecast is “a prediction…of some future activity, event, or occurrence.” There are many types of forecasts, including business, economic, financial, meteorological, political and more. In fact, everyone is a forecaster to some degree, especially when we start thinking about future trends and how they might affect our families, companies, communities and…even countries.

But good forecasting is difficult, and even the so-called “experts” and pundits get it wrong more times than they’re right. With this in mind, here are three tips (surely there are more) for becoming a better forecaster.

First, understand that domain knowledge of a particular area doesn’t necessarily mean you’ll see the future better than anyone else. An article from the Financial Times chronicled Canadian psychologist Philip Tetlock’s quest to improve forecasting techniques. Over 18 years, Tetlock accumulated over 27,500 expert forecasts on politics, geopolitics and economics. His shocking conclusion? According to the FT article, Tetlock discovered that the so-called experts were terrible forecasters! These were people with spheres of influence, strong opinions, and knowledge of things they understood quite well. However, their forecasting track records—over time—were no better than chance. So if you believe yourself to be an “expert”, it’s probably better to take a more humble approach.

Second, if you want better forecasts, run your expert opinions by others. Philip Tetlock, Barbara Mellers and Don Moore run “The Good Judgment Project”, a collection of more than 20,000 volunteer participants who offer up opinions on economic and geopolitical events. Through their research, Tetlock, Mellers and Moore learned that when expert forecasters were broken into teams, their discussions and sometimes heated arguments bore better results. In keeping with the biblical adage “iron sharpens iron”, when you bounce your expert forecasts off others, the Good Judgment Project’s research shows you’ll end up with more accurate depictions of future events.
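A toy illustration of why pooling opinions helps, scored with the Brier score (the mean squared error between forecast probability and outcome; lower is better). All forecasts and outcomes below are invented for demonstration:

```python
# Toy example: three forecasters assign probabilities to five events;
# outcomes are 1 if the event happened, 0 if not. All numbers invented.
forecasts = {
    "A": [0.9, 0.2, 0.7, 0.4, 0.8],
    "B": [0.6, 0.1, 0.9, 0.6, 0.5],
    "C": [0.8, 0.4, 0.5, 0.2, 0.9],
}
outcomes = [1, 0, 1, 0, 1]

def brier(probs, actual):
    # Mean squared difference between forecast probability and outcome.
    return sum((p - a) ** 2 for p, a in zip(probs, actual)) / len(actual)

for name, probs in forecasts.items():
    print(f"forecaster {name}: {brier(probs, outcomes):.3f}")

# Average the group's probabilities event by event, then score the blend.
consensus = [sum(col) / len(forecasts) for col in zip(*forecasts.values())]
print(f"group average: {brier(consensus, outcomes):.3f}")
```

Here the simple group average (0.086) beats two of the three individuals and handily beats the mean individual score (about 0.109), a pattern the Good Judgment research suggests strengthens as teams argue and update.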

Third, bring your data—in fact, bring all of them. Sometimes, when making expert forecasts, we assume that only the data we deem relevant (maybe what’s in the corporate data warehouse) are needed for the best decision making.

However, because the world is complex, and there are often many variables that contribute to an event or outcome, it’s better to bring all your data to the task. This means data that might be locked away in non-tabular “messy” formats—such as call detail records, machine logs, or JSON data sets—can and should be processed, refined and analyzed. And don’t be afraid to look for data sets beyond what you own in your internal data stores. There are plenty of data brokers that might have the data you need to help unlock the puzzle of where to direct your corporate resources next.
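As a minimal sketch of what “refining” messy data can look like, here are semi-structured JSON machine-log events flattened into tabular rows; the events and field names are hypothetical:

```python
import json

# Hypothetical newline-delimited JSON machine-log events; the field
# names (ts, device, temp_c) are invented for illustration.
raw_events = [
    '{"ts": "2014-11-01T08:12:44Z", "device": "pump-07", "temp_c": 71.2}',
    '{"ts": "2014-11-01T08:12:49Z", "device": "pump-07", "temp_c": 88.9}',
]

rows = []
for line in raw_events:
    event = json.loads(line)
    # Refine: project each event onto a consistent, tabular set of
    # columns that downstream analytics (or a warehouse load) can use.
    rows.append((event["ts"], event["device"], event["temp_c"]))

for row in rows:
    print(row)
```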

Looking for more on the latest in forecasting? Tim Harford’s FT article is a great place to start. I’m also a fan of Cullen Roche’s macro approach to understanding markets and financial flows. And no discussion of forecasting would be complete without referencing Nassim Taleb’s sometimes caustic critiques of the forecasting profession.

So if you want to be a genius forecaster, follow these three steps. First, drop any bit of hubris that comes with the forecasting profession and be open to other opinions. After all, as John Kenneth Galbraith once said, “There are two kinds of forecasters: those who don’t know, and those who don’t know they don’t know.”

Next, once your guard is down, you’ll be able to run your ideas by other experts and maybe come up with a better idea than your original one. Don’t be afraid to argue your point. But also be wise enough to be quiet and listen.  You can learn a lot by simply closing your mouth and opening your mind.

Finally, bring all the data you need to solve a problem, not just the clean data, or those that are easily sourced. Sometimes, there’s signal in the noise. But if you want better forecasts, you’re going to have to do the really hard work to find it.
