Can Big Data Analytics Solve “Too Big to Fail” Banking Complexity?

Despite investing millions of dollars in information technology systems, analytical modeling, and PhD talent sourced from the best universities, global banks still have difficulty understanding their own business operations and investment risks, much less complex financial markets. Can "Big Data" technologies such as MapReduce/Hadoop, or even more mature technologies like BI and data warehousing, help banks make better sense of their own complex internal systems and processes, let alone the tangled and interdependent global financial markets?


In 2000, the British physicist and cosmologist Stephen Hawking said, "I think the next century will be the century of complexity." He wasn't kidding.

While Hawking was surely speaking of science and technology, there is little doubt he would also regard global financial markets and their players (hedge funds, banks, institutional and individual investors, and more) as a very complex system.

With hundreds of millions of hidden connections and interdependencies, hundreds of thousands of hard-to-understand financial products, and millions if not billions of "actors" each with their own agenda, global financial markets are the perfect example of extreme complexity. In fact, the global financial system is so complex that attempts to analytically model and predict markets may have worked for a time, but ultimately failed to help firms manage their investment risks.

Some argue that complexity in markets might be deciphered through better reporting and transparency.  If every financial firm were required to provide deeper transparency into their positions, transactions, and contracts, then might it be possible for regulators to more thoroughly police markets?

Financial Times writer Gillian Tett has been reading the published work of Professor Henry Hu at the University of Texas. In her article, "How 'too big to fail' banks have become 'too complex to exist'" (registration required), Tett says that Professor Hu argues technological advances and financial innovation (i.e., derivatives) have made financial instruments and flows too difficult to map. Moreover, Hu believes financial intermediaries themselves are so complex that they will continually have difficulty making sense of shifting markets.

Is a "too big to fail" situation exacerbated by a "too complex to exist" problem? And can technological advances, such as wider adoption of MapReduce or Hadoop platforms, be considered a potential savior? Hu seems to believe that supercomputers and more raw economic data might be one way to better understand complex financial markets.

However, even if massive data sets can be better searched, counted, aggregated and reported with MapReduce/Hadoop platforms, superior cognitive skills are necessary to make sense of outputs and then make recommendations and/or take actions based on findings. This kind of talent is in short supply.
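To make the "searched, counted, aggregated" point concrete, here is a minimal map/reduce-style sketch in plain Python. The transaction records, counterparty names, and notional amounts are all hypothetical, and real Hadoop jobs would distribute these two steps across many machines; this only illustrates the shape of the computation:

```python
from collections import defaultdict

# Hypothetical raw transaction records: (counterparty, notional in USD).
transactions = [
    ("Bank A", 5_000_000),
    ("Bank B", 12_000_000),
    ("Bank A", 3_000_000),
    ("Hedge Fund C", 7_500_000),
]

# "Map" step: emit (key, value) pairs from each raw record.
mapped = [(counterparty, notional) for counterparty, notional in transactions]

# "Shuffle/reduce" step: group pairs by key and sum the values,
# yielding total exposure per counterparty.
exposure = defaultdict(int)
for counterparty, notional in mapped:
    exposure[counterparty] += notional

print(dict(exposure))
```

Note what the output is: a table of totals per counterparty. At cluster scale the same pattern can summarize billions of records, but deciding what those totals mean for systemic risk still falls to human analysts.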

It is even likely that the scope of complexity in financial markets is beyond today's technology to compute, sort, and analyze. And if that supposition is true, should the next step be to take measures that moderate, if not minimize, additional complexity?

Questions:

  • Are “Big Data” analytics the savior to mapping complex and global financial flows?
  • Is the global financial system—with its billions of relationships and interdependencies—past the point of understanding and prediction with mathematics and today’s compute power?
