Unintended Consequences of Combining Speed with Technology

Technology is often hailed as an innovation vehicle, a productivity booster, and an enabler of a higher standard of living for citizens worldwide. The field of finance, however, provides an interesting backdrop for what happens when an industry is pushed to its technological limits in the pursuit of automation and speed.

Since the advent of the telegraph, and all the way until the early 1970s, stock prices were displayed on ticker tape printed in near real time. The ticker tape (enabled by telegraph technology) was a drastic improvement in the delivery of information, since brokers could obtain stock prices with only a 15–20 minute delay from the original quotation.

Set the dial to the year 2011, and we see supercomputers trading stocks not with humans, but with other supercomputers. Forget delays of minutes or seconds: today’s supercomputers trade in microseconds and are increasingly “co-located” near stock exchange servers to reduce the roundtrip time for signals passing through networks. In fact, on most trading floors, human brokers are obsolete, as algorithms are now programmed with decision logic to make financial instrument trades at near light speed.
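The latency case for co-location can be seen with simple back-of-the-envelope arithmetic: light in optical fiber travels at roughly two-thirds of its vacuum speed, so every kilometer of cable adds about 5 microseconds of delay each way. A minimal sketch (the distances and the function name are illustrative assumptions, not figures from this article):

```python
# Back-of-the-envelope one-way propagation delay over fiber.
SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_FACTOR = 2 / 3            # light in fiber travels ~2/3 as fast

def one_way_latency_us(distance_km: float) -> float:
    """Minimum one-way propagation delay over fiber, in microseconds."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1e6

# A server co-located ~1 km from the exchange vs. one ~1,000 km away:
print(round(one_way_latency_us(1), 1))    # ~5 microseconds
print(round(one_way_latency_us(1000)))    # ~5,003 microseconds (~5 ms)
```

At microsecond trading horizons, that three-order-of-magnitude gap is decisive, which is why firms pay to place their machines in the same building as the exchange.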

We’ve come a long way since the decades of ticker tape, says Andrew Lo, a professor at the Massachusetts Institute of Technology (MIT). At a recent conference, Professor Lo noted that while technology has opened markets to the masses (e.g., day-trading platforms) and reduced price spreads, there are also downsides to automation and speed.

First, he says, there is the removal of the human element from decision making. As supercomputers trade with each other at near light speed, the windows of latency (between event and action) grow smaller and smaller, leaving fewer opportunities for human intervention to correct the activities of rogue algorithms or accidental “fat finger” trades.

Second, with fiber optic networks spanning ocean floors and supercomputers connecting global investors and markets, we’ve essentially taken a fragile system based on leverage and made it more complex. Automating and adding speed to an already fragile system generally isn’t a recipe for success (e.g., the May 6, 2010 Flash Crash).

Based on these trends, it’s easy to imagine a world where financial networks intensify in complexity, capital zips across the globe even faster, and relationships between market participants grow ever more interconnected. Where loose correlations once existed between participants and events, markets will soon move in lockstep in a tightly coupled system.

To be sure, the confluence of technology and finance has been a boon to society in many respects. However, as Lo says, there are “unintended consequences” in applying the most advanced and fastest technologies to an already fragile system. Where a buffer of time once existed to fix mistakes before, or even as, they occurred, we’re now left to clean up the mess after disaster strikes.

In addition, as markets become more tightly coupled and complex, the butterfly effect grows more pronounced: the smallest, strangest event in a faraway locale can potentially trigger a global market meltdown.


5 comments

  1. You look at the financial market from a regulator’s viewpoint. While a single human, at one extreme, is seen as unpredictable, the behavior of a crowd is easier to predict, identify, and control. Looking at computer programs written by mathematicians, the algorithms become very complex, and even their developers are, to some extent, unable to predict how a system will behave in specific circumstances. In the case of a crowd of these systems, since they may not share the same “cultural and behavioral baseline” that people do, identifying harmful trends may prove genuinely hard.
    Let’s look deeper – what is the real purpose of trading stock? Is it creating real value, or is it just a redistribution of virtual numbers among the players? That’s not about technology and speed. That’s about misusing the underlying idea of trading real (or future) goods on a market.

    • Eugeny, thank you for adding to the discussion!

      I think there are plenty of analysts who believe that adding extreme speed to today’s rickety financial infrastructure is cause for concern. Regulators don’t seem to agree, as more supercomputer exchanges are springing up in the US and Europe, and soon in India and possibly China.

      For most traders today, the real purpose of trading a stock, I believe, is to capture a profit spread and get out as quickly as possible. Today’s HFT firms are more interested in making their pennies and closing out their positions by day’s end than in owning a stock for value.

      • “… plenty of analysts who believe that adding extreme speed … is cause for concern” – because most of them lose their power: it becomes harder to predict what will happen to the market or to a specific stock. Software is replacing them. Software predicts, then executes. Those who manage the software will rule.
        “Regulators don’t seem to agree” because the regulator (the government) collects taxes on transactions. More transactions in a fraction of the time means more taxes for the government. A milch cow.

  2. Eugeny, you said analysts are concerned about HFT because they may lose their power. The analysts I’m referring to are large institutional traders (of block shares), and yes, they are concerned about front-running and the visibility into order flow that HFT platforms enjoy but that isn’t accessible to other traders. Software and compute power are replacing traders – but is completely removing the human element a good thing?
