Real-Time Pricing Algorithms – For or Against Us?

In 2012, Cyber Monday sales climbed 30% over the previous year's results. Indeed, Cyber Monday benefits both online retailers, which capture massive Christmas spend in a single day, and consumers, who can shop from work or home and skip the holiday crowds.

And yet, underneath the bustle of ringing "cyber cash registers," a battle brews: retailers can now change prices, even by the second, using sophisticated algorithms to out-sell competitors. Consumers aren't standing still, though. They also have algorithmic tools available to help them determine the best prices.

Let's say you are thinking about buying a big-screen television from a major online retailer. The price at noon is $546.40, but you decide to get some lunch and think it over. An hour later, you check back on the same item and it's now priced at $547.50. What gives? Depending on your perspective, you'll end up either the beneficiary of algorithmic pricing models or the victim.

A Financial Times article notes the price of an Apple TV device sold by three major online retailers changed anywhere from 5-10% daily (both up and down) in late November. Some HDTVs changed prices by the hour.

These up-to-the-minute changes are made possible by real-time pricing algorithms that collect data from competitor websites and from customer interactions on retailers' own sites, and then make pricing adjustments based on inventory, margins, and competitive strategy.

An algorithm is really just a recipe, if you will, codified into steps and executed at blinding speed by computers. A pricing algorithm, for instance, may take inputs from competitor websites and other data sources and, based on pre-defined logic, churn out a "price" that is then posted to a website. Typically this process executes in seconds.
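To make this concrete, here is a rough sketch of what one such repricing step might look like in Python. The function name, inputs, and thresholds are illustrative assumptions, not any retailer's actual logic:

    # Minimal sketch of a rule-based repricing step (illustrative only; the
    # function, inputs, and thresholds are assumptions, not a real retailer's logic).

    def reprice(our_cost, competitor_prices, inventory_units, floor_margin=0.08):
        """Return a new price based on competitor data, inventory, and a margin floor."""
        lowest_rival = min(competitor_prices)        # cheapest competitor offer
        floor_price = our_cost * (1 + floor_margin)  # never sell below this margin

        # Undercut the cheapest rival by a penny...
        candidate = lowest_rival - 0.01

        # ...unless stock is running low, in which case hold price instead.
        if inventory_units < 20:
            candidate = max(candidate, lowest_rival)

        return round(max(candidate, floor_price), 2)

    # Example: rivals at $547.50 and $552.00, our cost $490, 85 units in stock
    print(reprice(490.00, [547.50, 552.00], 85))   # -> 547.49

In practice a retailer would layer many more rules and data feeds on top of a loop like this and run it continuously.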

The upshot is that, depending on the specific item, day, hour, or even minute, online prices can change at a moment's notice. If keeping up with rapidly rising and falling prices seems like a shopper's nightmare, you're right. However, consumers also have tools to fight back.

The same FT article points out that some consumers are using websites such as Decide.com to determine the best, if not the most "fair," price points. For an annual fee of $30, a consumer can use Decide.com or its convenient smartphone app to access price predictions generated by Decide's predictive pricing algorithms. Simply look up an item, and Decide.com gives its best prediction of when and where to buy it.
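For illustration only, a "when to buy" signal could be as simple as extrapolating a recent price trend, as in the toy Python sketch below; Decide's actual models are proprietary and far richer than this:

    # Toy buy-or-wait signal from a crude linear price trend (illustration only).
    from statistics import mean

    def buy_or_wait(price_history, horizon_days=7):
        """Fit a simple linear trend to recent prices and recommend buy or wait."""
        n = len(price_history)
        xs = range(n)
        x_bar, y_bar = mean(xs), mean(price_history)
        slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, price_history)) / \
                sum((x - x_bar) ** 2 for x in xs)
        projected = price_history[-1] + slope * horizon_days
        return "wait" if projected < price_history[-1] else "buy now"

    # Example: the price has drifted downward over the last five days
    print(buy_or_wait([549.99, 548.50, 547.50, 546.40, 545.99]))  # -> "wait"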

Today, we take for granted that grocery store prices generally don’t change within the hour, and that prices at the gas pump (while sometimes changing intra-day) generally don’t change by the minute. As data collection processes move from overnight batch to near real time, expect more aggressive algorithmic pricing, coming to a grocer, gas pump—or theater near you!

Of Baby Black Swans and the Race to Zero

Defined as extreme events with high impact, Black Swans are infrequent occurrences that pack a punch (e.g., in financial markets, the 2008 crisis or the 2010 Flash Crash). However, a new study shows that as machine trading and speed intertwine, these extreme events are occurring more often than previously imagined. As markets become more connected and participants more tightly linked, each extreme bounce or collision may slowly break the system.

Nassim Nicholas Taleb is the person most responsible for burning the concept of low probability, high impact events into the minds of global business executives.  Coining the term “Black Swans” as the name for extreme outliers with devastating consequences, Taleb has put executives on notice that they need more built-in redundancy and should incorporate slack in business processes to cushion against failure.

However, as technology proliferates and processes speed up, humans are increasingly removed from decision making. Ensuring a little slack in the system may no longer be enough to protect against a meltdown.

Take, for example, a complex "system" such as global financial markets. In an effort to gain competitive advantage, computer scientists, quants, and software programmers are building machines that scan data streams, analyze them, and decide trading strategies in microseconds. These individuals (sometimes hedge fund managers) and corporations (such as large investment banks) are shrinking the window for decision making to levels where humans cannot react fast enough: microseconds today, nanoseconds in the future.

Trading equities is now a technological "arms race," where companies compete to buy and sell at near light speed. And while using speed for competitive advantage doesn't sound like such a bad idea, there are also ramifications to a race to zero.

The first issue with this trading arms race is the exclusion of participants who cannot afford the requisite technology. Just as it takes nearly a billion dollars to win a US election, ensuring few can join the fray, it takes many millions to build and co-locate ultra-fast computerized trading platforms. A second issue is that as trading approaches the speed of light, there is less and less slack in the system to correct trading errors. And because financial markets are tightly coupled, a single error in a fragile system can cascade with cataclysmic results.

Trading at near light speed, in an already fragile and tightly coupled system, is driving more extreme events, which appear to be fracturing global markets. And contrary to conventional wisdom, these events aren't happening just once every two or three years.

A team of physicists, system engineers, and software programmers recently published a paper suggesting that abrupt "events" occur in financial markets far more often than previously thought. In fact, over the years 2006-11, the authors report a total of 18,520 spikes in stock movements: extreme events (I'll call them baby black swans) that should, according to normal-distribution statistical models, have a low probability of occurring.

The aforementioned study notes: "There is far greater tendency for these financial fractures to occur, within a given duration time window, as we move to smaller timescales." In other words, as faster computers slice decision-making windows down to nanoseconds, we should expect more volatility in financial markets. Moreover, if a system is not designed to handle extreme volatility, there is a high probability of fissures and the potential for total system breakdown.
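As a rough illustration of the idea (and not the cited study's methodology), one could count how often prices jump by several standard deviations within a short window. The window length, threshold, and synthetic data in this Python sketch are assumptions:

    # Toy count of "baby black swans": windowed price moves exceeding a multiple
    # of typical one-step volatility (not the cited study's actual method).
    from statistics import pstdev

    def count_spikes(prices, window=10, threshold=4.0):
        """Count windows whose net move exceeds `threshold` standard deviations."""
        returns = [b - a for a, b in zip(prices, prices[1:])]
        sigma = pstdev(returns) or 1e-9          # typical one-step move
        spikes = 0
        for i in range(len(prices) - window):
            if abs(prices[i + window] - prices[i]) > threshold * sigma:
                spikes += 1
        return spikes

    # Synthetic series containing one sudden drop
    series = [100.0] * 50 + [95.0] * 50
    print(count_spikes(series))   # flags the windows spanning the sudden drop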

In 2010’s Flash Crash, the US stock market plunged 1000 points in nine minutes and then regained those losses just as fast.  Never before had market participants seen thousand point swings within a ten minute timeframe. If the authors in the study cited in this article are correct, this kind of extreme volatility is only the beginning.

Questions:

  • Is this “race to zero” latency risky, or is this much ado about nothing?
  • Speed is a competitive advantage. Do you see a similar “race to zero” in decision making processes in other industries?

Optimizing Supply and Demand with Narrow Artificial Intelligence (AI)

Many a marketer has lamented, "I wish I knew exactly what my competitors are selling and where they're successful." The good news is that in many instances, the information to answer the query exists. The bad news: the data sources needed are often strewn across the internet, in trade association figures, and sometimes in third-party databases. Not to mention, the data could be siloed within your own company in various departmental data marts.

A challenge like this calls for some really smart thinking and compute power. It’s also begging for an algorithm.

But what is an algorithm? Simply stated, an algorithm is a set of rules specifying how to solve a problem. And while some algorithms incorporate randomness, they are usually instructions intended to be followed in steps. For example, a cake recipe is an algorithm of sorts: performing steps out of order, or skipping them altogether, won't result in very tasty fare.

And while some algorithms are quite simple in design, the real "beauty" of an algorithm emerges where complexity and scale predominate. An article from the Financial Times, "Supply and Demand in the Sky," illustrates this point.

The Financial Times article mentions that in Europe, each airline knows how many passengers are flying its planes at any one time. This is, of course, because each airline has data from its own source systems such as reservations, resource scheduling, and departure control. However, airlines have no information on how many passengers are flying competitors' planes. They can guess the answer based on routes and the maximum capacity (seats) of competitor airplanes, but they really don't know.

Why all the fuss about knowing what the competition is up to? Rest assured, if you are an airline marketing executive trying to decide where to add another route, it's definitely helpful to know whether your competitor is flying full or nearly full airplanes.

But with multiple millions of investment dollars riding on your decision, an uneducated “guess” isn’t going to cut it. In this case, an algorithm designed by Phillipp Goedeking of Airconomy rides to the rescue!

Phillipp Goedeking has a PhD in biology, but his real love is taking a mathematical approach to complexity. To help airlines determine how many people fly from one city to another, Dr. Goedeking looked at flight frequencies and type of aircraft used and easily came up with the total number of seats available at any one time. So this number is the available “supply”, but what about “demand”?

To determine demand, Goedeking's algorithm uses the computational power of a bank of 150 computers to produce various demand estimates, essentially educated guesses. It then matches these "guesstimates" against thousands of sets of quantifiable transport data and refines them by "comparing data through a very sophisticated trial and error process."
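A highly simplified Python sketch of the general approach appears below. The routes, seat counts, observed totals, and adjustment rule are invented for illustration and bear no relation to Airconomy's proprietary model:

    # Supply: weekly flight frequency x seats per aircraft, per (invented) route
    routes = {
        ("FRA", "LHR"): {"weekly_flights": 35, "seats_per_flight": 180},
        ("FRA", "CDG"): {"weekly_flights": 28, "seats_per_flight": 150},
    }
    supply = {r: v["weekly_flights"] * v["seats_per_flight"] for r, v in routes.items()}

    # Demand: start with a crude guess (a fraction of supply), then nudge each
    # estimate toward an assumed observed aggregate via simple trial and error.
    observed_total = 9500                                  # assumed reported figure
    estimates = {r: 0.7 * s for r, s in supply.items()}    # initial guesses

    for _ in range(100):                                   # crude iterative refinement
        error = observed_total - sum(estimates.values())
        if abs(error) < 1:
            break
        for r in estimates:                                # spread correction by share of supply
            estimates[r] += error * (supply[r] / sum(supply.values()))

    print(supply)      # available seats per route
    print(estimates)   # refined demand estimates summing to ~observed_total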

Now, since airlines don't share passenger data, it's hard to know how close the algorithm's "guesstimates" are to the truth. However, airline interest in Goedeking's algorithm suggests that he's very close to the right answer.

Some key takeaways (there may be more) from the Financial Times article include:

  • Some business challenges are too complex, or too time consuming, for human intelligence alone
  • Well-designed algorithms (backed by compute power) can help sort through complex data sets and choose the best options from millions of possibilities
  • Some algorithms are designed to “learn”, improve and evolve over time
  • Some decisions are too costly to be left to educated guesses or gut decisions.  If you don’t know the answer, chances are someone like Phillipp Goedeking does

No algorithm, by itself, is sufficient to claim competitive advantage. Smart people and smarter processes also must be added to the mix. That said, data management capabilities and “intelligent” algorithms are a ticket towards helping some companies write their own future.

Long Tail vs. The Blockbuster

Ever since Chris Anderson penned "The Long Tail," it has been commonly accepted that products with low sales volume can, en masse, add up to bigger markets than sales of blockbuster products. However, a recent Economist article takes a different angle, suggesting that marketing executives should in fact make the big bet and go after the blockbuster "hit" product.

Back in 2006, when "The Long Tail" first hit bookshelves, Wired Magazine editor Chris Anderson advised business executives to "forget squeezing millions from a few megahits" and instead focus on niche markets, where sales volume from obscure products can potentially tally big profits.

Mr. Anderson made a compelling case that just because a product or service isn't a "hit" doesn't mean it won't make money. Citing concrete examples from Amazon, Netflix, and the music service Rhapsody, Anderson argued that when it comes to consumer choice (especially online), more is better and "fringy fare" can be extremely profitable. In fact, one key conclusion from "The Long Tail" is to "embrace niches."

Is the mass market dead? Is niche marketing the wave of the future? The author of an Economist article titled, “A World of Hits”, suggests otherwise.

A key premise of the Economist article is that in a fragmented world with abundant choice, blockbusters matter more than ever. Citing television examples such as "American Idol," "Survivor," and others, the author says that "top programmes are holding up well," often at the expense of lesser entertainment options.

The article also notes that "hits" are important in the music industry. For instance, even as overall album sales have declined 18% in Britain since 2004, albums occupying the number one spot have actually increased in sales. In addition, managers at the Spotify music service disclose that the most popular tracks on Spotify now account for 80% of streams. Moreover, in a rebuke to niche marketing, over one six-month period 1.5 million tracks on the service weren't touched at all!

The author of the Economist article concludes, "Just because people have more choice, does not mean they will opt for more obscure entertainments." Indeed, a few examples show the opposite is occurring:

  • The top three US newspapers have all held onto subscribers much better than local/metro papers
  • The vampire movie "New Moon" earned more in a day than any other film in history
  • The Lost Symbol by Dan Brown sold 1 million copies on its first day
  • In the past ten years, sales of the top ten best-selling books in Britain increased from 3.4 million to 6 million copies

Adding more fuel to the fire, research firm SNL Kagan calculates that between 2004 and 2008, movies that cost more than $100 million to produce "consistently returned greater profits to big studios, than cheaper films."

These are substantial and concrete examples that the “blockbuster” isn’t dead. However, let’s be fair to Chris Anderson, especially because he doesn’t advocate giving up on blockbuster products or services altogether. “Hits still matter,” he says, as a way to lure customers to an online or offline location. From there, recommendation algorithms (online) or trained personnel (offline) can steer customers towards other products or services that may be just as enticing as the blockbuster.

Long tail vs. the blockbuster? Perhaps there’s room for both.

In the Economist article, Jeff Bewkes, head of Time Warner, says as much: "Both the hits and the long tail are doing well." Going forward, a steady stream of profitable niche products and services alongside a stable of blockbusters may be the ultimate "win-win" approach for enterprises.

Questions:

  • With a potential audience of one billion, is a one-shot commercial during the Super Bowl worth $3.1 million (USD)?
  • Movie makers would love to always churn out “hits”, but they’re usually few in number. How can studios determine in advance which movies will be popular?
  • Long tail vs. blockbuster. Do you favor one approach over the other?

Related article: HBR, "Long Tail Economics, Give me Blockbusters"

Methods to Systematically Reduce Customer Choice

New research on online dating websites shows that when it comes to presenting customers with choices, "less is more." And while marketers inherently know that too much choice leads to cognitive meltdown, we're sometimes confounded by the best way to remove options presented to customers. Is there an ideal way to cull customer choice?

Sometimes marketers believe that customers want more choice. According to an MIT Technology Review article, however, new research in the online dating market shows that "users presented with too many choices experience cognitive overload and make poorer decisions as a result."

The Technology Review article cites research from two professors at National Sun Yat-Sen University in Taiwan, who in an experiment presented online dating users with a wide and deep selection of potential matches. After all, customers want more choice, right?

Not so, according to the study: "More search options (led) to less selective processing by reducing user's cognitive resources, distracting them with irrelevant information, and reducing their ability to screen out inferior options." In effect, users suffered from data overload, where too many choices prevented them from making an optimal decision.

Here’s where statistical analysis can help reduce choice overload.

Online dating sites often use sophisticated computer applications and proprietary algorithms to divine appropriate partner matches based on user inputs such as preferences for race, religion, eye or hair color, and more. EHarmony's matchmaking algorithm, for example, helps select potential partners based on a 258-question personality test.

With a deep historical data set of what EHarmony deems "success" (236 marriages a day, according to the site), the company believes it can predict matches with a high degree of probability.

For sites like EHarmony, Match.com, and others, the challenge isn't showing all relevant results (like a Google search that delivers 2,000 hits) but showing just the top ten, or at worst twenty. This, of course, assumes that the algorithm is based on the right "predictors" of successful matchmaking and that the science behind the scenes can be trusted.
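As an illustration of the "top ten, not two thousand" idea, here is a toy scoring-and-ranking routine in Python; the attributes, weights, and scoring rule are made up and are not EHarmony's or Match.com's actual method:

    # Toy candidate scoring: weighted count of satisfied preferences, return top k.
    import heapq

    def score(user_prefs, candidate, weights):
        """Weighted count of preference attributes the candidate satisfies."""
        return sum(weights[attr] for attr, wanted in user_prefs.items()
                   if candidate.get(attr) == wanted)

    def top_matches(user_prefs, candidates, weights, k=10):
        """Return only the k best-scoring candidates instead of the full list."""
        return heapq.nlargest(k, candidates,
                              key=lambda c: score(user_prefs, c, weights))

    # Example with invented attributes and weights
    weights = {"religion": 3, "wants_kids": 2, "smoker": 1}
    prefs = {"religion": "any", "wants_kids": True, "smoker": False}
    candidates = [
        {"name": "A", "religion": "any", "wants_kids": True,  "smoker": False},
        {"name": "B", "religion": "any", "wants_kids": False, "smoker": False},
        {"name": "C", "religion": "none", "wants_kids": True, "smoker": True},
    ]
    print([c["name"] for c in top_matches(prefs, candidates, weights, k=2)])  # ['A', 'B']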

The lesson for online dating companies, indeed for any business, is that because of carrying costs and customer confusion, less choice can actually increase sales and profitability. In fact, some large U.S. retailers are starting to test this idea by reducing the variety of products they carry by up to 15%.

The experiment run by the Taiwanese professors shows that when it comes to choice, less is more. If this hypothesis holds, how does a marketer decide which choices to reduce?

Reducing customer "options" must be based on carefully considered variables and rigorous analysis. For example, a category manager at a retailer, say in the toothpaste aisle, shouldn't automatically assume that the lowest-selling products should be removed.

Variables such as year-over-year sales comparisons, seasonality, pricing, profitability, and trade promotion dollars should all be considered. In addition, market basket analysis may show the retailer that a slow-selling brand of toothpaste is in fact often purchased alongside other very profitable items. Indeed, assortment optimization and shelf space allocation can be a very scientific exercise. A company should also set up control groups to test a hypothesis before making any permanent changes.
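As a small illustration of the market basket idea, the Python sketch below computes "lift," a measure of how much more often two items are bought together than chance would predict; the transactions and product names are invented:

    # Toy market basket analysis on invented transactions.
    from itertools import combinations
    from collections import Counter

    transactions = [
        {"budget_toothpaste", "electric_toothbrush", "floss"},
        {"budget_toothpaste", "electric_toothbrush"},
        {"premium_toothpaste", "mouthwash"},
        {"budget_toothpaste", "electric_toothbrush", "mouthwash"},
        {"floss", "mouthwash"},
    ]

    n = len(transactions)
    item_counts = Counter(item for t in transactions for item in t)
    pair_counts = Counter(frozenset(p) for t in transactions
                          for p in combinations(sorted(t), 2))

    def lift(a, b):
        """support(a,b) / (support(a) * support(b)); above 1 means the items
        co-occur more often than if they were bought independently."""
        support_ab = pair_counts[frozenset((a, b))] / n
        return support_ab / ((item_counts[a] / n) * (item_counts[b] / n))

    # The slow seller is frequently bought with a high-margin electric toothbrush
    print(round(lift("budget_toothpaste", "electric_toothbrush"), 2))  # -> 1.67

A slow-selling item with high lift against profitable partners may be worth keeping on the shelf even if its own sales look weak.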

Algorithms and careful analysis are two ways that companies are reducing and optimizing customer choice. In the end, customers may not ultimately want more choice, just more relevant options.

Questions:

  • In online dating, some users may want more results (choices) presented because they feel that they can judge a “match” more effectively than a computer. Are there instances where delivering more choice options is the better strategy?
  • Are there dangers of optimizing customer choice—especially when it comes to encouraging new innovative products/services with no sales history?
  • Predictors of a “good match” in online dating can be highly subjective. How would you counsel online dating companies to improve their matchmaking capabilities?

Related article: Less is More in Consumer Choice

Smart Data Collective Podcast: Zero Latency Future

We live in a world where everything is, or soon will be, connected to the Internet. Sensors, GPS phones, and near real-time analytics are helping businesses make decisions remarkably faster. This is the zero latency future, and I expound on the concept in a 17-minute podcast.

I hope you enjoy it, and feel free to post a comment to keep the conversation going!

Zero Latency: The Next Arms Race

In the near future, your company may be competing with a computer.

In fact, companies with the fastest computers, most sophisticated algorithms, deepest technical know-how, and most complete data sets will begin to separate themselves from competitors. In a world where milliseconds will make or break your company, how should you prepare for this new arms race?

Zero latency is all about reducing the time between when an "event" occurs and your company's subsequent action. GPS phones, sensors, and real-time analytics are just some of the technologies allowing businesses to sense and respond to changing market conditions in ever shorter intervals of time.

Let’s look at the world of high frequency trading (HFT) for a preview of a zero latency future.

At the risk of grossly oversimplifying a very complex topic, high frequency trading is a strategy in which financial companies purchase ultra-fast computers that execute trades autonomously. By subscribing to data feeds from stock exchanges and other sources, these companies use algorithms to analyze data (voice, video, HTML, stock quotes, etc.) as it streams past and then execute an instruction (bid/offer) for a security. The capture and analysis of the data is completed in milliseconds.
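The event-driven pattern can be sketched very roughly in Python, as below. The quote stream, symbols, and trading rule are invented, and real HFT systems run on specialized, co-located hardware at speeds no script could approach:

    # Deliberately simplified event-driven trading rule (illustrative only).
    def on_quote(symbol, bid, ask, state, spread_threshold=0.05):
        """Called for every incoming quote; returns an order instruction or None."""
        spread = ask - bid
        mid = (bid + ask) / 2
        prev_mid = state.get(symbol)
        state[symbol] = mid

        # Toy rule: if the spread is wide and the midpoint just dropped, post a
        # bid one cent above the current best bid.
        if prev_mid is not None and spread > spread_threshold and mid < prev_mid:
            return {"action": "BID", "symbol": symbol, "price": round(bid + 0.01, 2)}
        return None

    # Replay a tiny synthetic quote stream
    state = {}
    stream = [("XYZ", 10.00, 10.02), ("XYZ", 9.90, 10.00), ("XYZ", 9.80, 9.95)]
    for sym, bid, ask in stream:
        order = on_quote(sym, bid, ask, state)
        if order:
            print(order)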

In fact, for HFT speed is of the essence. To make a trade faster than competitors, some companies have seen fit to place their servers directly on the floor of the stock exchange—effectively giving them a direct pipe into the trading platform.

A Traders Magazine article notes that many HFT firms use, “chips designed for video games to more quickly process the market data that enters their models.” The same article also mentions that, “some (HFT) firms are investing $2m every other month on new servers.”

HFT companies are actively scanning multiple data feeds for anomalies, detecting events in real time, and then executing based on predefined business rules. In this new arms race, to make money, HFT companies have discovered that zero latency wins the day. In other words, high frequency traders know they must be milli-seconds ahead of their competition in transforming data streams into actionable insight.

And while HFT is all the rage in financial circles, it's not far-fetched to see how, in other industries, the ability to respond faster to customer needs or changing events confers competitive advantage. Some examples:

  • Netflix's Cinematch algorithm serves up movie recommendations in real time, based on subscribers' past rental history and movie ratings. The right recommendation keeps customers satisfied and inventory turning (see the sketch after this list).
  • Airlines often reroute flights based on weather events in real-time, making sure connections are not missed for their most valuable customers.
  • Overstock.com sends out 25 million event-driven emails each week (each with a dozen personalized recommendations) to over 300 customer segments. Campaigns go out daily, whereas some marketers take weeks to build a campaign.
  • And of course, Google serves up real-time recommendations (advertisements) in milliseconds, based on your browsing history (via cookies) and search input.
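Here, for example, is a toy sketch of the recommendation step mentioned in the first bullet, in the spirit of (but in no way equivalent to) a system like Cinematch; the titles, ratings, and similarity measure are invented:

    # Toy user-similarity recommender over invented ratings.
    def cosine(u, v):
        """Cosine-style similarity: dot product over shared titles, normalized
        by each user's full rating vector."""
        shared = set(u) & set(v)
        if not shared:
            return 0.0
        dot = sum(u[t] * v[t] for t in shared)
        norm_u = sum(x * x for x in u.values()) ** 0.5
        norm_v = sum(x * x for x in v.values()) ** 0.5
        return dot / (norm_u * norm_v)

    def recommend(target, others, k=3):
        """Recommend unseen titles liked by the most similar other users."""
        scores = {}
        for other in others:
            sim = cosine(target, other)
            for title, rating in other.items():
                if title not in target:
                    scores[title] = scores.get(title, 0.0) + sim * rating
        return sorted(scores, key=scores.get, reverse=True)[:k]

    me = {"Movie A": 5, "Movie B": 4}
    community = [{"Movie A": 5, "Movie C": 5}, {"Movie B": 4, "Movie D": 3}]
    print(recommend(me, community))   # -> ['Movie C', 'Movie D']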

Zero latency means much more than making a fast decision. After all, making a poor decision—faster—isn’t going to help a company win market share.

While companies rapidly upgrade their analytical infrastructure and clean their data, they must also have the right talent in place to constantly tweak and keep their algorithms and models current. That said, in some cases these algorithms will actually “learn” from their successes/failures and improve, without human intervention.

Thoughtful analysis of changing market conditions is a mainstay of successful companies. However, in the near future, some analysis (based on quick correlation) will no longer take days and hours—it will be done in milli-seconds.

In a fast paced global economy—in most instances—latency in decision making will not be your friend. The best execution will be based on collecting and analyzing data and then acting faster than the competition. Is your company ready for a zero latency future?

  • Advances in hardware technology and advanced analytical applications make zero latency possible. Will small to medium size companies with lower IT budgets be able to compete? Is this fair?
  • What impact will a zero latency future have on the skills marketers need to effectively compete?
  • With the advent of smartphones and location-based services, accurate behavioral targeting will be a key beneficiary of "zero latency." Is a Minority Report future that far away?