By the Blouin News Technology staff

Nasdaq and the cost of mishandling ‘technical glitches’

Filed under Enterprise Tech, Media Tech.

Screens display the start of trading in Facebook shares at the NASDAQ stock exchange on Times Square in New York, on May 18, 2012.

AFP/Getty Images/Emmanuel Dunand

Releasing its investigation into the mishandling of Facebook's IPO (initial public offering), the Securities and Exchange Commission has not only issued its largest-ever fine against an exchange, citing "poor systems and decision making," but also revealed thought-provoking details about today's computer-driven trading systems. The issue extends beyond trading into computer-driven systems of all sorts – how we use them, how we depend on them, and what happens when we lose control of them.

On the first day of trading, back in May 2012, Nasdaq OMX executives made multiple mistakes that led not only to the fine but also to $62 million paid to brokers who lost money when their Facebook orders were mishandled. A large sum, to be sure, but not even close to what UBS says it lost in the debacle: the bank claims a $356 million loss.

In an open letter, Nasdaq Chief Executive Robert Greifeld wrote: “While we prepared extensively for the Facebook initial public offering, including thorough tests of our systems with member firms, the challenges we encountered that day were unprecedented.”

So what exactly happened? The word "extensively" seems to fall short of the SEC's findings.

The first mistake came the day before trading began, when Nasdaq failed to prepare its computer systems for the sheer quantity of orders that would arrive on the first day of trading. The systems were tested with only 40,000 orders; when the highly anticipated IPO opened the following day, they were engulfed by 496,000 orders and thrown into a continuous loop. When company executives were made aware of the situation, they joined a "Code Blue" conference call and decided to continue trading despite the pricing difficulties, hoping that a few temporary fixes to the code and a switch to an untested backup system would do the trick. That would be the next big mistake.
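
The article does not spell out how a flood of orders could produce a continuous loop, but one plausible mechanism is an opening-cross calculation that restarts whenever the order book changes mid-pass: under heavy enough order flow, no pass ever finishes before it is invalidated. The short Python sketch below illustrates that failure mode; the function names, the order format, and the midpoint pricing rule are all illustrative assumptions, not details of Nasdaq's actual system.

```python
import collections

def calculate_cross_price(snapshot):
    """Stand-in pricing rule: midpoint of the resting limit prices.
    (Illustrative only -- real cross logic is far more involved.)"""
    prices = [order["limit"] for order in snapshot]
    return (min(prices) + max(prices)) / 2 if prices else 0.0

def run_opening_cross(order_book, incoming):
    """Try to finalize an opening price, restarting whenever new orders
    arrive mid-calculation. If orders arrive faster than a pass can
    complete, this loop never terminates -- a 'continuous loop'."""
    while True:
        # Absorb everything that has arrived so far.
        while incoming:
            order_book.append(incoming.popleft())
        snapshot = list(order_book)            # freeze a view of the book
        price = calculate_cross_price(snapshot)
        # Anything that arrived during the calculation invalidates the
        # snapshot and forces another full pass.
        if not incoming:
            return price

# A light, single-threaded run converges immediately; in production the
# arrivals are concurrent, so sustained heavy flow can keep invalidating
# every pass before it finishes.
book = []
pending = collections.deque({"limit": 38.0 + (i % 5)} for i in range(40_000))
print(f"final cross price: {run_opening_cross(book, pending):.2f}")
```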

When Facebook's stock hit $42, brokers, unable to see how many shares they had actually purchased, went to Nasdaq with their complaints. At about 12 p.m. that day, the CEO of one brokerage wrote to Greifeld in an email: "we are all trading blind." By 2 p.m., executives could see that tens of thousands of orders had failed to execute. This led to another huge misstep: selling those shares into the market, which helped drive a sharp drop in Facebook's share price.

Another company hurt by the day's technical mishaps was mobile game-maker Zynga, which also saw major price swings in its shares during Facebook's IPO. Nor was this the first time technical difficulties led to trading violations at Nasdaq: programming errors were also at the root of troubles in October 2011 and August 2012, when some customers' orders were executed below the publicly listed price.

Technology has streamlined many processes, trading included, making them faster and more efficient than ever before. Yet as society grows increasingly comfortable with technology and artificial intelligence, it is easy to over-trust their capabilities. Computer-driven systems may err less often than humans most of the time, but when humans fail to test for potential "technical glitches" – and to recognize their magnitude as they unfold – the consequences can be large and costly.