Why we still can’t prevent flash crashes

February 22, 2011, 9:47 PM UTC

Although regulators have put small changes in place in the U.S. markets since last May’s flash crash, our hyper-wired interconnected markets still aren’t equipped to prevent another one.

By David Leinweber, contributor

There are more than a few weird moments in the history of markets and computers. Two UCLA students used 125 message-board posts to send a dead penny stock up more than 106,600% in a single morning more than a decade ago. In 2008, a Google "oops" moment that accidentally revived and re-disseminated six-year-old bad news sent United Airlines (UAL) down 77% in a matter of minutes.

But the grand prize in Computerized Market Weirdness has to go to the May 6, 2010 flash crash, when the Dow tanked 998.5 points (9.5%) in minutes, then recovered just as rapidly minutes later. Individual stocks paint an even weirder picture: Accenture (ACN) dropped from $40 to one cent in three seconds.

And now, 10 months later, we still don't have a system to prevent other potentially crippling crashes. Getting this right will take time, but forward motion has lagged. An NSF workshop in July, headed by Nobel economist Robert Engle, made succinct, clear recommendations addressing the central economic and computational issues, but the process seems stalled. Late last week, a committee of regulators from the Securities and Exchange Commission and the Commodity Futures Trading Commission issued a report with 14 recommendations to limit huge price swings, including changes to market access and routing systems and the creation of a consolidated audit trail to detect trading abuses. Some viewed the report as insufficient to instigate real systemic change; many considered it a step in the right direction, and it is commendably open about the questions and many uncertainties surrounding its suggestions.

Not enough is said about the ability of the federal agencies involved to analyze and understand changes that could have further unintended consequences. Technology has progressed so rapidly that the available tools, some with roots in the 1980s, are unsuited to the current century.

Market participants and regulators need to understand complex electronic markets, which are capable of potentially dangerous behavior. May 6 is the poster child, but professional traders report a continuing series of "mini flash crashes" in stocks and other securities; shares of Apple (AAPL) experienced one earlier this month. They have become sufficiently common that the phrase "flash crash du jour" comes up on trader websites.
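The phenomenon traders describe is simple to characterize, if hard to prevent: a price collapses and rebounds within seconds. As an illustration only — the thresholds, function, and tick data below are hypothetical, not any regulator's definition — a naive screen over a tape of trades might look like this:

```python
from datetime import datetime, timedelta

def find_mini_flash_crashes(ticks, drop_pct=10.0, window_secs=5.0):
    """Flag spans where price falls more than `drop_pct` percent from
    a recent print within `window_secs` seconds.

    `ticks` is a list of (timestamp, price) pairs in time order.
    Thresholds here are illustrative, not a regulatory standard.
    """
    events = []
    for i, (t_hi, p_hi) in enumerate(ticks):
        for t_lo, p_lo in ticks[i + 1:]:
            if (t_lo - t_hi).total_seconds() > window_secs:
                break  # beyond the look-ahead window
            if (p_hi - p_lo) / p_hi * 100.0 >= drop_pct:
                events.append((t_hi, p_hi, t_lo, p_lo))
                break  # one event per starting tick is enough
    return events

# Hypothetical ticks echoing the Accenture prints of May 6:
# a $40 stock printing at one cent three seconds later.
t0 = datetime(2010, 5, 6, 14, 47, 0)
ticks = [
    (t0, 40.00),
    (t0 + timedelta(seconds=1), 38.50),
    (t0 + timedelta(seconds=3), 0.01),
    (t0 + timedelta(seconds=8), 39.75),
]
for start, hi, end, lo in find_mini_flash_crashes(ticks):
    print(f"{hi:.2f} -> {lo:.2f} in {(end - start).total_seconds():.0f}s")
```

A real surveillance system would of course work on consolidated multi-venue data at millisecond resolution; the point of the sketch is how little it takes to state the pattern, versus how much it takes to watch every security on every venue for it in real time.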

This isn’t just about arcane systems glitches. If flash crashes were to become more frequent, the market’s ability to provide capital, and with it the broader economy, would suffer.

How did we get here? Why are our markets surprising us so often? The world’s complex, interconnected, ever-faster electronic markets were not designed; they just happened. And they demonstrate a worrisome capacity for flaky behavior. These “big data” markets are hard to understand, and even harder to analyze. It takes many months just to get traction on analyzing a single incident. Simulating possible incidents isn’t on the menu.

CFTC Chairman Gary Gensler said one reason the review took so long is that collecting and analyzing all of the May 6 trading data was an “enormous” effort. SEC Chairman Mary Schapiro estimated the flow rate of the data stream: “We need… capability to receive something on the order of 20 terabytes of data in a month,” Schapiro said. A great deal of the nation’s ability to deal with crises of this sort will depend on technological innovation.
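For scale, 20 terabytes per month is a modest sustained average by supercomputing standards; the hard parts are the intraday bursts and the analysis, not the raw average. A back-of-the-envelope check (my arithmetic, not the SEC’s):

```python
# What does 20 terabytes per month mean as a sustained data rate?
TB = 10 ** 12                # bytes in a (decimal) terabyte
month_secs = 30 * 24 * 3600  # seconds in a 30-day month

rate = 20 * TB / month_secs  # average bytes per second
print(f"{rate / 1e6:.1f} MB/s sustained")  # roughly 7.7 MB/s
```

Peak message rates on a volatile day run orders of magnitude above that average, which is exactly why averages understate the engineering problem.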

There is a long sad history of large-scale federal information technology flame-outs. Tapes buried in the tundra. Billions of dollars have gone into the bit bucket. We need to make sure we get this one right.

Big data doesn’t always fail

There are also, encouragingly, examples of successful “big data” technological transformations in other fields, particularly astronomy and the earth sciences. Volumes of data many times larger are collected, curated, and made available quickly for modern approaches to visualization, simulation, and analysis. Literally billions of dollars in federal research funding have flowed into these areas going back to the 1960s.

Fields once characterized by fragmented, scattered, incompatible data have “been there and done that” on their big data challenges. Supercomputers and software literally millions of times more capable than current financial tools have played key roles.

Regulators hoping to make a similar leap in technology should start by considering a broader set of market risks, and clearly spelling out the right questions we need future analysis systems to answer. Then let the technology come from that, not from what’s on the shelf. Here are some questions to get started:

  • Enforcement. How can you spot a market manipulator who works in microseconds, working similar scams across markets and different securities?
  • Systemic Structural Risk. How can we know whether unanticipated interactions between market centers, even when each system is operating as designed, have become a source of systemic risk? Much of the flash crash analysis sits here.
  • Systemic Implementation Risk. The same question as above, but recognizing that markets run on real computers, with delays, crashes, race conditions, slow-downs, and all the ailments and errors of real plugged-into-the-wall IT machinery. Some people argue this played a role in the big crash; no one argues that it couldn’t cause the next one.
  • Policy. Can we simulate, analyze, model and visualize what would happen if we make changes in the rules? Avoid unintended consequences.
  • Financial cyber-attack. One of the worst calls the heads of the SEC/OFR/CFTC could get is “Are our markets under attack?” If that happened, test probes would certainly precede it. Could we know if that was happening in time to take any action?

A business-as-usual update of fragmented, problematic systems is a recipe for disaster. I know a guy with a “Send lawyers, guns and money” ringtone. Substitute “accountants” for “guns,” and I fear we have a description of the likely approach to tackling the challenges of 21st century financial information. We need to recognize this for what it is — one of the most important computer science challenges we face — and we need to expand the team to include the A-list of “big data” supercomputing if we hope to succeed.

David Leinweber is a pioneer in electronic markets, the author of Nerds on Wall Street: Math, Machines and Wired Markets (Wiley, 2009), and principal of Leinweber & Co. His professional interests focus on how modern information technologies are best applied in trading and investing, and how technology affects global financial markets.
In 2010, as a public service role, he founded the nascent Center for Innovative Financial Technology at Lawrence Berkeley National Lab in Berkeley, California, to help the nation’s most advanced computational researchers improve understanding of financial markets.
