It’s been an eventful week in the capital markets, as four fairly large dislocations sent traders scrambling.
I’m not going to comment on the Fannie/Freddie bailout, or on the glitch at the LSE that halted trading for much of Monday. And I’ll leave it to Paul Kedrosky at “Infectious Greed” to draw a connection between the Lehman announcement and the first test of the Large Hadron Collider.
But let’s look at the darkly comical story of the United Airlines stock plunge at 11am Monday. The catalyst: a reporter at Income Securities Advisor re-posted a 2002 story about United’s bankruptcy filing to a wire service, which published it on thousands of Bloomberg terminals worldwide, flagged as “new” news.
At that point, both traders already skittish about United’s outlook and automated trading systems interpreting the headline negatively began rapidly selling the stock.
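To make the failure mode concrete, here is a minimal sketch of a hypothetical headline-triggered rule of the kind described above. Everything here is illustrative, not any actual vendor’s or trader’s system; the point is that the rule does exactly what it was told to do, and the UAL incident hinged on one missing check, namely whether the story was actually fresh:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical keyword list -- illustrative only.
NEGATIVE_KEYWORDS = {"bankruptcy", "chapter 11", "default"}

def should_sell(headline: str, ticker: str, story_date: datetime,
                watched_ticker: str = "UAUA",
                max_age: timedelta = timedelta(hours=1)) -> bool:
    """Naive headline rule: sell if a fresh negative headline names our ticker.

    The staleness check on story_date is precisely what the "new" flag
    defeated in the UAL case: a 2002 story arrived looking like fresh news.
    """
    if ticker != watched_ticker:
        return False
    text = headline.lower()
    if not any(kw in text for kw in NEGATIVE_KEYWORDS):
        return False
    # Reject stale stories: a 2002 filing resurfacing in 2008 fails here,
    # but only if the feed reports the true story date rather than "now".
    return datetime.now(timezone.utc) - story_date <= max_age
```

Note that the rule is only as good as its inputs: if the wire service stamps an old story with today’s date, the staleness check passes and the system sells, exactly as instructed.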
UAL stock took a dive as trading volume surged:
The ex-post finger-pointing was, of course, inevitable, but an inordinate share of the blame has been directed towards automated trading systems: “The use of algorithms — which allow computers to make decisions in fractions of a second — appears to be a main culprit in the UAL case”, claims Jonathan Spicer in a Reuters analysis piece. Quoting marketplace concerns about “taking human judgment out of the news processing”, Spicer concludes that “there’s little doubt the problem was exacerbated by automated trading.” Yet despite Spicer’s own lack of doubt, he presents no facts or evidence to support this subtle luddism.
To conclude that this market dislocation was caused by automated trading systems gone haywire is to ignore the answers to some fundamental questions. What were traders without automated systems doing? And were automated systems doing what their traders were asking them to do?
I spoke with an option trader yesterday who bought puts in UAUA shortly after the headline hit his screen, a quick decision made on incomplete information. Certainly, in retrospect, and in this specific instance, he wishes he had taken the time to call United for confirmation before making his trades. But had it been a “real” trading opportunity, it would likely have disappeared in the time it took him to dial the phone. Trading often requires us to make decisions on imperfect or even incorrect information, balancing speed against confidence. That concern is in no way limited to the world of automated trading systems.
Automation did not go haywire here. This was not a cousin of the once-feared Y2K bug, causing computer systems to behave unexpectedly or disastrously. There are no reports of automated systems doing anything except what their managing traders asked them to do in these precise circumstances; in fact they were doing exactly what other traders in the marketplace were doing or trying to do manually themselves.
Human traders who trade pathologically quickly on unconfirmed rumors ultimately find themselves at a competitive disadvantage, and before long are financially drummed out of the marketplace. The same holds, of course, for traders using any of the available tools: real-time data, electronic order entry, direct market access, automated trading systems, and so on. Traders, with or without automated systems, who were too aggressive in selling shares of United now find themselves at a financial disadvantage in the marketplace; traders who smartly took the other side of those trades are now better off. This process makes the marketplace stronger and more robust in the long run.
The world financial system is nothing if not astoundingly complex, and we must accept that eliminating all human error is impossible. We all remember how a few weeks ago Bloomberg resurrected Steve Jobs (in fewer than three days, no less). Fallibility is a constant, so it is imperative that we design our systems and train our human and algo traders to strike the correct balance between immediacy and confirmation. As we have said before, it’s not the technology that will make you money, it’s how you use it.