High frequency trading: how it happened, what’s wrong with it, and what we should do

Crossposted from mathbabe.org

High frequency trading (HFT) is in the news. Politicians and regulators are thinking of doing something to slow stuff down. The problem is that HFT is genuinely complicated, which makes it hard to understand in depth and to regulate in a nuanced way. Instead, if we want to do anything, we'll have to do something pretty simple and stupid.

How it happened

In some ways HFT is the inevitable consequence of market forces – one has an advantage when one makes a good decision more quickly, so there was always going to be some pressure to speed up trading, to get that technological edge on the competition.

But there was something more at work here too. The NYSE used to be a non-profit mutual, co-owned by every broker who worked there. When it transformed into a profit-seeking enterprise, and when other exchanges popped up in competition with it, the age of HFT began.

All of a sudden, to make an extra buck, it made sense to allow someone to be closer and have better access, for a hefty fee. And there was competition among the various exchanges for that excellent access. Eventually this market for exchange access culminated in the concept of co-location, whereby trading firms were allowed to put their trading algorithms on servers in the same room as the servers that executed the trades. This avoids those pesky speed-of-light issues when sitting across the street from the executing servers.

Not surprisingly, this has allowed the execution of trades to get into the mind-bogglingly small timeframe of double-digit microseconds. That’s microseconds, where, per Wikipedia: “One microsecond is to one second as one second is to 11.57 days.”
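The arithmetic behind that analogy is easy to verify:

```python
# Check the scale analogy: a microsecond is to a second
# as a second is to roughly 11.57 days.
MICROSECONDS_PER_SECOND = 1_000_000
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

days = MICROSECONDS_PER_SECOND / SECONDS_PER_DAY
print(round(days, 2))  # → 11.57
```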

What’s wrong with it

Turns out, when things get this fast, sometimes mistakes happen. I’m writing in the passive voice because we are no longer talking directly about human involvement, or even, typically, a single algorithm, but rather a sea of algorithms which together can do unexpected things.

People know about the so-called “flash crash” and, more recently, Knight Capital’s trading debacle, where an algorithm went crazy with orders at the opening bell. But people on the inside, if you point out these events, might counter that “normal people didn’t lose money” in these events. The weirdness was mostly fixed after the fact, and anyway pension funds, which is where most normal people’s money lives, don’t ever trade in the thin opening-bell market.

But there’s another, less well known example from September 30th, 2008, when the House rejected the bailout, short-selling was banned, and the Dow dropped 778 points. The prices of such common big-ticket stocks as Google plummeted and, in this case, pension funds lost big money. It’s true that some transactions were later nulled, but not all of them.

This happened because the market makers of the time had largely pulled their models out of the market after shorting became illegal – there was no “do this algorithm, except make sure you’re never short” button on the algorithm, so once the ban took effect, the traders could only turn it all off completely. As a result, the liquidity wasn’t there, and the pension funds, thinking they were being smart to do their big trades at the close, instead got completely walloped.

Before you go blaming the politicians on this one because the immediate cause was the short-sighted short-selling ban, keep this in mind: the HFT firms regularly pull out of the market in times of stress, or when they’re updating their algorithms, or just whenever they want. In other words, it’s liquidity when you need it least.

Moreover, just because two out of these three episodes were relatively benign for the 99%, we should not conclude that there’s nothing potentially disastrous going on. The flash crash and Knight Capital have had an impact: they erode our trust in the system as a whole. The 2008 episode, on top of that, proved that yes, we can be the victims of out-of-control machines fighting against each other.

Quite aside from the instability of the system, and how regular people get screwed by insiders (because after all, that’s not a new story at all, it’s just a new technology for an old story), let’s talk about resources. How much money and resources are being put into the HFT arena and how could those resources otherwise be used?

Putting aside the actual energy consumed by the industry, which is certainly non-trivial, let’s focus for a moment on money. It has been estimated that overall, HFT firms post about $80 billion in profits yearly, and that they make on the order of 10% profit on their technology investments. That would mean there’s on the order of $800 billion being invested in HFT each year. Even if we highball the return at 25%, we still have more than $300 billion invested in this stuff.
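The implied-investment arithmetic here is just profits divided by an assumed rate of return. A quick back-of-envelope sketch (the $80 billion and the return figures are the estimates above, not data):

```python
# Rough implied-investment arithmetic: yearly profits divided by an
# assumed rate of return gives the capital deployed to earn them.
profits = 80e9  # estimated yearly HFT profits, in dollars

for rate_of_return in (0.10, 0.25):
    implied_investment = profits / rate_of_return
    print(f"{rate_of_return:.0%} return -> ${implied_investment / 1e9:.0f} billion invested")
# → 10% return -> $800 billion invested
# → 25% return -> $320 billion invested
```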

And to what end?

Is it really worth that much to the small investor to have decreased bid-ask spreads when they go long Apple because they think the new iPhone will sell? What else could we be doing with $800 billion? A couple of years of this could pay off all of the student debt in this country.

What should be done

Germany has recently announced a half-second minimum for posting a share order. This is eons in current time frames, and would drastically change how trading is done. They also want HFT algorithms to be registered with them. You know, so people can keep tabs on the algorithms and understand what they’re doing and how they might interact with each other.

Um, what? As a former quant, let me just say: this will not work. Not a chance in hell. If I want to obfuscate the actual goals of a model I’ve written, that’s easier than actually explaining it. Moreover, the half-second rule may sound good, but it just means the system will be harder to game, not that it won’t be gameable.

Other ideas have been brought forth as to how to slow down trading, but in the end it’s really hard to do: if you put in delays, there’s always going to be an algorithm employed which decides whose trade actually happens first, and so there will always be some advantage to speed, or to gaming the algorithm. It would be interesting but academically challenging to come up with a simple enough rule that would actually discourage people from engaging in technological warfare.

The only sure-fire way to make people think harder about trading so quickly and so often is a simple tax on transactions, often referred to as a Tobin tax. This would force people to have enough faith in a trade that its expected value covers the tax.
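A minimal sketch of that logic, with made-up numbers (the `worth_trading` helper and both figures are purely illustrative, not from any actual proposal):

```python
# A per-transaction tax changes the decision to trade: a trade is only
# worth making if its expected edge exceeds the tax.
def worth_trading(expected_edge, tax):
    """Trade only when the expected profit beats the transaction tax."""
    return expected_edge > tax

# A tiny expected edge that pays off at HFT volume...
print(worth_trading(expected_edge=0.0004, tax=0.0))    # → True
# ...stops being worth it once even a small tax applies.
print(worth_trading(expected_edge=0.0004, tax=0.001))  # → False
```

The point is that a tax, unlike a speed limit, attacks the economics directly: strategies whose edge per trade is smaller than the tax simply stop being run.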

And we can’t just implement such a tax on one market, like they do for equities in London. It has to be on all exchange-traded markets, and moreover all reasonable markets should be exchange-traded.

Oh, and while I’m smoking crack, let me also say that when exchanges are found to have given certain of their customers better access to prices, the punishment for such illegal insider information should be more than $5 million.

One thought on “High frequency trading: how it happened, what’s wrong with it, and what we should do”

  1. Cathy, you’re close to the cause when you mention transaction costs and bid-ask spreads, and a financial transaction tax (FTT), or Tobin tax, on some types of trades would get rid of a lot of HFT, in my opinion. But if they charged market makers on every trade, that would be crippling, and in many ways a bad idea (the main one being that trading would migrate to a non-FTT area).

    I agree the half-second rule would still be gameable, and like the FTT most people who propose it seem to have little appreciation of how to make it practical. Same with “discretization” and other auction roadblocks.

    The real cause behind the rise of HFT, in my partly-educated opinion, is the abolition of fractions, a.k.a. “decimalization”. The main thing this change did was reduce the minimum bid-offer difference from 1/16 = 0.0625 to 0.01. That does save customers a lot, but it also means market making is less risky, since you can usually take only 0.01 of liquidity risk on your principal.

    But the stock markets also ban making prices that go out to more than two decimal places: you can’t trade at 5.007, even if you want to. That means that the person to get the trade isn’t the one with the best price, but whoever is fastest… hence an arms race (with little ultimate utility, IMO). It might have happened w/o decimalization too, but the obvious thing would be to kill the 0.01 rule or extend it to 0.0001. See http://www.chrisstucchio.com/blog/2012/hft_whats_broken.html
