What is High Frequency Trading?
Ask a dozen traders and you’ll get a dozen different answers. Some say it involves program trading, stat arb and quantitative modeling coupled with extremely low latency and high frequency trade execution. Others say it is where alpha generation is purely based on speed.
Quants define it as "exceptional speed + algorithms or automatic strategies." But who gets to determine whether your speed is "exceptional," or whether you are using enough algorithms to make the cut?
There are even nuances to the technical definition. Does a trade have to execute for it to be considered high frequency trading? What about liquidity that was never meant to trade – if you put an order out to influence other participants, but pull it back before it executes, does that still count? What about sponsored access? Are you a high frequency trader if you are simply the conduit, not the execution venue?
The truth is that as an industry, we may never standardize on a definition.
However, a few things do seem constant. You need a fully automated system to be considered a high frequency trader, and most people agree the label has more to do with the percentage of your volume traded using high frequency strategies than with your total volume overall. This is interesting because it puts companies like Lime Brokerage in the same category as Goldman Sachs, even though the two sit at opposite ends of the size spectrum.
How Did We Get Here?
To really understand high frequency trading, we must understand where the industry has come from. Advances in computing power have increased the number of firms able to take advantage of pricing inefficiencies across different markets, and changes among liquidity providers – particularly new entrants like Knight Capital, Liquidnet and BATS – have opened the floodgates for many high frequency firms.
Before decimalization, you had to be registered with an exchange to make a market, either as a floor market maker (e.g., an NYSE specialist) or an "upstairs" market maker (e.g., a Nasdaq market maker). As a market maker you were subject to regulation and capital requirements in return for the right to capture the bid-ask spread, and only registered market makers were allowed to quote both sides of the trade (bid and ask) at the same time. Now anyone can operate on both sides of the market and act like a market maker. This freedom, combined with the democratization of access to new technologies, has created a host of opportunities.

In 2001, one to two seconds was fast, but the continual increase in computing power at ever-decreasing cost has led to an arms race to see who can build the fastest, lowest-latency infrastructure. In a market structure that determines the priority of orders by price and time, being first in line means getting the trade done. This ever-increasing speed means that turnaround times are now measured in single milliseconds – or sub-millisecond for the largest players with co-located servers at the exchanges.
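The price-time priority mechanic is easy to see in a few lines of Python. This is a minimal sketch of one side of a limit order book, not any real exchange's matching engine; the names (`add_bid`, `firm_A`, and so on) are illustrative inventions.

```python
import heapq

# Bid queue under price-time priority: best price wins; at equal
# prices, the earlier arrival (lower sequence number) fills first.
book = []

def add_bid(price, seq, owner):
    # Negate the price so the min-heap pops the highest bid first;
    # seq breaks ties in favor of earlier arrivals.
    heapq.heappush(book, (-price, seq, owner))

add_bid(10.00, seq=1, owner="firm_A")   # same price, arrived first
add_bid(10.00, seq=2, owner="firm_B")   # same price, arrived later
add_bid(9.99,  seq=0, owner="firm_C")   # earliest order, but worse price

# An incoming sell order matches against the front of the queue:
_, _, winner = heapq.heappop(book)
print(winner)  # firm_A: price beats time, and time breaks price ties
```

Note that firm_C loses despite quoting first, because price is ranked ahead of time; firm_B loses to firm_A only on time. Shaving microseconds off that arrival sequence is precisely what the arms race described above is about.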
People have also been leaving larger firms to start small shops, and by capitalizing on advances in technology they have been able to achieve the same results. The result is a rapid increase in the number of newer, smaller firms competing with similar strategies and ideas. As is usual in the markets, even the small inefficiencies found today will soon be gone, and spreads will continue to narrow until they all sit at the $0.01 minimum tick. That is the mark of a competitive market and good for retail investors, who see lower transaction costs from the tighter bid/ask spread.