Here is a great series on High Frequency Trading. I was most intrigued by this:
It’s important to note that market making is nothing new. In the era when stocks were traded in 1/8ths and 1/16ths, market making was done by humans working in the pit. A single human trader would often run a market making strategy on a larger stock with significant volume. Later on, from the 1980s to the early 2000s, human daytraders often filled this role. To a much lesser extent, they still do.
Automated trading systems have replaced these human market makers for a very good reason – cost. For a strategy to be worth a financial professional’s time and effort, it must generate at least $20k–$200k in profit each year (this assumes a human smart enough to daytrade would work for $20k/year; note also that this strategy covers only a few securities, since no human can track hundreds of stocks mentally). In contrast, a single server in a data center can run hundreds of strategies at a cost closer to $50k/year, and it can do so faster and more accurately than any human.
…Suppose that at precisely 10:31:30.000 AM, new information becomes available which suggests that it will now be profitable to place a buy order at $20.07 – perhaps a press release has hinted that the price will go up, or a correlated security has just gone up in price. Because of this, both Mal and Jayne want to change the price on their orders to $20.07. Whoever happens to be fastest will rise to the top of the book:
This is why automated market making has morphed into high frequency trading, and why so much effort is poured into creating low latency systems. Whoever places their order first will be the most likely to trade.
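The priority rule described above – price first, then time of arrival as the tiebreaker – can be sketched with a toy order book. This is a minimal illustration, not any exchange's actual matching engine; the trader names follow the article's example, and arrival order stands in for a real timestamp.

```python
import heapq
from itertools import count

arrival = count()  # arrival order stands in for a nanosecond timestamp

def place_bid(book, trader, price):
    # Price-time priority: higher price sorts first (hence the negation),
    # and among equal prices, the earlier arrival wins the tie.
    heapq.heappush(book, (-price, next(arrival), trader))

book = []
place_bid(book, "Zoe", 20.05)    # a resting, lower bid
place_bid(book, "Mal", 20.07)    # Mal reprices first...
place_bid(book, "Jayne", 20.07)  # ...Jayne is a beat slower

neg_price, _, trader = book[0]
print(trader, -neg_price)  # Mal sits at the top of the book at 20.07
```

Because Mal and Jayne quote the same price, only the arrival counter separates them – which is exactly why shaving microseconds off order placement is worth so much.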
It’s interesting that progress in this market is defined as the degree to which machines talk to and understand each other.
The immediate ability to profit from technological advances means computers will be autonomously driving market liquidity before they’re driving cars.