HFT and Network Latency

By Deane Barker on September 2, 2012

The emphasis on High Frequency Trading (HFT) in stock markets is getting absurd.  The New York Times estimates that it now accounts for 60% of all trading, and the emphasis seems to be on getting network latency lower and lower, so that computers can get in and out of trades faster.

To understand this, you have to know that algorithms (“algos”) will often make a trade and then make another one within the same second.  I remember when “day trading” was all the rage, and the old-school traders were scoffing at the idea of getting in and out of a stock in the same day.

Obviously, if you’re looking to execute trades inside the same second, you have to have fast computers.  More important, you have to have a fast link to the computers belonging to the stock exchange itself.

We’ve discussed this before: a couple of months ago, we talked about how trading firms fight for space in a data center in Mahwah, New Jersey, so they can be in close proximity to the NYSE’s computers.

However, there’s still the problem that you have to tell your computers what to do, and if you’re far away, that can take too long.

[…] sending a message that distance, using the fastest fibre-optic route between Chicago and New Jersey that I know of, takes around 16 milliseconds. That’s a huge delay: you might as well be on the moon.

Sixteen milliseconds = “on the moon.”
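For a rough sanity check on that 16-millisecond figure: light in optical fiber travels at roughly c divided by the glass’s refractive index (about 1.47), and real fiber routes follow railroads and rights-of-way, so they’re much longer than the straight-line distance. A sketch, where the route length is my assumption, not a figure from the article:

```python
# Back-of-the-envelope check on the ~16 ms Chicago-New Jersey figure.
C_VACUUM_KM_S = 299_792                      # speed of light in vacuum, km/s
FIBER_INDEX = 1.47                           # typical refractive index of fiber
FIBER_SPEED = C_VACUUM_KM_S / FIBER_INDEX    # ~204,000 km/s in glass

# Assumed fiber route length; the straight-line distance is ~1,150 km,
# but fiber zig-zags along existing rights-of-way.
route_km = 1_600

round_trip_ms = 2 * route_km / FIBER_SPEED * 1000
print(f"{round_trip_ms:.1f} ms round trip")  # comes out just under 16 ms
```

The point of the exercise: the delay is almost entirely physics, not slow hardware, which is why the only way to cut it is a shorter or faster path.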

This article from Wired is about how this delay is coming down.

The fastest communication between New York and Chicago would be line-of-sight through the air, which requires a chain of microwave relay towers. Tradeworx is building such a network, as is McKay Brothers, a California firm that hopes its system will be the fastest, with a round-trip latency of less than 9 milliseconds.
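A quick check on why microwave beats fiber, and why sub-9-millisecond round trips are physically plausible: the signal moves through air at essentially the full speed of light, along a near-straight path. The distance below is an approximate great-circle figure, my assumption rather than the article’s:

```python
# Why line-of-sight microwave wins: full speed of light, near-straight path.
C_KM_S = 299_792                 # speed of light, km/s (air is ~the same)
straight_line_km = 1_145         # approximate Chicago-New York great circle

round_trip_ms = 2 * straight_line_km / C_KM_S * 1000
print(f"{round_trip_ms:.2f} ms round trip")  # roughly 7.6 ms
```

The gap between that ideal ~7.6 ms and the quoted “less than 9 milliseconds” is the real-world overhead: tower-to-tower zig-zag and relay equipment delay.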

That’s Chicago to New York.  The next frontier is New York to London, which – awesomely – involves sharks and armor-plating.

Meanwhile manufacturers have begun making cable for a new New York–London link intended to shave 311 miles off the usual distance and cut the round-trip message time from 65 milliseconds to just under 60. It will do this by taking a great-circle route, traversing the shallow Grand Banks off Newfoundland. Most transatlantic cables head straight for deep water, to get away from sharks. In what some might consider a case of karmic justice, sharks threaten the financial industry by biting its cables, attracted by the electromagnetic fields generated by the wires that power the amplifiers at intervals along their length. Along the continental shelf, cables must be expensively armored against sharks and if possible buried to avoid damage from anchors and fishing trawls. The new cable will be armored for about 60 percent of its length, to take advantage of the shortest possible route.
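The numbers in that quote check out. Shaving 311 miles off a fiber route should save roughly 5 milliseconds round trip, which is exactly the 65-to-just-under-60 drop described. A sketch, assuming the usual ~c/1.47 propagation speed in fiber:

```python
# Does cutting 311 miles of fiber really buy ~5 ms round trip?
C_KM_S = 299_792
FIBER_SPEED = C_KM_S / 1.47      # ~204,000 km/s in glass

miles_saved = 311
km_saved = miles_saved * 1.609   # miles to kilometers

saved_ms = 2 * km_saved / FIBER_SPEED * 1000
print(f"{saved_ms:.1f} ms saved round trip")  # just under 5 ms
```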

Absurd?  Sure.  But here, folks, is the grand finale – the kicker that takes the cake.  Inside the data center at Mahwah, the physical proximity of your server matters, as does the length of the cable that connects it.

Traders pay to put their servers in the same building, and to make things fair, engineers scrupulously add extra lengths of cable to equalize the runs among all the servers. Yes, we are talking about a few feet plus or minus. At nearly the speed of light.

Adding a few feet could put you ahead or behind. Seriously.
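To put “a few feet” in perspective: light in a vacuum covers about one foot per nanosecond, and a signal in fiber or copper moves at roughly two-thirds of that. So a few feet of cable is a few nanoseconds, which is the scale these firms are fighting over. A sketch:

```python
# How much delay is one foot of cable, assuming signals propagate at
# roughly two-thirds the speed of light (typical for fiber and copper)?
C_M_S = 299_792_458              # speed of light, m/s
FOOT_M = 0.3048                  # one foot in meters

signal_speed = C_M_S * 2 / 3     # rough propagation speed in cable

ns_per_foot = FOOT_M / signal_speed * 1e9
print(f"{ns_per_foot:.2f} ns per foot of cable")  # about 1.5 ns
```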
