The Gateway to Algorithmic and Automated Trading

Fair Play

Published in Automated Trader Magazine Issue 35 Winter 2015

Major trading venues have historically championed the idea of a level playing field, but critics say the pitch can be decidedly uneven these days. Adam Cox reports on an intensifying debate over whether markets are indeed fair, and what that debate could lead to.

Eric Hunsader, Founder, Nanex

All may be fair in love and war, but how about the markets? In a world dominated by ultra-low latency technology, the spotlight on the fairness of the industry has never been brighter. Angry investors, lawsuits, media attention, calls for regulatory action, grandstanding by officials... rarely does a day go by when someone is not questioning whether the markets are fair and what is to be done about it.

The question itself is not so simple. There are myriad differences in regulations across asset classes and geographies, and there are complex market practices and even more complex technologies that all need to be dissected in order to take a view. Most questions of fairness come down to who can see what data, and when. But they can also concern market features such as order types and trading rebates that, critics allege, benefit some sectors of the market more than others.

As each cry of 'Unfair!' has grown louder, there appears to be momentum behind the would-be reformers, a small but prominent group of market practitioners and technologists who want to put pressure on market venues to alter their rules and foster more transparency.

Meanwhile, regulators have been talking a tougher game and levying more fines (although it is worth noting that a record $16 million fine recently slapped on HFT group Latour was not about fairness but rather concerned the firm's risk procedures). Whether all this will result in meaningful changes that significantly affect different groups of market participants is not entirely clear. What is clear is that the pressure shows no sign of dissipating.


One of those calling for reform is Eric Hunsader, a developer who founded data technology group Nanex. He has little trouble describing what he sees as the root of the problem, at least as far as US equity trading is concerned.

"The reality that is the core of all the market structure issues that we have is that there are two things that price the same stocks," Hunsader told Automated Trader.

One of those is the Securities Information Processor (SIP), which under Regulation National Market System (Reg NMS) is meant to receive data from all the exchanges in the United States and calculate the National Best Bid and Offer (NBBO) on a given share. The other is the direct data feed an exchange will sell to clients.
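The NBBO calculation itself is simple in principle: take the highest bid and the lowest offer across all reporting exchanges. A minimal sketch of that consolidation step, using made-up exchange names and quotes (this is an illustration of the concept, not the SIP's actual implementation):

```python
# Hypothetical sketch of how a SIP-style processor derives the NBBO
# from per-exchange quotes. All venue names and prices are illustrative.

def nbbo(quotes):
    """quotes: dict mapping exchange name -> (bid, offer) for one stock.
    Returns (best_bid, best_offer) across all exchanges."""
    best_bid = max(bid for bid, _ in quotes.values())
    best_offer = min(offer for _, offer in quotes.values())
    return best_bid, best_offer

quotes = {
    "NYSE":   (10.01, 10.03),
    "NASDAQ": (10.02, 10.04),
    "BATS":   (10.00, 10.03),
}
print(nbbo(quotes))  # (10.02, 10.03)
```

The real SIP must do this continuously for every listed security as quote updates stream in from each venue, which is where the aggregation delay at the heart of the debate arises.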

Hunsader said that when Reg NMS was passed in 2006, the idea was that direct feeds would provide additional information. "It was never thought that they would actually supply it faster. In fact, a lot of the language back then in 2006 was that information travels instantly, so there really wasn't any awareness that a tiny speed difference would cause the damage it has caused. But they were very clear in saying you can't give this core information faster to your direct feed customers than you can to the SIP."
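The arithmetic behind that "tiny speed difference" is worth spelling out. If a direct-feed subscriber receives a quote update before the SIP has finished aggregating and distributing it, there is a window in which the fast subscriber knows the new price while SIP consumers still see the old one. The figures below are entirely made up for illustration; actual latencies vary by venue and era:

```python
# Illustrative (made-up) timings showing why even a small speed gap matters.
# A quote update leaves the exchange at t=0. Direct-feed subscribers receive
# it after their line latency; the SIP adds aggregation and distribution delay.

DIRECT_FEED_LATENCY_US = 50      # hypothetical: co-located direct feed
SIP_TOTAL_DELAY_US = 500         # hypothetical: consolidation + distribution

window_us = SIP_TOTAL_DELAY_US - DIRECT_FEED_LATENCY_US
print(f"Direct-feed subscribers see the new price {window_us} microseconds "
      f"before SIP consumers do.")
```

In a market where order round-trips are measured in microseconds, a window of this kind is long enough to act on, which is the crux of Hunsader's complaint.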

Hunsader has established a reputation as one of the most vocal critics of HFT practices. Soon after the Flash Crash of May 2010, he began publishing analysis of the causes, ultimately pointing the finger at high frequency trading and differing sharply with the conclusions of an official SEC report. Along the way, he identified an issue that became crucial in a landmark fine of the New York Stock Exchange and one which is central to the current lawsuit...


Copyright © Automated Trader Ltd 2018 - Strategies | Compliance | Technology
