The Gateway to Algorithmic and Automated Trading

Conference Take-aways: High Frequency Trading for Fund Management Firms 2011

Crowded trades, indifferent regulators, diminishing margins, the latency arms race, and the need to get the right kind of silicon enhancements were just some of the challenges being discussed at High Frequency Trading for Fund Management Firms 2011. Conference chairman and regular Automated Trader columnist, Bob Giffords, gives his detailed report from the event, and analyses the discussions around latency, regulation, risk, and market structure.

Spare a thought for the poor high frequency trader (HFT) facing ever-increasing costs, diminishing returns and commoditisation of his precious technology. Not only are the regulators out to spoil his fun, but margins are being steadily arbed away, as ever more robotraders and bulge bracket brokers crowd in. Pitfalls are everywhere: quote fades, rapid tick oscillations, and micro trends and reversions that are nearly impossible to catch. So it may not be surprising that HFT rates are levelling off somewhat, in the US in particular. Such were the messages at Hanson Wade's first foray into the high frequency conference space. So we realised the market was just doing its job, as we spent two action-packed days learning how to forage in the microsecond undergrowth once all the low-hanging fruit has been plucked - a real practitioner's treat.

Some long and short-term algo traders who risk overnight capital were also said to be struggling a bit with crowded trades and unnatural markets distorted allegedly more by quantitative easing and regulatory interference than by high frequency rocket scientists. However, there were lots of suggestions on how to avoid the squeeze, and one or two specialists using artificial intelligence techniques appeared to have found a way round the problems. So hope springs eternal as learning curves go exponential.

Moreover, there was light at the end of the FIX tunnel too, with a new, high-speed encoding layer in the pipeline as an alternative to legacy tags and XML formats. For hardware geeks, 2 GHz FPGA clocks are already on the drawing boards, promising 4x upgrades using hybrid technologies in another year or two, although software remains an issue. Still, every co-location centre will want one. Nanoseconds here we come!

There was, however, a minor 'dust up' over picoseconds, when I suggested playfully that they too might enter the reckoning in a few years' time if co-location centres ever move onto large silicon wafers. Several speakers scoffed at such talk due to the fixed nature of the speed of light. Well, I know a couple of very serious technologists who want to make it happen. Talk of single digit microseconds was itself science fiction not so long ago. So watch this space!
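The speed-of-light objection is easy to make concrete. Here is a back-of-the-envelope sketch in plain Python, nothing more than arithmetic, showing why picoseconds only start to matter once signal paths shrink to wafer scale:

```python
# Back-of-the-envelope: how far a signal at light speed travels per
# unit of time. Real signals in fibre or copper are slower still
# (roughly two-thirds of c), which only strengthens the point.
C = 299_792_458  # speed of light in a vacuum, metres per second

def light_distance_m(seconds: float) -> float:
    """Distance in metres that light covers in the given time."""
    return C * seconds

print(f"1 microsecond -> {light_distance_m(1e-6):7.2f} m   (a co-location hall)")
print(f"1 nanosecond  -> {light_distance_m(1e-9) * 100:7.2f} cm  (a server rack)")
print(f"1 picosecond  -> {light_distance_m(1e-12) * 1000:7.2f} mm  (a silicon wafer)")
```

In other words, a picosecond of timing advantage corresponds to well under a millimetre of path length, so it only enters the reckoning if the whole trading loop lives on a single die or wafer, which is exactly the scenario floated above.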

Several speakers challenged us to 'bring the data' for any claims made about high frequency trading. Too many myths were being circulated, even by regulators, without proper evidence. So lots of facts and figures were quoted to argue that the low latency jockeys had narrowed spreads and reduced execution times, normal intraday volatility, cross-overs in the consolidated book, and information leakage. While cancellation ratios had increased, this was said not to be a problem for traders who adapted their strategies, even without co-location. Yes, trade sizes had come down, although this was levelling off and was mainly due to execution algorithms and regulatory changes in tick sizes rather than HFT. Data was presented by several speakers showing that costs had indeed come down as well, even allowing for volatility and size. Yes, there was gaming from time to time, like quote stuffing, momentum ignition or blind pinging, but ordinary traders were at it as well, and algorithms just needed to be smarter. Traders needed to challenge their brokers or algorithm suppliers to demonstrate how they controlled for such factors.

The flash crash was dismissed as mainly a problem of weak market microstructure that would be corrected by the elimination of stub quotes and naked access, the imposition of price collars and circuit breakers, and more robust exchange technology. Speakers were confident it wouldn't have happened in Europe because of our volatility interruptions and absence of trade-through rules. The market had apparently matured hugely over the past year in any case, with much more attention to fair value assessments, risk and volume spikes designed into the algorithms.

One speaker showed how exchanges were fighting back against the new competitors to regain the best execution laurels. He documented how MTF owners had apparently preferenced their own platforms to provide liquidity in order to attract flow, but that last year best execution was already moving back to the exchange. Indeed liquidity appeared to move around between venues quite freely, even intraday. The need to 'go with the flow', rather than rely on historical statistics, was stressed time and again.

The long arm of the law

So is best execution then working in Europe? Here there was more debate, since too many small brokers were still focusing on the local exchange and ignoring the MTFs. Regulators needed to enforce the MiFID rules, not just complain about them. The number of small, start-up prop traders and hedge funds in the room seemed to justify the view that HFT had seriously opened the market to competition and that proper smart order routing did not require deep pockets. Fast-enough technology is becoming commoditised, so costs are coming down. How best to choose technology partners was now the issue, and many tricks of the trade were keenly debated, including things like pre-coded responses to further reduce latency.
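To illustrate why routing smarts need not require deep pockets, here is a minimal smart-order-routing sketch in Python. The venue names, prices and latencies are invented for the example, and a production router would also weigh fees, fill probability and market impact:

```python
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    ask: float        # best offer currently shown on that venue
    latency_us: int   # measured round-trip time to the venue, microseconds

def route_buy(quotes: list[Quote]) -> str:
    """Pick the venue for a marketable buy order: best price first,
    lowest measured latency as the tie-breaker."""
    best = min(quotes, key=lambda q: (q.ask, q.latency_us))
    return best.venue

book = [Quote("LSE", 100.02, 180),
        Quote("Chi-X", 100.01, 210),
        Quote("BATS", 100.01, 95)]
print(route_buy(book))  # BATS: joint-best price, lower latency
```

Even this toy version captures the conference point: the routing decision itself is cheap; the expensive part is keeping the price and latency inputs fresh.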

Some interesting points emerged from the debate over regulation and government intervention. Large HFT players were apparently not afraid of regulation, and some might actually welcome it as a barrier to competition. Inevitably it would add to costs and hit investors, of course, but did the regulators really care? So it looks like the UK rules may be tightened here.

Small traders were, however, less sanguine, since they lacked the manpower to cope with thousands of pages of rules. Some market centres did not welcome regulation either, and felt comfortable regulating their own members, who were becoming much more international. Traders too seemed happy to put the onus back on the exchanges. Perhaps they just realised that regulation could discourage liquidity and impose competitive disadvantages. Few participants apparently welcomed transaction taxes, minimum resting times for orders, limits on order-to-trade ratios or quoting obligations for market makers. One market maker admitted he might make money on longer resting times, but still did not want them, since spreads would widen. He could also live with maximum order-to-trade ratios, though they might limit liquidity. Speakers appeared more resigned to pre-trade risk checks, however, and one even felt market-making obligations might actually help. Yet studies were quoted showing that taxes, restrictions and rulebooks would just widen spreads and increase costs.
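The order-to-trade ratio debate is simple to state in code. Here is a minimal sketch of how a firm might monitor its own ratio against a cap; the 100x limit is a hypothetical figure for illustration, not any venue's actual rule:

```python
from dataclasses import dataclass

@dataclass
class SessionStats:
    orders_sent: int = 0   # new orders plus cancels and amendments
    trades_done: int = 0   # actual executions

def order_to_trade_ratio(stats: SessionStats) -> float:
    # Avoid division by zero before the first fill of the session.
    return stats.orders_sent / max(stats.trades_done, 1)

def within_limit(stats: SessionStats, max_ratio: float = 100.0) -> bool:
    """True if the session stays under a hypothetical venue cap."""
    return order_to_trade_ratio(stats) <= max_ratio

stats = SessionStats(orders_sent=5_000, trades_done=40)
print(order_to_trade_ratio(stats), within_limit(stats))  # 125.0 False
```

The market maker's ambivalence above falls straight out of this arithmetic: a quoting strategy that refreshes prices constantly sends many orders per fill, so a hard cap forces either wider, stickier quotes or less quoting.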

One frustrated delegate questioned whether the regulators really wanted to hand global leadership to the emerging markets "on a plate". Speakers seemed to agree that there was very little real information around on which to base the legislation. "Bring the data" once again! Sadly there were no regulators present to listen or respond, but better trade reporting classifications are at least on the regulatory agenda.

There was also an excellent review of Basel III and the global regulatory response to the financial crisis. The prospects are not particularly hopeful. Many weaknesses of Basel II still remain, with considerable room for national discretion and escape clauses and lots of politics. Moreover, the additional demands for liquidity would apparently not have prevented the 2008 banking crisis, although they will support public sector debt issuance. Is that the real agenda? Will the dodgy models work any better this time around? Will the regulators be able to make sense of the huge volumes of data they are collecting? The jury apparently is still out.

Horses for courses…

So how are people coping with diminishing returns? If transparency was driving competition, perhaps traders could hide in the dark pools. Indeed, price improvement was shown to lie in the dark, even with orders for as little as 5% to 10% of average daily volume. This apparently applied to broker crossing networks as well as the classic buy-side-only venues.

For lit markets there appeared to be many answers: smarter execution algorithms, multi-asset trading, globalisation and just more complex, less obvious investment strategies.

Speakers suggested many smart techniques for recognising fair value pricing and dodging the dodgy flow. This included such things as using sweep orders instead of resting orders, volume limits instead of constant presence, or careful preferencing of venues. Another speaker suggested layering the book. However, all of these strategies tend to limit the eligible liquidity, so common sense needs to balance the risks of adverse selection against execution cost.

With everyone using HFT tools, the focus has to shift to what is 'fast enough' for your strategy, tied to ultimate business performance end-to-end, not some arbitrary technology benchmark. For this, measuring slippage and 'predictable execution' was described as crucial. Everyone seems to be focused at the moment on jitter and 'deterministic latency', since 'time is capital'. One speaker noted that faster markets were also more deterministic. Another warned that latency was massively fat-tailed, so beware of overconfidence. A third argued that smarter algorithms with longer latencies might also achieve higher hit rates, so fast does not necessarily imply better. Moreover, today's optimisation could become tomorrow's bottleneck.
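The fat-tail warning is worth making concrete. The toy Python sketch below draws synthetic round-trip latencies from a heavy-tailed (log-normal) distribution, purely to show how far a high percentile can sit from the median; real figures would of course come from timestamping one's own order flow:

```python
import random
import statistics

random.seed(42)  # reproducible toy data

# Synthetic round-trip latencies in microseconds: tightly clustered
# most of the time, with a heavy upper tail.
samples = [random.lognormvariate(mu=3.0, sigma=0.8) for _ in range(100_000)]

def percentile(data: list[float], p: float) -> float:
    """Nearest-rank percentile of an unsorted sample."""
    ordered = sorted(data)
    k = round((p / 100) * (len(ordered) - 1))
    return ordered[k]

mean = statistics.fmean(samples)
p50 = percentile(samples, 50)
p999 = percentile(samples, 99.9)
print(f"mean={mean:.1f}us  median={p50:.1f}us  p99.9={p999:.1f}us")
```

With these parameters the 99.9th percentile lands an order of magnitude above the median, so tuning to the average alone invites exactly the overconfidence the speaker warned about.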

The risks of using hyper-fast trading engines were then compared to ABS brakes on cars: in the wrong conditions, such as loose snow, ABS brakes can actually be dangerous and should be switched off. HFT tools can be similarly risky in the wrong market regime. The old adage of 'horses for courses' still apparently applies. Even stripped down, real-time operating systems may be slower than standard versions in the wrong conditions. Understanding your tools is key.

Where next?

The growth of asset classes beyond equities was also explored. Although FX boasted the highest electronic trading levels, when measuring only the HFT element, futures and options had the highest HFT rates, followed by FX, fixed income and then commodities.

Regarding globalisation, there was a lot of focus on Asia, although Latin America was also maturing quickly, and Russia and Africa both merited attention. Here speakers focused on the key success factors for HFT markets: modern technology infrastructure, accessible co-location, liquid stock lending and futures markets, central counterparty clearing and even cheap money for leverage. Fragmentation and regulation encouraging competition were obviously helpful but, with cross-asset, cross-border strategies, not always necessary. One HFT trader claimed he was managing to operate in all sorts of regulatory cultures. Pricing was less competitive, but he could still make money.

There was growing recognition that with all this cross-asset, cross-border trading participants are witnessing a highly correlated global market. This was making it increasingly difficult to diversify portfolios and pushing investors to look at emerging markets and new strategies.

Outsourcing and technology partnering was another important subtext of the conference. Many key services were highlighted: real time interoperability with multiple prime brokers; connectivity to multi-asset global markets and a wide range of OMS, EMS and broker algorithms; high quality reference, valuation and historical data on demand; full intraday position keeping; and problem alerts and workflow tools. Linking all this together successfully is becoming increasingly difficult, so firms are obviously looking to leverage the specialist knowledge and economies of scale of their partners. The critical ability to keep pace with change and protect one's intellectual property was also discussed.

Best practice tutorials

Given all of these trends, how is the market evolving? Technology policies have shifted from in-house builds, first to 'buy and modify', and now to renting services, as the range of specialist knowledge and connectivity increases and cycle times reduce. The nearer we get to zero latency, the shorter the life cycle, apparently. Market membership costs have also come down, and there is a growing choice of real-time and historical data providers, some very expensive, others much less so. Increased pressure on costs due to flat market volumes has forced many participants to sell services to recover their high technology investments.

For ultra high frequency traders, direct memberships, sponsored access and co-location are the only options, but the demands are ramping up, and regulation will only add to them. Also important are skills for managing split co-location trading engines and direct strategy access (DSA): remote, hands-on control of algorithms running on a third party's infrastructure. This means that developers in emerging markets like Russia, China or India can run co-located algorithms in Europe or the US to compete with the locals as never before.

Exchanges too have broadened their revenue streams to data, technology services and co-location offerings as trading commissions have come down. This has encouraged them to woo the global high frequency traders, creating potential conflicts of interest, which the exchanges are trying hard to avoid.

There was a sense that the crowd competing in each venue is becoming much more diverse with a much broader cultural background. The market was described as a 'social medium' so understanding how traders behave is crucial to good algorithmic design, but so is the understanding of markets. One fascinating talk showed how the dynamics of European markets change during the daily cycle as firms first do price discovery and portfolio rebalancing, and then cope with macro news releases, the opening of the US markets and finally the close. Each market has its own profile that needs to be mastered, since algos need to be sensitive to these changes in conditions.

One speaker warned that many firms lack the capacity to customise algos, and the resulting lack of diversity might increase the risk of crowded trades. Others described the importance of tightly coupling backtesting, benchmarking and adaptive learning systems to trading engines, in order to calibrate performance to market conditions and become more agile. One speaker highlighted the growing interest in using elementised news feeds and social media sources to gain perspective on what was happening, or early warning of what might happen.
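Coupling benchmarking back into the trading engine can be as simple as a feedback loop over a strategy parameter. A toy Python sketch, in which the 'aggressiveness' parameter, target hit rate and step size are all invented for illustration:

```python
def recalibrate(aggressiveness: float, hit_rate: float,
                target: float = 0.6, step: float = 0.05) -> float:
    """Nudge a strategy parameter toward a target hit rate.

    Raise aggressiveness when fills fall short of the target, lower
    it when they exceed it, clamped to the [0, 1] range.
    """
    delta = step if hit_rate < target else -step
    return min(1.0, max(0.0, aggressiveness + delta))

aggressiveness = 0.5
for observed in (0.40, 0.45, 0.70, 0.65):  # hit rates from successive runs
    aggressiveness = recalibrate(aggressiveness, observed)
print(round(aggressiveness, 2))  # back to 0.5 after two nudges each way
```

Real adaptive systems would learn from far richer signals than a single hit rate, but the shape is the same: measure, compare against the benchmark, adjust, repeat.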

However, perhaps the most memorable point made during the conference was also the least relevant. One speaker noted that the massive Japanese earthquake near Fukushima knocked 1.8 microseconds off the length of the earth's day. Clearly that might wind up being more important than all the rest. Whatever the implications, the Hanson Wade conference gave us all much to ponder.