Conference Take-Aways: HiFREQ TRADE 2011

28 February 2011 - Latency measured in nanoseconds, FPGAs, ongoing systemic risk and worries over knee-jerk regulatory changes were just some of the topics discussed at this year's HiFreq Trade Conference in London. Conference Chairman and regular Automated Trader contributor Bob Giffords offers his insights into the key points raised at the event.

There was a real buzz at the February HIFREQ TRADE conference this year as over 250 people debated the way forward for the most astonishing transformation ever seen in the capital markets. The technology was amazing, the regulatory threats 'scary', and the quantitative ambition, awe-inspiring. One got the impression we are now trading on the 'edge', with a shadowy abyss of flash crashes looming below as we scale boldly up to the complexities and speed challenges of 21st century global electronic markets. As I think back on the day, these are some of the many messages that stood out for me, from speakers and private discussions alike.

Black Holes Beckon

The big story this year was FPGA chips with everything, but especially for feed handlers, which moved together with multi-core trading engines onto the network cards to get a little closer to the wire. People are now talking about simple trading apps converging on latencies of single-digit microseconds. 10 gigabit Ethernet with RDMA is also spreading out from the exchanges to directly connected robotraders in the co-location hubs over one-hop switches spanning the data centre. Everything is being time-stamped to mind-numbing nanoseconds, i.e. billionths of a second. Last year the latency target for exchanges was 200 microseconds; this year it is below 100 microseconds, and some think they can get that down to 20 microseconds, perhaps even next year - a 10-fold improvement in something like two years.

Brokers too have squeezed their pre-trade risk checks into 3 microseconds using FPGAs and native exchange protocols. This is over 80 times faster than some FIX DMA offerings and appears to have lifted fill rates substantially, even to 90% in some cases, but such advantages will inevitably be eroded as competitors match the technology.

Meanwhile, more and more complex risk management is proceeding asynchronously out-of-band, but still at wire speeds. Waiting in the wings, of course, are GPU chips and 40 gigabit Ethernet, which can bring terascale technology to crunch complex analytics in real time too. Perhaps that will be the story next year when the pioneers report their progress.

Conventional multicore chips, of course, are fighting a rearguard action, coming close to FPGAs for feed handling on average but tending to fall behind in the microbursts. Yet they offer much smarter and more agile solutions, so some people prefer them.

Another strategy was split co-location, with dynamic allocation of trades and asynchronous secondary messaging between the sites. Designing the control models for these distributed algos is probably more important than people are letting on. Thus, there are real choices for smart infrastructure, adapting the hardware to the trading strategy. Where will it end? The technologists told us we are still accelerating into these ever-decreasing microcosmic circles with no end in sight.

Naturally everything has to be optimized - even the time-sharing algorithms on the chips, the power interrupts or use of Level 2 cache. Reflective memory chips that autonomously copy data around the network are another quick trick to save a few more CPU cycles.

How do we benchmark ourselves against the competition? Here too help is apparently at hand, since some exchanges are now time-stamping all orders when they acknowledge them and then publishing the timestamps of the fills as well. This allows firms to estimate how close they were to the trade. Other firms track drift in these timestamps to infer bottlenecks for smart-order-routing decisions. Another speaker reported using FPGA chips as well, to replay price feeds for realistic backtesting benchmarks, accurate to the nanosecond. For those with ears to hear there were many useful tricks of the trade.
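
To make the arithmetic concrete, here is a minimal sketch in Python of how a firm might turn such exchange acknowledgement and fill timestamps into a latency benchmark and a crude drift check. The field layout and the nanosecond values are invented for illustration, not taken from any venue mentioned at the conference.

    # Minimal sketch: estimating how close we were to the trade from exchange
    # timestamps. Field layout and nanosecond values are illustrative only.

    from statistics import mean

    # (our_send_ns, exchange_ack_ns, exchange_fill_ns) per order, in nanoseconds
    orders = [
        (1_000_000_000, 1_000_083_000, 1_000_091_500),
        (2_000_000_000, 2_000_079_000, 2_000_088_000),
        (3_000_000_000, 3_000_102_000, 3_000_131_000),  # a slower outlier
    ]

    ack_latencies = [ack - sent for sent, ack, _ in orders]   # wire plus gateway time
    fill_gaps     = [fill - ack for _, ack, fill in orders]   # rough queue-position proxy

    print("mean ack latency (ns):", mean(ack_latencies))
    print("mean ack-to-fill (ns):", mean(fill_gaps))

    # Crude drift check: a rising trend in ack latency over the day hints at a
    # bottleneck worth feeding into smart-order-routing decisions.
    drift = ack_latencies[-1] - ack_latencies[0]
    print("ack latency drift (ns):", drift)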

Predictable Surprises Ahead!

There was much more anxiety this year over what regulators might do. Sometimes they make quite benign comments about the efficiencies of high-frequency traders, while at other times they threaten heavy regulation: minimum order resting times, maximum order-to-trade ratios, charging for cancels, imposing market-making obligations on high-frequency access, short selling rules, etc. It is not clear what all this is trying to achieve, since many such initiatives may just restrict competition further and widen spreads, as happened when the regulators last struck over short selling. The investor will pay in the end as these regulatory costs rise, but few regulators appear to worry about that.

The basic idea seems to be that, like everything else in life, there is a need for speed limits. In Europe we have done that to date with volatility interruptions, which have fairly small costs and only kick in for the fat-tail events. The US had a weaker microstructure, made worse by the trade-through rules that failed in the flash crash. Yet now the regulators want to penalise everyone, and some more than most. Indeed the EU too is suggesting that if proprietary firms are direct members of exchanges, they need to be regulated and ensure their systems are robust. There is no evidence that failing prop trader technology caused the flash crash. One might therefore be forgiven for concluding that regulators seem to want innovation, but not if it gives someone an advantage. What else is innovation for?

Someone asked innocently why exchanges themselves are never fined for failing to ensure fair and orderly markets. A timely observation, given LSE Group's glitch the following day. Apparently 'fair and orderly' obligations for exchanges are quite limited, so members need to protect themselves: caveat emptor. Yet if exchanges are not responsible for protecting markets, high-frequency traders somehow become responsible for any indirect loss they might cause. There seem to be some mixed messages here. There was widespread concern that the regulatory focus on high-frequency traders was allowing the markets to tolerate an unacceptably poor level of quality, which clearly creates large externalities and potential fat tails.

Regulators are also mandating more and more risk checking by brokers. Best practice was currently said to include credit and liquidity checks, fat-finger checks, anti-spamming checks, open-order exposure-to-limit ratios, etc., with some allowance for cancelled trades as well. Besides their compliance obligations, brokers are implementing these checks both to protect themselves against loss and to increase business through differential risk pricing.
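
The logic of these checks is simple, even if the microsecond engineering is not. A rough sketch follows, with entirely hypothetical limits, order fields and thresholds; real broker implementations run equivalent logic in FPGAs or other wire-speed infrastructure.

    # Rough sketch of broker pre-trade risk checks (fat finger, credit exposure,
    # anti-spamming). Limits and fields are hypothetical, for illustration only.

    from dataclasses import dataclass

    @dataclass
    class Order:
        symbol: str
        side: str       # "buy" or "sell"
        qty: int
        price: float

    LIMITS = {
        "max_order_value": 5_000_000.0,     # fat-finger ceiling
        "max_open_exposure": 20_000_000.0,  # credit / open-order exposure limit
        "max_orders_per_sec": 1_000,        # anti-spamming message rate
    }

    def pre_trade_check(order, open_exposure, orders_this_second):
        value = order.qty * order.price
        if value > LIMITS["max_order_value"]:
            return False, "fat finger: order value too large"
        if open_exposure + value > LIMITS["max_open_exposure"]:
            return False, "credit: open order exposure limit breached"
        if orders_this_second >= LIMITS["max_orders_per_sec"]:
            return False, "anti-spamming: message rate limit"
        return True, "accepted"

    ok, reason = pre_trade_check(Order("VOD.L", "buy", 10_000, 180.0),
                                 open_exposure=1_000_000.0,
                                 orders_this_second=12)
    print(ok, reason)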

The most 'scary' thing for one market participant was a heavy-handed, knee-jerk response to the flash crash. The consensus seemed to be that the regulatory future truly hangs in the balance. There is a lot of suspicion and not a lot of trust on both sides. Compare the pre-flash-crash world of last year's HIFREQ TRADE conference, which presented a much more benign view of regulation. Once again there were reassurances, but the politics are clearly complex. Some intervention now appears more likely.

Lofty Ambitions

Speakers noted that single-market opportunities for high-frequency strategies are rapidly eroding, as everyone spots the gaps and piles in. Thus traders are looking for more complex plays: multi-asset class, multi-geography, multi-instrument correlations, arbitrage of dual-listed securities, etc. The mind boggles at the sheer audacity of some of the strategies under discussion. Traders are therefore resorting to ever more dynamic adaptive models, ever more frequent recalibration, and ever shorter half-lives before models are replaced. However, since more complex strategies are less likely to be challenged and offer natural barriers to entry, they apparently need much less frequent recalibration and have relatively longer half-lives. In effect they are truly smarter.

In emerging markets, key high-frequency issues are liquidity, the availability of suitable futures or other leveraged hedges, the availability of deep sources of stock loans for short selling strategies, and broker cross-margining capacity. While important, technology is very much secondary to these business issues, although some thought that the availability of a local point-of-presence (PoP) could be helpful for some of the lower-speed players. However, network distances are long and relative value plays between exchanges can be very expensive. So high-frequency traders will often co-locate at both ends with asynchronous signals between them.

In some ways emerging markets are even overtaking their western competitors. Apparently Brazil and Russia, for example, both do exchange-based, real-time pre-trade checks to make sure buyers have the money in the broker's central bank account or sellers have the securities in their custodial accounts, with same- or next-day settlement. This completely changes risk profiles, eliminates naked shorts by implication and presumably slows down the flow modestly as well. Brazil requires separate accounts for beneficial owners, not omnibus accounts, and offers cross-asset order types to facilitate complex spread or contingent trades - all very 21st century features. Not only does this give more transparency, but it creates new high-frequency strategies as well. If the advanced economies lose their innovative edge, there are huge implications for global competition and regulation.

Quants are apparently still making money with growing volumes, but their market sentiments are overwhelmingly neutral, with as many bears as bulls and little change anticipated in the uncertain macroeconomic landscape. One quant commented on how ideal this was for high-frequency strategies. Consistent with these survey findings, most quant traders were expected to hold their investment at current levels, with some increasing it but few reducing it.

The growing demand for smart and diverse infrastructure to support diverse and changing trading strategies was said to reduce infrastructure half-lives and thus encourage outsourcing and huge economies of scale for infrastructure providers. Yet 40-50% of high-frequency traders still apparently 'roll their own'. Are suitable third-party offerings not available, or are these firms failing to recognize the opportunity costs of in-house builds? Speakers highlighted how multi-supplier outsourcing potentially offers a greater choice of feasible strategies.

From this perspective, current exchange consolidation may be positive if it increases global competition, even at the cost of local competition. Regulators will need to tread carefully to balance such imponderables, since the lobby legions are on the march.

Fragmentation was described as a dream come true for high-frequency traders, since it provides them with a sustainable business model: arbitraging the liquidity venues for stable price discovery. Indeed most of the time volatility is now lower. However, when market sentiment sours, volatility can spike dramatically. The 'burstiness' of fast markets was frequently emphasized.

Here cross-margining across venues becomes critical, which is good for those global brokers who have the necessary scale and depth to provide the requisite technology, global access, credit and high rates of innovation. Given these economies of scale and network effects, the big brokers are busy scaling up and out, but few, it appears, can currently reach out to the many emerging markets and cover all the asset classes holistically. Competition will clearly be heating up.

Someone pointed to the growth in exchange-traded funds (ETFs) and the regulator-inspired move from OTC to exchange-traded derivatives (ETDs) as key drivers for future high-frequency growth. Apparently ETFs in the US have risen from 21% of market flows to 35% over the past couple of years, although in Europe the figure was still said to be around 5%. This creates important high-frequency opportunities to arbitrage the indices against the futures contracts and also against the underlying cash baskets, while on-exchange swaps can integrate pricing off global futures benchmarks much more rapidly. These could create a new frontier beyond fragmentation. Since ETFs and ETDs are also potentially helping to increase correlation, this could, however, just lead to crowded trades and more black swans, proving once again that every silver lining has a dark cloud looming in the background!
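
As a toy illustration of the index-to-futures leg of such arbitrage, the sketch below compares a quoted futures price against a cost-of-carry fair value and signals a basis trade when the gap clears assumed costs. All prices, rates and thresholds are invented; a real desk would also price the ETF and the underlying cash basket.

    # Toy sketch of index futures vs cash basis arbitrage. Every input is
    # invented and the cost model is deliberately crude.

    import math

    def futures_fair_value(spot, r, div_yield, t_years):
        # Cost-of-carry fair value for an index future.
        return spot * math.exp((r - div_yield) * t_years)

    spot          = 5_980.0    # cash index level
    futures_quote = 6_004.0    # quoted futures price
    fair          = futures_fair_value(spot, r=0.015, div_yield=0.030, t_years=0.25)

    basis = futures_quote - fair
    costs = 3.0                # assumed round-trip costs, in index points

    if basis > costs:
        print(f"basis {basis:.1f} > costs: sell futures, buy the cash/ETF basket")
    elif basis < -costs:
        print(f"basis {basis:.1f} < -costs: buy futures, sell the cash/ETF basket")
    else:
        print(f"basis {basis:.1f}: inside costs, no trade")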

This drive for diversity is focusing attention on algorithms and models, with relatively less emphasis on technology. Here 3 key drivers of quantitative strategies were highlighted:

  • Commoditisation of technology, driving the search for new data to support trading strategy decisions;
  • Increasing reliance on behavioural finance methods, promoting more adaptive high-frequency agility; and
  • Increasing diversification, encouraging research into understanding the investor.

At the same time these trends were also enabling low-frequency traders to capture much more alpha. So interest is shifting to the trade-offs between research, technology and the trade-risk balance, with a rich harvest of new niches that quantitative traders can exploit. In response, exchanges are looking to provide new data streams, such as latency data, but also new microstructures. For example, one global exchange was said to be considering price-size rules for the matching engine as an alternative to price-time rules. This might potentially shift the emphasis from speed to smart models and compute intensity. Others were said to be considering different commercial incentives for liquidity providers, order types to exploit penny ticks, a new mix of lit and dark order types, etc. Regulators too may accelerate such microstructure evolution, if for example they require some price improvement in dark pools.
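
To see why a price-size rule would shift the emphasis from queue position to displayed size and models, compare a simple sketch of the two allocation schemes at a single price level. The resting orders and incoming quantity are made up, and price-size priority can be implemented in several ways; the pro-rata variant below is just one common interpretation.

    # Sketch comparing price-time (FIFO) allocation with a pro-rata, size-based
    # allocation at one price level. Orders and quantities are illustrative.

    resting = [("A", 100), ("B", 500), ("C", 400)]   # (firm, qty), in time order
    incoming = 600                                    # aggressive quantity to fill

    def price_time(book, qty):
        fills, remaining = {}, qty
        for firm, size in book:                      # earliest resting order first
            take = min(size, remaining)
            fills[firm] = take
            remaining -= take
            if remaining == 0:
                break
        return fills

    def pro_rata(book, qty):
        total = sum(size for _, size in book)
        # Allocate in proportion to displayed size; rounding residue is ignored here.
        return {firm: qty * size // total for firm, size in book}

    print("price-time:", price_time(resting, incoming))   # queue position (speed) wins
    print("pro-rata:  ", pro_rata(resting, incoming))     # displayed size wins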

The biggest challenges for quantitative traders are apparently not capacity limits, but rather the need to limit transparency to protect their intellectual property while still effectively communicating their strategies to increasingly inquisitive investors. There is less trust in the black box, and more demand for transparency and due diligence. Squaring this particular circle will not be easy.

Current market challenges are forcing firms to focus more on downside risks and controlling the unknown unknowns by leveraging insights from extreme value theory, which lets you model things that are interdependent and therefore co-evolve. Quants are realizing that with so many things changing at the same time, it is difficult for any model to cope with uncertainty.
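
For those wondering what 'leveraging extreme value theory' can mean in practice, one common recipe - sketched below on simulated fat-tailed data using SciPy's generalized Pareto distribution - is to fit only the tail of the loss distribution beyond a high threshold and read extreme quantiles off that fitted tail rather than off a normal curve. This is a generic peaks-over-threshold illustration, not any particular firm's model.

    # Peaks-over-threshold sketch: fit a generalized Pareto distribution to
    # losses beyond a high threshold. Data are simulated, purely illustrative.

    import numpy as np
    from scipy.stats import genpareto, t as student_t

    # Fat-tailed simulated P&L; losses are taken as positive numbers.
    losses = -student_t.rvs(df=3, size=10_000, random_state=42)

    threshold = np.quantile(losses, 0.95)             # keep only the worst 5%
    exceedances = losses[losses > threshold] - threshold

    # Fit the GPD to the exceedances, with location fixed at zero.
    shape, loc, scale = genpareto.fit(exceedances, floc=0)

    # Tail quantile: the 99.9% loss level implied by the fitted tail.
    p_exceed = (losses > threshold).mean()
    q = 0.999
    tail_var = threshold + genpareto.ppf(1 - (1 - q) / p_exceed, shape, loc=0, scale=scale)
    print(f"shape={shape:.2f}, scale={scale:.2f}, 99.9% loss estimate={tail_var:.2f}")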

One fascinating speaker argued that long-only firms often create profit opportunities for high-frequency traders because of poor communications between portfolio managers and their own traders. He challenged us to think of order books, at any time, as essentially a long-short portfolio strategy: you are long everything you want to sell and short everything you want to buy until the trades are done. If the trader does not look at the basket with these long-short dynamics, he leaves money on the table for the high-frequency traders. The solution apparently is to improve internal communications by focusing on the following: alpha decay rates, risk models for security returns, agency trading costs, likely market impact both temporary and permanent, and investor risk aversion. While admitting that such parameters are difficult to estimate, the speaker suggested that news analytics might provide some helpful proxies and alternatives to GARCH-based models. He stressed that if variance itself is variable, then it becomes very difficult to estimate. We were left wondering how many quantitative models may be open to such uncontrolled errors. Dealing with this fundamental uncertainty perhaps becomes the next great quest for quantitative analysts.
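
None of the speaker's actual models were disclosed, but the flavour of the alpha-decay versus market-impact trade-off can be sketched with a deliberately crude cost model: trade too slowly and the alpha decays away, trade too fast and impact costs eat it. Every parameter below is invented for illustration.

    # Crude illustration of the alpha-decay vs market-impact trade-off for an
    # order worked over n slices. All parameters are invented.

    def expected_cost_bps(n_slices, alpha_bps=20.0, half_life_slices=10.0,
                          impact_coeff_bps=8.0):
        # Alpha lost to decay: the longer we take, the more of the signal is gone.
        avg_delay = (n_slices + 1) / 2.0
        alpha_lost = alpha_bps * (1 - 0.5 ** (avg_delay / half_life_slices))
        # Temporary impact: trading faster (fewer slices) costs more per slice.
        impact = impact_coeff_bps / (n_slices ** 0.5)
        return alpha_lost + impact

    best = min(range(1, 51), key=expected_cost_bps)
    print("cheapest schedule:", best, "slices,",
          f"{expected_cost_bps(best):.1f} bps expected cost")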

There was one further thoughtful insight: volatility is always the same; it is only the pace of change, i.e. time, that varies. So we need to focus more on the rate of change of volatility, not just volatility itself. Indeed, while returns are definitely not normally distributed, they can apparently be closely modeled with varying volatility curves, all of which are normal. Yet since volatility is often a lagging indicator, we need to focus more on the strains and pressures that are building up, to find leading indicators of the pace of change.
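
One simple way to look at the 'rate of change of volatility' is sketched below on simulated returns: draw returns that are normal at every instant but whose volatility drifts, then compute rolling realized volatility and the variability of that series itself. The window length and the volatility path are arbitrary choices.

    # Sketch: rolling realized volatility and its rate of change on simulated
    # returns whose true volatility drifts over time. Purely illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    true_vol = 0.01 * (1 + 0.5 * np.sin(np.linspace(0, 6, 2_000)))  # slowly varying vol
    returns = rng.normal(0.0, true_vol)                             # normal at every instant

    window = 50
    realized_vol = np.array([returns[i - window:i].std()
                             for i in range(window, len(returns))])

    vol_change = np.diff(realized_vol)                              # pace of change of vol
    print("mean realized vol: %.4f" % realized_vol.mean())
    print("vol-of-vol (std of changes): %.5f" % vol_change.std())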

HIFREQ TRADE 2011 provided a wealth of insight, once again confirming Alvin Toffler's 1970s observation that knowledge is the fuel that accelerates the pace of change. For those who attended, may the G-force be with you!
