2011 Algorithmic Trading Survey - report and analysis. By Bob Giffords
These were the key conclusions to emerge from Automated Trader's 2011 survey of algorithmic trading practice. The survey attracted more than 500 responses from market participants, mainly in North America and Europe, but also included over 100 from Asia Pacific. A further 3.5% of responses came from elsewhere, including the Middle East, Latin America and offshore financial centres. 36.58% of responses were from buy side firms and 20.62% from the sell side, with the rest mainly from infrastructure or technology companies. A few academics, regulators and central banks contributed as well. Once again, survey participants were self-selecting and mostly from the front office; however, the responses confirm a broad cross section of interests and perspectives.
In which geographical region are you resident?
Of the buy side, a substantial minority described themselves as proprietors or partners of their firms, which adds weight to the data provided. Over 85% of buy side contributors came from smaller firms managing under $1 billion in assets. The largest buy side firms managing over $10 billion in assets were perhaps under-represented (5.9%) in the sample. Sell side responses, however, were mostly from larger firms, with over 64% claiming a balance sheet in excess of $500 million and nearly 30% with total assets in excess of $500 billion. Indeed, as we shall see, the larger buy side firms tend to rely on their brokers to protect them from the speed demons, so we can control, where necessary, for different strategies and alpha horizons.
How would you describe your firm's main business?
While last year's survey focused very much on high frequency trading, this year's attracted a broader range of interest. A few questions were sponsored by the London Stock Exchange and by hosting provider Interxion; these are identified separately.
Which of the following descriptions best reflect the current level of trading automation in your firm?
What percentage of machine generated trading signals result in executable orders being submitted to one or more trading venues without any human supervision?
Perhaps the most striking feature of the survey was the breadth of automation. From opportunity spotting and pre-trade checks to execution, post-trade and asset servicing; from risk management and hedging to broker evaluation, model optimisation and even creation, the survey confirmed an increasingly comprehensive array of automation throughout the entire trade lifecycle.
Nearly 60% of all buy and sell side firms described themselves as highly automated trading shops with work flowing straight through from decision models to trade execution. Unexpectedly perhaps, the highest levels of buy side automation were reported in Asia Pacific. On the sell side, traders in the UK claimed the highest levels.
Moreover, the UK buy side firms confirmed that over two-thirds of their alpha signals were now machine generated, and rather more appeared to flow straight through to execution than in the US. Is the US innovation engine therefore beginning to falter?
With regard to speed, some sell side firms in all jurisdictions showed enterprise level volumes over 5000 orders per second. For buy side firms, only a few traders in North America or continental Europe achieved these levels. However, around a quarter of buy side firms in the UK and North America were doing a more than respectable 50 orders per second with somewhat more in Asia Pacific and rather less in continental Europe. Markets are clearly scaling up.
Considering all trading taken together within your firm (or your department in a large firm) across all algorithms, how many orders per second do your trading engines issue?
Asset holding times were another revealing characteristic of the trading firms that chose to participate in the study. Where holding times were controlled by the model, around 60% of buy side firms claimed to be day traders, who normally squared their book at the end of each day. Indeed, in Asia Pacific around two-thirds of responses came from these day traders.
Thinking about your most common trading strategy how latency dependent is it?
In response to one of the three questions sponsored by Interxion, it was therefore not surprising to find that nearly two-thirds of buy side firms and nearly 80% of sell side firms declared their trades latency sensitive, although only 14% of both categories claimed to be entirely latency dependent. Sell side firms were keen to be amongst the quickest but not necessarily the fastest, while more of the buy side recognised the importance of latency but were prepared to leave money on the table to save costs. If we control for holding times, we find that buy side interest in latency drops considerably as the alpha horizon moves to the longer term, with the exception, of course, of execution algorithms. Here latency continues to be critical for everyone. In Asia Pacific, latency sensitivity was highest for buy side and sell side alike. This probably reflects the long distances involved, as well as shorter alpha horizons.
Moreover, when asked why latency was not important, a majority of the buy side cited a longer term alpha horizon, while around a third suggested either that their markets lacked high frequency flow or that the costs were too high for the additional returns. Almost 15% regarded latency in the execution process as their broker's problem, not theirs.
Given this context, the challenges faced by traders are quite revealing. For the buy side, the biggest challenge by far (62%) was finding alpha opportunities in increasingly competitive markets. Over one third of both buy and sell side firms mentioned managing real time risks, and more than a quarter of buy side firms cited capital capacity as a challenge, although, most interestingly, this was a problem for only 8.5% of sell side respondents. Having to re-engineer models to keep them profitable was named as a challenge by over one third of buy side respondents, but again this was less of an issue for the sell side, being chosen by less than one fifth. Around a quarter also mentioned coping with speed and high volume data feeds, and one fifth chose dealing with errors and equipment failures.
What are the top three business challenges facing your firm with regard to algorithmic and systematic trading?
Alpha capture was much more of a problem for the buy side in North America and Europe, while in Asia it was mentioned far less (44%). There they were focusing more on speed and volume challenges as well as real time risks and the update cycle. In continental Europe, re-engineering their models was actually mentioned by half of buy side firms. Clearly everyone is feeling the pressures of competition, intensified by the 'high tech' nature of the markets and no doubt the financial crisis.
So how are traders coping with these challenges? If speed is the great leveller, firms are trying hard to be smarter as well as faster. So they are turning increasingly to multi-asset and cross time zone trades.
If we look at execution algorithms currently used by buy side traders, trading in equity index futures and options has overtaken cash equities as the main focus, at least for the survey sample (63% of buy side firms mentioned futures as opposed to 50% trading cash). Next came commodities (44%), followed by FX futures and spot or forward (35% and 33% respectively).
Which asset classes does your firm (or your department in a large firm) currently trade using execution algorithms?
Over the next couple of years the big growth areas look to be FX options, both exchange traded and over-the-counter (up 90% and 143% respectively from their current low base). Fixed income derivatives and cash markets are also motoring on with respectable growth figures (52% and 40% respectively).
From a sell side perspective, the offering is still focused primarily on cash equities (78% of firms offer stock execution algorithms), with equity derivatives of all types coming second (42% to 47%), followed by spot or forward FX (31%), and then commodity and energy derivatives (25%). However, over the next couple of years the sell side too is focusing mainly on FX options, with huge growth forecast, while for fixed income they appear to be thinking more about cash than derivatives. Perhaps because of the launch of several new trading venues on the cash side, sell side competition there could increase by 90%, compared with 66% for fixed income derivatives.
Which asset classes does your firm (or your department in a large firm) currently trade using systematic trading algorithms or fully automated portfolio investment/ asset allocation algorithms?
For systematic trading algorithms on the buy side, the current asset class focus is very much on stock index futures and options along with commodity and energy derivatives. However, strong growth is forecast across the board, especially single stock futures and options, FX options and especially fixed income cash with the number of competing firms doubling or in some cases nearly tripling.
Meanwhile, the sell side systematic traders, which, of course, include market makers, are ramping up their algorithmic offering across the board as well. Competition in some sectors will double, triple or even quadruple as in the case of some FX derivatives as the algorithm portfolio spreads to cover pretty much all asset classes. Once again the fixed income markets are looking particularly healthy. In a couple of years global electronic robotrading will simply be all pervasive.
The Automated Trader Survey similarly confirmed growing global horizons. 90% of buy and sell side traders are already trading multiple markets, with a majority trading more than five. In the UK a third of buy side firms and over half of sell side firms are already trading in over 20 markets. Indeed, over a fifth of UK sell side firms were covering over 100 markets and liquidity venues. In Asia Pacific the geographical scope is, however, typically much narrower.
Target markets are also fairly predictable, with a general focus on North America and Europe and somewhat less in Asia. Over the next couple of years, however, buy side interest in emerging markets is exploding. Interest in Latin America is due to triple, followed by Eastern Europe including Russia, which should more than double, then Asian emerging markets (84% growth) and finally the Middle East and Africa (50% growth).
On the sell side the growth is rather more subdued and focused mainly on Latin America and Asia, with fairly limited growth in the Middle East and Africa. Indeed, it appears that sell side firms tend to be much more localised than their buy side clients in terms of offering, even though a few middle market brokers are partnering aggressively to provide global cover. Perhaps smaller sell side firms with specialist local knowledge should be focusing on global marketing to attract the new global demand.
The Automated Trader survey also confirmed that most buy side firms use their own algorithms, while 47% rely on broker algorithms. Access was typically over broker DMA services, while a minority used either sponsored access or direct exchange memberships.
In which of the following geographical markets do you currently trade using execution or systematic algorithms?
Of course as traders expand their sights, the pressure for colocation becomes intense and so Automated Trader explored this development in some depth.
While a majority of buy side firms still do all their trading from one central location, 47% deploy distributed trading engines and this number is expected to grow to nearly three quarters within three years. While on the sell side, distribution is more common (already over 85%), the vast majority of investment firms currently only use between one and four colocation centres. However, a few firms are already managing over 10 data centres.
Most firms operate their colo trading engines independently of each other, although a growing number of firms, especially in North America, on both the buy and sell side, are capable of dynamic placement of orders and close collaboration between distributed servers.
When asked why they were using colocation facilities, most buy side firms justified their decision based on latency to one or more market centres. A third of firms wanted to be close to their data providers rather than the markets themselves, while a quarter of firms were looking for overall cost savings related to market connectivity. A small minority (11%) were looking for agility and easily accessible e-services, while others wanted to be near their brokers.
What are the top 3 reasons for your firm using or planning to use co-location or proximity data centres?
On the sell side, the main colo centres today are in New York, London, Chicago and Frankfurt in that order. Over the next two to three years, the biggest growth is foreseen in Singapore, Sydney, Hong Kong and Madrid. For the buy side the current favoured locations are Chicago, New York, London, Frankfurt, Tokyo and Singapore in that order, while the biggest growth is foreseen in Madrid, Sydney, Hong Kong, Stockholm and Singapore. São Paulo and Moscow were also frequently mentioned for colo expansion.
In which cities do you currently make use of co-location and/or proximity hosting?
Regarding colocation, up to now nearly half of buy side firms have chosen to partner with securities exchanges, while around a quarter each have partnered with data vendors or third party hosting providers either with their own network services or in a network neutral format. As the number of destinations grows and as cost and latency pressures increase, the network neutral model may well attract more market participants because of the competition it offers. Similar forces appear to be at work on the sell side.
ASP service centres hosting order or execution management systems are another potential colocation destination for the buy side (18%), while some firms are even exploring the world of cloud computing (13%).
As market participants struggle to capture alpha, we find clear evidence that they are using a wider variety of data sources to structure their trades. For example, investment firms are looking not only at pricing data for the particular instrument to be traded, but also at other instruments in the same asset class and other asset classes as well, along with benchmark indices of various kinds. The sell side in every region consults a wider range of market pricing data than the buy side, and UK and North American firms use the highest levels of cross instrument and cross asset information compared to other regions.
Besides pricing data, post trade data, corporate actions and fundamental data of various kinds are also quite popular with systematic traders. Even weather or geo-sensory data has a small but dedicated fan club. News feeds, too, represent rich but difficult sources of alpha or risk signalling. However, they are apparently being actively used only by a small minority of participants and mostly to date on the buy side.
If systematic trading engines read the news in one form or another, it appears they are much more likely to be latency sensitive. 85% of sell side and 78% of buy side firms that read the news are also latency sensitive. Curiously, however, if firms are entirely latency dependent, they are probably being driven by market prices, not news. Other interesting correlations also emerged. If an automated trader reads analyst opinion it will probably read other kinds of mechanised news, although the converse does not hold. However, the reading of news sentiment in particular appears to be a good predictor of the concomitant use of analyst opinion.
On the other hand, latency, liquidity and other technical or performance data feeds are all quite popular with systematic traders. In general the sell side tends to monitor latency more than the buy side, although highly latency sensitive strategies will inevitably monitor latency.
Moreover, a parallel range of data types was used for driving execution algorithms. While fundamental macro data releases are widely subscribed to, news in particular is being widely explored but still not heavily used. More technical metadata metrics, such as latency data, liquidity data, volatility data and spread or correlation data, are, however, widely used in execution algorithms, particularly on the sell side.
Regarding machine readable news, only around 10% of buy and sell side firms have had any real success, and many challenges were reported, such as the difficulty of natural language processing or the unreliability of semantic correlations. News appears to be most reliable for signalling risk or warning of impending volatility or sudden volume surges. It appears to be more difficult to signal a directional price change or changes in relative value. Nevertheless, more than half of buy and sell side firms are either currently using or evaluating the technology or are planning to do so. The remainder have no plans or interest in using it at the moment.
One survey question on the number of run time variables in systematic algorithms further confirmed this mushrooming data environment. While most algorithms were managing a few dozen variables, some were managing hundreds or even thousands of variables. Indeed the buy side algorithms appeared to be more complex than those of the sell side.
Meanwhile the technology arms race continues, with business logic being embedded in hardware to achieve ever lower latencies and cope with ever increasing volumes as we move into billion tick days. The buy side appears to be leaning more towards GPU, especially in double precision rather than single precision formats, while the sell side is much more deeply committed to FPGA. Indeed, with a forecast 40% sell side penetration within three years, and over one quarter across all responses, we would have to say that FPGA is at the tipping point where it enters mainstream financial technology. GPUs appear to be not far behind. This supports anecdotal evidence that FPGA is being used for feed handlers, pre-trade risk checks and even core trading applications, moving into the nanosecond range with almost no jitter.
Which of the following hardware technologies do you currently use in live production for electronic trading or market data processing?
Infiniband is also gaining traction on the sell side, while the buy side appears to prefer 10 gigabit Ethernet to avoid data saturation. The OS bypass capabilities of both protocols appear to be very attractive. Meanwhile the use of the specialist features of multicore chips continues to grow as software languages begin to address these issues. The evidence regarding other specialist silicon or reflective memory chips is more ambiguous.
Engineering of Systematic Trading
Automated Trader also explored various engineering aspects of systematic and execution algorithms. The survey confirmed, for example, that buy side firms in North America and Europe (including the UK) have typically been developing systematic trading algorithms for the past 15 years. On the sell side, experience appears to go back only around 10 years. Asia Pacific, on the other hand, has turned its hand to algorithmic trading only in the last couple of years.
In terms of accumulated investment, while some buy side firms in North America or the UK may have 100 man years or more invested, most firms will have invested less than 20 man years, and many will have less than five man years of intellectual capital, which must expose them to many competitive threats. On the sell side, up to 40% of UK firms and 25% of North American firms claim from 30 to 100 man years or more of accrued investment. The investment is much higher, but so is the burden of legacy systems.
The mood, however, is bullish with nearly all investment firms predicting growth for their systematic trading investment and the vast majority expecting that to exceed 25% in the next couple of years. However, on the sell side in North America, the forecasts are a bit more subdued. The main expectation there is for growth between 11% and 25%.
Measured in terms of the numbers of parameters, the complexity of systematic algorithms is remarkably similar for both buy and sell sides. The weighted mean expected number of parameters would be around 70-72, but the maximum was reported to be over 1000. There are, however, significant regional differences. In North America the sell side algorithms are rather simpler with on average only around 25 parameters. In the UK the situation is reversed: the buy side algorithms are simpler with only around 35 parameters predicted, while the sell side have over 170 parameters.
Systematic algorithms may be recalibrated with a frequency ranging from never to daily. Rather more buy side firms do it quarterly or when performance erosion requires it. On the sell side recalibration tends to be done somewhere between monthly and daily with another large group recalibrating on demand. Best practice appears to vary enormously.
The recalibration process typically involves extensive back-testing, sometimes using genetic algorithms or other adaptive learning processes to adjust the large number of parameters that may be rather inscrutable to a human quant designer. Rather more firms try to adjust the parameters manually before back-testing and benchmarking the configuration. A small minority of participants don't back test at all. Presumably they take great care when making adjustments in the live production environment as reported anecdotally in separate research.
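To make the recalibration loop concrete, here is a minimal sketch in Python. The moving-average crossover model, the `backtest_pnl` objective and the grid values are purely illustrative assumptions of ours, standing in for whatever proprietary model and objective a desk would actually optimise; real recalibration would use far richer data and, as the survey notes, sometimes genetic or other adaptive search rather than an exhaustive grid.

```python
from itertools import product

def backtest_pnl(params, prices):
    """Toy moving-average crossover back-test; returns total P&L.
    A stand-in objective: real desks would back-test their own model."""
    fast, slow = params
    if fast >= slow:           # invalid configuration, score it out
        return float("-inf")
    pnl, position = 0.0, 0
    for i in range(slow, len(prices)):
        fast_ma = sum(prices[i - fast:i]) / fast
        slow_ma = sum(prices[i - slow:i]) / slow
        # mark yesterday's position to market, then update the signal
        pnl += position * (prices[i] - prices[i - 1])
        position = 1 if fast_ma > slow_ma else -1
    return pnl

def recalibrate(prices, fast_grid, slow_grid):
    """Exhaustive search over the parameter grid; returns best params."""
    return max(product(fast_grid, slow_grid),
               key=lambda p: backtest_pnl(p, prices))
```

With hundreds of parameters rather than two, this grid explodes combinatorially, which is exactly why firms turn to genetic algorithms or other adaptive learning to steer the search.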
The frequency of recalibration appears to be slowly increasing for all market participants, although around half of all sell and buy side firms claim it has not changed. While most systematic trading algorithms may last for months or years, a few have very short life cycles and may be withdrawn even intraday.
The design process itself appears to be increasingly mathematical, but many different policies are used to ensure success. In around 30% of cases overall, the generation of new trading ideas is partially or fully automated. North America appears still to be leading in the application of artificial intelligence to trading problems. For benchmarking performance, both buy and sell sides tend to use maximum drawdown limits, Sharpe ratios and net overall return, sometimes risk weighted.
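The two risk-adjusted benchmarks named above are straightforward to compute from a return series and an equity curve. A minimal sketch in Python (function names and annualisation convention are our own assumptions, not anything prescribed by the survey):

```python
import math

def sharpe_ratio(returns, risk_free=0.0, periods_per_year=252):
    """Annualised Sharpe ratio of a series of periodic returns."""
    excess = [r - risk_free for r in returns]
    mean = sum(excess) / len(excess)
    var = sum((r - mean) ** 2 for r in excess) / (len(excess) - 1)
    std = math.sqrt(var)
    return (mean / std) * math.sqrt(periods_per_year) if std > 0 else 0.0

def max_drawdown(equity_curve):
    """Largest peak-to-trough decline, as a fraction of the peak."""
    peak, worst = equity_curve[0], 0.0
    for value in equity_curve:
        peak = max(peak, value)             # running high-water mark
        worst = max(worst, (peak - value) / peak)
    return worst
```

For example, an equity curve of 100, 110, 105, 120, 90, 95 has a maximum drawdown of 25% (the fall from 120 to 90). A firm enforcing a drawdown limit would halt or de-risk the algorithm once this figure breaches its threshold.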
Execution algos appear in general to be less complex than systematic trading models. Typical parameter counts for both the buy and sell side are reported to be under 20, although a few firms claim algos with hundreds or, in a couple of cases, over 1,000 parameters. However, while each algorithm may be simpler, traders face the bigger challenge of deciding which algorithm to use.
While many traders may only use a fairly small number of execution algorithms, a couple have access to over 500. Managing such a huge library of software, constantly being optimized or replaced, is non-trivial. In the UK the buy side may only use 25 algorithms, but the sell side once again reported hundreds. In North America and Europe both buy and sell side deploy on average fewer algorithms (less than 100), while in Asia Pacific a few firms apparently have access to a very large basket of execution strategies.
Recalibration of these execution algorithms tends to be on a weekly or daily basis for the sell side, and on a monthly or even quarterly basis for the buy side. Once again the frequency of recalibration appears to be speeding up for a reasonable minority of players. As well as the recalibration of parameters, the software itself is updated every few weeks to months. Given the number of algorithms being managed, that is a sizeable workload, especially with the current levels of market uncertainty and volatility.
Moreover, the size and complexity of the algorithms once again demand automated tuning methods, which the survey confirmed are in use. In order of preference, performance is usually measured against the following benchmarks: arrival price, VWAP, comparison to pre-trade forecasts, and implementation shortfall. Asia Pacific appears to do fewer pre-trade checks but tends to use more third party TCA benchmarks.
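Two of these benchmarks can be sketched in a few lines of Python. This is an illustrative simplification under our own assumptions (the function names are ours, and we use the per-share Perold-style shortfall convention; real TCA would also account for fees, opportunity cost of unfilled quantity and market drift):

```python
def vwap(trades):
    """Volume-weighted average price of a list of (price, size) prints."""
    notional = sum(price * size for price, size in trades)
    volume = sum(size for _, size in trades)
    return notional / volume

def implementation_shortfall(decision_price, fills, side=1):
    """Average fill price versus the price when the decision was taken,
    per share. side = +1 for a buy, -1 for a sell; fills are (price, size)."""
    avg_fill = vwap(fills)
    return side * (avg_fill - decision_price)
```

So a buy decided at 100.00 and filled as 300 shares at 100.10 plus 200 shares at 100.30 averages 100.18, a shortfall of 18 basis points per 100 of price; comparing the same average fill against the interval VWAP instead gives the VWAP benchmark.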
The Flash Crash and the Regulatory Reverberations
The 2011 survey closed with a brief review of the flash crash, together with the various initiatives under consideration by regulators and policy makers.
Firms were asked what they considered to be the main causes of last year's May 6 crash. Around a third of buy side firms felt it was just a blip, to be seen simply as a natural market event. Many more focused either on the inadequacy of market organisers or on the regulatory framework itself. After these three main explanations, support for the various causes was thinly distributed: technical market data failures, market makers posting stub quotes, large intermediaries forwarding stop loss orders into a crashing market, high frequency traders pulling their liquidity, and so on.
Whilst the regulators tended to blame the market participants, the participants themselves tended to blame the regulators for not anticipating the risks and the trading venues for having inadequate measures in place to ensure an orderly market.
Which of the following changes would you like to see implemented by exchanges?
After the crash nearly two thirds of buy and sell side firms carried on trading without major changes to their infrastructure or review process. Many actually enjoyed increased volumes and returns.
A small number re-engineered their approach to trading (6 to 9%), while a further contingent of around 25% slowed down their trading for a short while until they regained confidence. This was a much more muted response than the whirlwind rule-making adopted by the regulators. Asked for their opinions on the various regulatory proposals, survey participants revealed some very interesting attitudes:
The London Stock Exchange Group sponsored a number of questions related to regulation and market structure. In response to their question on liquidity provision, just under half were indifferent or negative on the proposals to impose mandatory liquidity obligations on market makers; a little over 30% unconditionally welcomed the suggestion. Only one fifth felt the initiative may improve liquidity during times of market stress.
Asked to determine the benefit to the wider market, around half of all respondents felt that trading venues should be required to offer members a choice of clearing houses, with the sell side having the strongest opinion on the matter at 55%. Similarly, just under half indicated a desire for clearing houses to interoperate and allow margin offset against positions booked with other clearing houses. Again though, opinion in favour of the measure was strongest amongst the sell side at nearly 60%. Perhaps surprisingly, less than 30% indicated that clearing houses should be required to accept trades from execution venues on a non-discriminatory basis.
Around 55% of all respondents agreed that index products such as the S&P 500 or EuroStoxx50 should be available on competing exchanges. When asked to choose which changes they would like to see implemented by exchanges, 50% of sell side firms stated a desire for real time latency reporting/diagnosis; over 43% of sell side firms (yet only 25.8% of buy side) wanted to see average round trip latencies reduced, whilst 40% of sell side called for more consistent latency.
Should there be a minimum resting period for orders?
Opinion was clearly stacked against proposals for trading "speed limits", with almost 60% believing the concept to be either unworkable or likely to result in a shift of spend from latency to other performance differentiators. However, asked about the likely effectiveness of a minimum resting time for orders, opinion was far less polarised, with 42% of buy and sell side firms clearly against, and up to one quarter of firms stating that they did not have a strong opinion either way. Of the roughly one third of participants in favour of minimum resting times, the overwhelming majority (70%) felt the limit should be quite short so that only the gamers are disadvantaged.
If the regulatory burden in your jurisdiction were to increase further do you believe your firm would strongly consider relocation to a lighter touch regulatory environment within the next two to three years?
On the SEC's rule 15c3-5 concerning market access and pre-trade risk, around one half of participants welcomed the proposals, although some wondered to what extent the requirements would be effective, with 40% of the sell side and one third of the buy side agreeing with the statement: "The changes are welcome although it remains to be seen whether they will be fully effective".
Regarding Dodd Frank, only 21% were positive on the legislation, with nearly 80% stating that the Act goes too far, fails to address the key issues, or is simply unnecessary. Regarding the proposed financial transaction tax, less than 5% welcomed it unconditionally. 13% thought it might work if imposed universally across all asset classes, but the overwhelming majority were very clear indeed, with over 80% feeling that the measure would damage liquidity or cause capital outflows wherever the tax is introduced. Asked about high frequency trading (HFT), despite only 21% of respondents describing themselves as HFT firms, the majority of survey participants clearly approved of HFT participation in the markets, with three quarters being positive or indifferent on the presence of HFT firms and less than a quarter expressing any negative views at all.
Responding to the question on whether their firms would consider relocation to a "lighter touch regulatory environment", around 38% either rejected the idea unequivocally or considered it unlikely, whilst the remaining 62% indicated that relocation to a more welcoming and less onerous regulatory jurisdiction was a possible option, likely or even definite. However, isolating the buy side responses reveals a much greater potential for mobility, with over 70% apparently prepared to consider relocation within the next two to three years, and 19% stating that relocation is likely.
Accelerating pace of change
Wherever we look there is change: new market centres, new financial products, new technology, new strategies for automation, new regulations and accounting policies. Add to this new data sources, new colocation and e-services partners, new quantitative control methods, and new market participants. The list goes on. With algorithms now closely monitoring and responding to each other in every market on the planet in real time measured in microseconds along with new, competing index products acting as benchmarks for more idiosyncratic investment vehicles, we're beginning to see levels of correlation and volatility that were inconceivable only a few years ago. Learning curves continue to ramp up as institutional change follows.
The 2011 survey data clearly demonstrates that the trading environment has changed fundamentally and forever. Extreme connectivity will soon have spread to all markets and asset classes. This will in turn generate greater data diversity and volumes than most will be able to handle. Complex models will be interacting inevitably with intense bursts of feedback and waves of machine 'sentiment' that will be very difficult for mere humans to understand. The key challenge now is to work out how we are going to adapt to it all.
A full 80+ page report containing detailed statistics and further analysis will be published shortly. Visit www.fa5t.net/RL to order your copy.