Market data considerations in a new regime

Published in Automated Trader Magazine Issue 39 Q2 2016

There is an ever-increasing need for market participants to improve returns while at the same time managing risk. Yet balancing these opposing forces is fraught with challenges.

If benchmark indices are any measure, markets continue to both whipsaw and flatline. Last year the S&P 500 dropped for the first time since 2008, while the VIX remains stuck far below its long-term average. Volatility is friend and foe alike to the professional investor and the quant trader - too little is akin to a clipper ship becalmed on a windless day, while sudden peaks can exceed the limits of the vessel's (or in this case the algorithm's) stability.

Adding to the stress are new obligations for algorithmically driven traders. One is the CFTC's proposed Regulation Automated Trading (Regulation AT) which, among its many stipulations, includes a provision for the testing of algorithms used in trading:

"… regular backtesting of Algorithmic Trading using historical transaction, order, and message data to identify circumstances that may contribute to future Algorithmic Trading Events; regular stress tests of Algorithmic Trading systems to verify their ability to operate in the manner intended under a variety of market conditions;"

An Algorithmic Trading Event in this context specifically means an operational breakdown or other market-shocking disruption. Clearly, the goal is to prevent rogue algorithms from spreading havoc (for additional details on Regulation AT, see page 43 of this issue).
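In code terms, the provision pairs two exercises: replaying historical data through an algorithm, and re-running it under deliberately harsher conditions. The sketch below is a minimal, hypothetical Python illustration - the replay, stress_prices and toy_strategy names are our own, not anything prescribed by Regulation AT - of a backtest followed by a stress test in which every tick-to-tick return is scaled up to mimic a higher-volatility regime.

```python
def replay(prices, strategy):
    """Backtest: feed a historical price series to a strategy tick by tick
    and collect the orders it would have generated."""
    orders = []
    for price in prices:
        orders.extend(strategy(price))
    return orders

def stress_prices(prices, vol_multiplier):
    """Build a stressed series by scaling every tick-to-tick return,
    approximating the same history under higher volatility."""
    if not prices:
        return []
    stressed = [prices[0]]
    for prev, cur in zip(prices, prices[1:]):
        ret = (cur - prev) / prev
        stressed.append(stressed[-1] * (1.0 + ret * vol_multiplier))
    return stressed

def toy_strategy(price, threshold=100.0):
    """Placeholder logic: buy one lot whenever the price dips below a threshold."""
    return [("BUY", 1, price)] if price < threshold else []

history = [101.0, 100.5, 99.8, 100.2, 99.1]
baseline = replay(history, toy_strategy)                     # ordinary backtest
shocked = replay(stress_prices(history, 3.0), toy_strategy)  # same logic, 3x volatility
```

A real harness would of course replay full transaction, order and message data across many scenarios; the point is simply that both loops exercise the same strategy code.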

These examples are perhaps overwrought metaphors for the current state of affairs. Yet improving your understanding of markets, their microstructure and the impact on your own trading can be a game changer. Among its many uses, data is the raw material for improving algorithm logic, making better trading decisions and easing the fear of something going rogue. But market data, specifically a high-quality history of markets, is just one factor behind algorithmic understanding.

The other factor is research: the quest for knowledge through empirical investigation that seeks to discover the truth, establish factual evidence (opinions aside), solve existing problems and provide insight into new theories. With technological automation advancing rapidly, the need to understand market dynamics and market microstructure, as well as the influences of geopolitical economics and regulatory policy, is more important than ever.

Yet the salient fact is that data, specifically historical market data, is messy. Among market participants' worst fears is spending more time processing and cleaning data than analyzing it. Data collection means managing the multiplicity and diversity of market centers across asset classes and continents.

Unearthing alpha from market dynamics is like uncovering a diamond in a mountain of coal, and it demands richer data over longer time periods. The challenge is dealing with the vagaries and nuances of different market centers: varying ticker names, symbol continuity and price adjustments across corporate action events, timestamps (both their precision and their alignment across markets), the influence of dark trades on lit markets - the list goes on. Any and all of these factors can and do influence algorithmic trading, its robustness and its profitability.
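To ground two items on that list, consider corporate-action price adjustment and timestamp alignment. The snippet below is a minimal, hypothetical Python sketch - the SPLITS table and function names are our own illustration, not any vendor's API - showing how prices before a split's ex-date are back-adjusted so the series stays continuous, and how exchange-local timestamps are normalized to UTC so feeds from different market centers can be merged on a single clock.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical corporate-action table: (ex-date, split ratio).
SPLITS = [(datetime(2014, 6, 9, tzinfo=timezone.utc), 7.0)]  # e.g. a 7-for-1 split

def adjust_for_splits(ticks, splits=SPLITS):
    """Back-adjust (timestamp, price) ticks so the series is continuous across
    split events: prices before an ex-date are divided by the split ratio.
    Timestamps are assumed to be timezone-aware UTC datetimes."""
    adjusted = []
    for ts, price in ticks:
        factor = 1.0
        for ex_date, ratio in splits:
            if ts < ex_date:          # pre-split trades carry the old price scale
                factor *= ratio
        adjusted.append((ts, price / factor))
    return adjusted

def to_utc(naive_local_ts, exchange_tz="America/New_York"):
    """Attach an exchange's local zone to a naive timestamp and convert it to
    UTC, putting feeds from different market centers on a single clock."""
    return naive_local_ts.replace(tzinfo=ZoneInfo(exchange_tz)).astimezone(timezone.utc)

# Example: a trade before the ex-date is scaled down by the ratio (645.57 -> ~92.22).
ticks = [(to_utc(datetime(2014, 6, 6, 15, 30)), 645.57),
         (to_utc(datetime(2014, 6, 9, 9, 30)), 92.70)]
clean = adjust_for_splits(ticks)
```

Real corporate-action feeds also cover dividends, symbol changes and mergers, each with its own adjustment convention - which is precisely why cleaning consumes so much of the effort.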

But none of this signals an end to profitability. To find the shelter of safe havens in this new regime, participants must broaden their analysis as well as deepen it - and that increasingly depends on data quality and data accuracy for consistent alpha.
