
Give us an A!

Published in Automated Trader Magazine Issue 13 Q2 2009

Scribes and soothsayers have been analysing trades since the first cargo of spices was unloaded from the first camel train. But would their work have been any easier if they’d had laptops?

William Essex looks for simplicity in the evolving complexity of pre-, post-, and above all during-trade analysis.

Tomorrow's historians will argue themselves hoarse debating the causes of the first truly global financial upheaval. There will be books written on the fallacies of early twenty-first century risk management, for example; on hedge funds and the short-selling ban; on politicians and bankers; on the contribution of automated and algorithmic trading to the recovery. And then, when all the big subjects are exhausted, some post-grad student might just stumble across the odd correlation (if that's the word she uses) between different markets and asset classes in the way their trading infrastructures evolve.

She might write a doctoral thesis with section headings that include 'fragmentation', 'aggregation', 'consolidation', 'the US: experience and global precedent', 'MiFID', 'Canada', 'dark liquidity', 'winners and losers'. She'll look at the way very different markets can seem to follow a similar pattern of evolution, albeit at varying speeds and with occasional bouts of leap-frogging, and consider the extent to which this 'correlation' is reinforced by global automation. Markets moved together in those days, she'll conclude, and that wasn't always such a good thing. She'll even wonder, briefly, whether globalisation ever really worked in the way that it was supposed to work. But she'll get her doctorate for the section in which she argues that trading itself changed radically in the first decade of the twenty-first century.

And all because she found this magazine in a dusty corner of her university library, and got her idea from reading these first few paragraphs. The big change has been automation, of course, but one of the more significant changes has been in the size and complexity of the challenge represented by pre-, immediately post-, and during-trade analysis. This is a challenge that has become familiar, in that the combination of tight margins and competition has focused attention on cost. Fragmentation, meanwhile, has led us into a world where heightened concerns for trade oversight have to be balanced against the increasing difficulty of finding a single, durable best price (or, realistically, a combination of prices that add up to 'best') that will prove robust against any post-facto opportunity-cost analysis. These days, the term 'best execution' is probably best rendered 'better execution*', with the asterisk leading off into an explanation of how hard it is to track liquidity across a fragmented landscape.

"These days, the term 'best execution' is probably best rendered 'better execution*', with the asterisk leading off into an explanation of how hard it is to track liquidity across a fragmented landscape."

How hard it is, yes, but how worthwhile, too. It is a fundamental trading equation that challenge equals opportunity. And these days, in this trading environment, trading effectively IS trade analysis, in the sense that ongoing analysis is the source of a trader's, or a model-builder's, or an algorithm-designer's, "feel" for the market. Cost control does still come into this equation as well, of course, and thus, self-evidently, the whole nascent discipline commonly known as "transaction cost analysis" (TCA). This was once a simple matter of subtracting the cost figure from the big number, and asking for a discount next time. Now it's a multiplicity of variables that determine the overall impact of, typically, more than one transactional (or trade) event within an overall trade (or transaction). And its purpose is self-improvement.
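To make that shift concrete, here is a minimal sketch, in Python, of what 'a multiplicity of variables' can mean in practice: the slippage on each child fill is measured against the parent order's arrival price, explicit fees are added, and everything is rolled up into one implementation-shortfall figure in basis points. The fill structure, venue names and numbers are illustrative assumptions, not any particular vendor's methodology.

```python
# A minimal, hypothetical sketch of multi-fill transaction cost analysis:
# slippage of each child fill against the parent order's arrival price,
# aggregated into an overall implementation-shortfall figure in basis points.

from dataclasses import dataclass

@dataclass
class Fill:
    venue: str        # where the child order executed
    quantity: int     # shares filled
    price: float      # execution price
    fees: float       # explicit costs (commission, venue fees) for this fill

def shortfall_bps(arrival_price: float, side: str, fills: list[Fill]) -> float:
    """Implementation shortfall in basis points versus the arrival price.

    Positive means the parent order cost more than the arrival benchmark
    implied (adverse slippage plus explicit fees)."""
    total_qty = sum(f.quantity for f in fills)
    if total_qty == 0:
        return 0.0
    notional_at_arrival = arrival_price * total_qty
    sign = 1 if side == "buy" else -1
    # Price slippage: signed difference between what was paid and the benchmark.
    slippage = sum(sign * (f.price - arrival_price) * f.quantity for f in fills)
    explicit = sum(f.fees for f in fills)
    return (slippage + explicit) / notional_at_arrival * 1e4

if __name__ == "__main__":
    fills = [
        Fill("lit_exchange", 4_000, 25.02, 12.0),
        Fill("dark_pool",    3_000, 25.00,  6.0),
        Fill("lit_mtf",      3_000, 25.05,  9.0),
    ]
    print(f"Shortfall: {shortfall_bps(25.00, 'buy', fills):.2f} bps")
```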

Wizard!

Michael Sparkes, director and managing consultant of analytical products and research in Europe for ITG, says: "The need for TCA is all the greater these days. There has been a growing awareness of the need to verify trading costs, both in terms of obligations to monitor and demonstrate best execution, but also an awareness that by optimising the efficiency of the trading process, it's actually possible to get, in the medium and longer term, significant enhancements to investment returns. That's something that people had overlooked in the past."

Michael Sparkes

Perhaps they overlooked it because trading was simpler in the past, so that the challenge/opportunity was smaller. Perhaps. But if transacting is now effectively multi-dimensional, with any given transaction potentially spread over time (see for example our 'Anatomy of an Algo' this month, on page 84) as well as potentially spread over multiple dark, light and variously shadowed liquidity pools, all of which are competing aggressively on price, it follows, first, that trading itself has changed and, second, more usefully, that there is a wider gap now between trading efficiently and trading 'badly'. TCA is a new discipline because it has effectively become an activity with a yield of its own.

Real-time solutions

Just working out whether the whole of a trade was handled at the best achievable price/cost is becoming a potentially profitable activity in its own right. In a challenging commercial and competitive environment, it is also therefore becoming all the more critical.

So what do we do about that?

Quantify the challenge, for a start. Ali Pichvai, CEO of Quod Financial, says: "We are entering into a new phase of electronic trading where the execution-management system, the trader, the owner of the execution wants to be able to take real-time decisions on how to execute. Whether to send the trade to a broker, or send it to an alternative liquidity pool such as a dark pool." Note: real-time decisions. Current thinking on TCA is that it has to be live; the trade to be improved is the one that's happening now. As Pichvai says: "You need to have a set of technologies that are able to look at the market, the information gathered from the market, and take that real-time decision."
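What might such a real-time decision look like in code? The sketch below, a deliberately crude one, picks a destination by comparing a simple estimated cost for each venue from a snapshot of live metrics. The venue names, the cost model and the re-route penalty are illustrative assumptions, not Quod Financial's adaptive-trading logic.

```python
# A hypothetical sketch of a real-time routing decision: pick a destination
# (broker algo versus dark pool) by comparing a simple estimated cost for each.

def estimated_cost_bps(spread_bps: float, fill_probability: float,
                       impact_bps: float) -> float:
    """Expected cost of routing to a venue, in basis points.

    Paying the spread and impact is weighted by how likely the venue is to
    fill; the unfilled remainder is assumed to be re-routed later at a
    penalty (a crude stand-in for opportunity cost)."""
    reroute_penalty_bps = 8.0   # assumed cost of having to try again elsewhere
    filled = fill_probability * (spread_bps / 2 + impact_bps)
    unfilled = (1 - fill_probability) * reroute_penalty_bps
    return filled + unfilled

def choose_destination(venues: dict[str, dict[str, float]]) -> str:
    """Return the venue with the lowest estimated cost right now."""
    return min(venues, key=lambda v: estimated_cost_bps(**venues[v]))

if __name__ == "__main__":
    snapshot = {
        "broker_algo": {"spread_bps": 4.0, "fill_probability": 0.95, "impact_bps": 3.0},
        "dark_pool":   {"spread_bps": 0.0, "fill_probability": 0.40, "impact_bps": 0.5},
    }
    print("Route to:", choose_destination(snapshot))
```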

Quod Financial's solution is the use of "adaptive trading" technology, which was covered in the Latest News section of the Q1 issue (find the issue, the section and in particular the story 'The Year of Trading Adaptively?' at www.automatedtrader.net). Pichvai says: "If properly used, and if its quality is high, TCA is a metric that can be used as an important parameter in the decision-making process." Not the only parameter, but still, a significant determinant of ongoing success.

Ali Pichvai

Among recent initiatives in the field of real-time TCA, UBS Investment Bank's launch [September 2008 in the US; March 2009 in Europe] of its conveniently named Real-time Transaction Cost Analysis (Real-time TCA) tool is a striking example. Available via the firm's equity trading analytics platform, UBS Fusion, Real-time TCA promises clients "live, continuous analysis of their orders while they are still executing", and thus by implication the opportunity to intervene in real, mid-trade time.

Real-time TCA is delivered as an integrated part of the firm's Direct Execution electronic trading suite, which includes "an array of trading and analytical tools," according to the accompanying paperwork. Significantly in this context, these include tools for pre-trade transaction cost analysis (for single stock and portfolio trading), plus "a suite of in-market tools including trading alerts, and a wide array of downloadable reports and analysis tools".

Discussing the European launch, Phil Allison, head of European equities trading execution at UBS, says: "Giving our clients Real-time TCA while their orders are still active means they are able to monitor and react before the order completes." This beats finding out at the end of the day that an algorithmic trading strategy failed to optimise its outcome. "Since post-trade analysis is, by definition, the assessment of past performance, it cannot add optimal value to dynamic trading decisions," Allison continues. "Now, clients have the opportunity to monitor how their orders are doing while there is still time to affect the order's outcome." UBS Fusion's real-time analytics "continuously update while the orders are live, allowing clients to monitor how orders are performing across all venues".
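For a flavour of the kind of in-flight check such a tool implies (this is a hypothetical sketch, not UBS Fusion's actual API), consider a loop that recomputes slippage against the arrival price as fills stream in and raises an alert the moment a threshold is breached, while there is still time to adjust the strategy. The order, fills and threshold below are illustrative assumptions.

```python
# A hypothetical sketch of in-flight monitoring: as fills stream in, compare
# the running average price with the arrival benchmark and flag the order
# once slippage breaches a threshold, while the order is still working.

ARRIVAL_PRICE = 50.00
ALERT_THRESHOLD_BPS = 10.0   # assumed tolerance before the trader is alerted

def running_slippage_bps(fills: list[tuple[int, float]]) -> float:
    """Slippage of the fills so far versus arrival, in bps (buy order)."""
    qty = sum(q for q, _ in fills)
    if qty == 0:
        return 0.0
    avg_price = sum(q * p for q, p in fills) / qty
    return (avg_price - ARRIVAL_PRICE) / ARRIVAL_PRICE * 1e4

fills_so_far: list[tuple[int, float]] = []
for qty, price in [(1_000, 50.01), (2_000, 50.03), (1_500, 50.12)]:
    fills_so_far.append((qty, price))
    slip = running_slippage_bps(fills_so_far)
    print(f"after {sum(q for q, _ in fills_so_far)} shares: {slip:.1f} bps")
    if slip > ALERT_THRESHOLD_BPS:
        print("ALERT: slippage breached threshold; consider slowing the order")
        break
```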

Blue-sky TCA

The buzzword is 'real-time' and the gain is the ability to reach in and 'tweak' the trade while it's running through. In a trading environment where 'algorithms of algorithms' are on the way to becoming as commonplace as [managed] funds of [hedge] funds always used to be, this is potentially a turning point: the 'TCA layer' may not be trading in its own right [yet], but it's making some pretty important decisions.

That leads us into some interesting speculation.

First, let's acknowledge the obvious implication of all this, which is that TCA can no longer be a stand-alone activity. Trading cost may be significant, but it doesn't follow that a cheap trade is an effective trade (unless, of course, we're looking at cost in isolation, and the point is, we can't).

Secondly, there's a lot more real-time analysis going on around any given trade than is attributable to TCA. There's the price data to which the trade is a reaction, for example.

And thirdly, not quite self-evidently, TCA outputs are only one factor determining the progress of a trade. Examples of others might include the size of the trade, its urgency, its objective.

What this tells us is that TCA in its present real-time form is potentially only a transitional activity on the way to a much more holistic form of transaction analysis.

Maureen Fleming

We can track this trend through a sequence of other recent launches. Take, for example, the launch of Vhayu's Velocity TCA solution as a tool for real-time analysis. This "provides multiple out-of-the-box benchmarks to measure and analyze trading performance with instantaneous feedback on trades sent from any FIX-compliant EMS or OMS". Vhayu has partnered with EZX "to integrate the necessary components to enable firms to conduct real-time TCA". And if that sounds as though it's going to take a big chunk of processing power - well, we're going that way too. Maureen Fleming, Program Director for IDC's Business Process Management and Middleware service, says: "Software that supports the full capacity of multi-core servers offers enterprises the opportunity to significantly improve performance and reduce cost."

All of which is interesting in its own right. But if we move out of the TCA 'silo' (if a magazine can categorise its discrete subjects of interest as silos), we also find news of Progress Apama's latest release of its CEP (complex event processing) platform, which has "an enhanced Parallel Correlator that leverages multi-core and multi-processor functionality". This nifty little gadget "supports dynamic business operations that are constantly changing and which require immediate, forward-looking responsiveness to business events with sub-millisecond latency".
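Stripped of vendor specifics, the CEP idea is simple enough to sketch: register a rule over a stream of events and fire a handler the instant a combination of conditions holds. The toy rule below, plain Python standing in for the concept rather than Apama's correlator or its event language, watches for a bid-ask spread that suddenly widens relative to its recent average.

```python
# A generic, minimal sketch of complex event processing: a rule object
# consumes a stream of tick events and fires a handler when the current
# spread exceeds its short rolling average by 50%.

from collections import deque
from statistics import mean

class SpreadWideningRule:
    """Fires when the current spread exceeds its recent rolling average by 50%."""

    def __init__(self, window: int = 20):
        self.spreads = deque(maxlen=window)

    def on_event(self, event: dict) -> None:
        spread = event["ask"] - event["bid"]
        # Only evaluate once a full window of history has been seen.
        if len(self.spreads) == self.spreads.maxlen and spread > 1.5 * mean(self.spreads):
            self.on_match(event, spread)
        self.spreads.append(spread)

    def on_match(self, event: dict, spread: float) -> None:
        # In a trading system this would pause or re-route the working order.
        print(f"{event['symbol']}: spread widened to {spread:.2f}, act now")

if __name__ == "__main__":
    rule = SpreadWideningRule(window=5)
    ticks = [{"symbol": "XYZ", "bid": 10.00, "ask": 10.02}] * 5
    ticks.append({"symbol": "XYZ", "bid": 10.00, "ask": 10.06})  # sudden widening
    for tick in ticks:
        rule.on_event(tick)
```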

Dr John Bates

Which is no doubt the kind of attribute you need, if you're running a trade into the market and hoping not to get blindsided by some unexpected complex event out there in the real world. But the point is that the real world, and thus the range and quantity of data required for effective trading, is getting more complex as time goes by (or you could say: trading-related applications are getting better at seeking out and leveraging complexity). Dr John Bates, founder of the Apama division of Progress Software, speaks of a "fundamental shift in the design, development and deployment of CEP applications". Bates says: "The pace at which organizations accumulate and analyze data is increasing exponentially, and CEP products must keep pace."

But just as trading is trade analysis, so any real-time TCA output is data. And if, post-fragmentation, trading itself is a pretty complex event, we might as well jump straight to the conclusion that what you really need, to make best use of trade analysis in general and TCA in particular, is some kind of a joined-up TCA/CEP solution. I mean, if, for example, Progress Apama integrated its Algorithmic Trading Accelerator platform with the overall Vhayu Velocity data-processing engine, we'd be pretty close to getting access to a Complex Transaction Cost Processing solution.

Reader, they did it (and see page 10 for a detailed write-up). Bates says: "High frequency trading applications need simultaneous access to both streaming market data as well as historical data in order to deliver the most competitive strategies. With our Vhayu integration, both sell- and buy-side organizations can create algorithms whose logic spans both streaming and historical data." Including real-time TCA outputs? It may not exactly be ground-breaking to tie together data processing and complex-event trading, but there's a step beyond this where, thanks to fragmentation, it no longer makes sense to treat transaction-cost data as a category distinct from all that other data that influences the in-trade decision-making process.
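What might logic that "spans both streaming and historical data" look like at its simplest? The sketch below compares a live, in-trade cost reading with an assumed historical baseline for the same stock and hour of day, and nudges the algorithm's participation rate accordingly. The data, thresholds and function names are illustrative; this is not the Apama/Vhayu integration itself.

```python
# A hypothetical sketch of blending streaming and historical data: a live
# cost reading is compared with a stored per-hour baseline, and the working
# algorithm's participation rate is nudged up or down in response.

HISTORICAL_COST_BPS = {          # assumed per-hour average cost from a store
    ("XYZ", 9): 6.0,
    ("XYZ", 10): 4.5,
    ("XYZ", 11): 3.8,
}

def adjust_participation(symbol: str, hour: int, live_cost_bps: float,
                         current_rate: float) -> float:
    """Nudge the participation rate based on live versus historical cost."""
    baseline = HISTORICAL_COST_BPS.get((symbol, hour))
    if baseline is None:
        return current_rate                    # no history: leave the rate alone
    if live_cost_bps > 1.25 * baseline:
        return max(0.05, current_rate * 0.8)   # costlier than usual: slow down
    if live_cost_bps < 0.75 * baseline:
        return min(0.30, current_rate * 1.2)   # cheaper than usual: speed up
    return current_rate

if __name__ == "__main__":
    print(adjust_participation("XYZ", 10, live_cost_bps=7.0, current_rate=0.15))
    print(adjust_participation("XYZ", 11, live_cost_bps=2.0, current_rate=0.15))
```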

To end on a forward-looking statement, the history of early twenty-first century automated and algorithmic trading is going to be the history of a series of initiatives aimed at integrating every potential impact on the trading decision into a single feed. And then trading on the result.

Thanks for the memory

This is interesting. One of the significant barriers to effective data processing is availability of memory. There's a new outfit called RNA Networks that has come up with a solution that "floods the enterprise with memory" by the simple expedient of making all the unused memory across the enterprise accessible to any machine that needs it.

If the accounts department is sitting on a few spare megabytes, for example, that unused capacity can be pulled in to help with, let's say, the model-testing department's processing of vast amounts of historical data.

The product is RNA Messenger and what it does is "memory virtualisation". Which means that it "uses a peer-to-peer architecture that aggregates memory across multiple nodes/devices, and makes it dynamically available to all servers through a patented Memory Virtualization Pool". Rather than approach the shortage-of-memory problem one server at a time or by requiring changes to the BIOS or the installation of specialized chips, RNA "provides high performance computing scale across existing x86 servers and commodity hardware".

More at www.rnanetworks.com.