The Gateway to Algorithmic and Automated Trading

Strategies: Waiting for the Iceberg

Published in Automated Trader Magazine Issue 02 July 2006

Algorithmic trading has radically changed trading patterns in capital markets. Transaction volumes have increased and, in the equities markets in particular, individual transaction sizes have plummeted. Debbie Williams, Group Vice President of the Capital Markets and Risk Management Practices at Financial Insights, examines the implications of this for risk management and how market participants are (or aren't) responding to the challenge.

When any financial market undergoes a major structural change in the way in which it operates, there are usually significant risk management implications for participants. However, that doesn't necessarily imply that they will respond appropriately in terms of reexamining their risks and controls. To some extent this is understandable, as structural change often begets dollar opportunities, which inevitably tend to take priority.

Just such a situation currently prevails in the case of algorithmic trading, particularly as it pertains to equities. The dollar opportunity here is cost saving and improvements in operational efficiency - especially for the sellside, which is pushing the concept hard, as it can no longer justify its headcount on the basis of available execution margins.

The primary type of risk here is operational risk. With so much of the current activity being exchange traded, credit risk is not usually a major facet of algorithmic trading. By the same token, it does not substantially expand any existing market risks. (One could argue that an algorithm that executes poorly theoretically increases market risk, but it also constitutes an operational risk and it is classified as such for the purposes of this discussion.)

From a historical perspective, the omens are not particularly encouraging. Every time in the past where skilled personnel have been replaced by technology to save money, problems have arisen if the proper controls were not also instituted.

For example, after futures exchanges moved to electronic execution to save on the costs of floor trading, there was a spate of "fat finger" errors - some of them very substantial. This is not to imply that introducing new technology is inherently bad - far from it - merely that doing so has risk management implications that need to be considered.

Specific risks

Some of the most obvious risk implications of algorithmic trading include:

  • An algorithm tuned for a particular set of market conditions being left running after those conditions no longer apply.
  • An algorithm failing to execute at all, or failing after partial completion, with the failure going unnoticed.
  • Improper or incomplete quality assurance of algorithms, where code bugs result in incorrect execution.
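The first two risks above are essentially monitoring problems, and the shape of a basic guard is easy to sketch. The following is a minimal illustration only, not any particular vendor's implementation; the class name, the use of realised volatility as the "market conditions" proxy, and the five-minute stall threshold are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class AlgoState:
    """Snapshot of one running execution algorithm (hypothetical fields)."""
    name: str
    vol_band: tuple        # (low, high) realised volatility the algo was tuned for
    target_qty: int        # total quantity the algo is working
    filled_qty: int        # quantity executed so far
    last_fill_time: float  # timestamp of the most recent fill

def check_algo(state: AlgoState, current_vol: float, now: float,
               stall_seconds: float = 300.0) -> list:
    """Return a list of alert strings; an empty list means the algo looks healthy."""
    alerts = []
    low, high = state.vol_band
    # Risk 1: the algo was tuned for conditions that no longer apply.
    if not (low <= current_vol <= high):
        alerts.append(f"{state.name}: volatility {current_vol:.1%} outside "
                      f"tuned band [{low:.1%}, {high:.1%}]")
    # Risk 2: partial completion with no recent fills -- possible silent failure.
    if state.filled_qty < state.target_qty and now - state.last_fill_time > stall_seconds:
        alerts.append(f"{state.name}: stalled at {state.filled_qty}/{state.target_qty} "
                      f"with no fill for {now - state.last_fill_time:.0f}s")
    return alerts
```

The point of the sketch is that both failure modes can be caught mechanically, provided someone decides in advance what "conditions no longer apply" and "stalled" mean for each algorithm.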

A further related issue is whether or not sufficient cover has been provided for human traders tasked with monitoring the algorithms.

An additional concern that applies to all the above points is exactly when/where any problems will be caught and remedied. Algorithms have automated the front office, but much of the mid/back office is already at least partially automated. An important component here is the order management system (OMS), which effectively acts as the glue between front and back offices, and is becoming increasingly critical as cross asset class trading increases.

The snag is that some OMSs (particularly those developed in-house) were designed well before the advent of algorithmic trading. They are therefore already struggling to cope with the growing volume of messages relating to equity trades alone. This raises two operational risks: first, that the OMS will lock up and possibly lose data; second, that it will prove inadequate as a tool for trapping and flagging algorithm-related errors, such as those itemised above. In theory, a human trader covers part of this second role, but in practice a human cannot track every one of possibly tens of thousands of trades generated by the algorithms. At best they will be tracking at a portfolio level and may therefore not immediately notice an algorithm misbehaving - especially if the OMS lacks the necessary alerts.
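The first of those OMS risks - locking up under message volume - is at least partly foreseeable, since inbound message rates can be watched against a known capacity ceiling. A minimal sketch of such a watermark check, assuming a rolling time window and a capacity figure supplied by the operator (both the class name and the 80% warning fraction are assumptions for illustration):

```python
from collections import deque

class RateWatermark:
    """Rolling message-rate monitor: flags when inbound OMS traffic
    approaches a configured capacity ceiling."""

    def __init__(self, window_seconds: float, capacity_per_sec: float,
                 warn_fraction: float = 0.8):
        self.window = window_seconds          # length of the rolling window
        self.capacity = capacity_per_sec      # operator-supplied capacity estimate
        self.warn_fraction = warn_fraction    # warn at this fraction of capacity
        self.timestamps = deque()

    def record(self, now: float) -> bool:
        """Record one inbound message; return True if the rate breaches the watermark."""
        self.timestamps.append(now)
        # Drop messages that have aged out of the window.
        while self.timestamps and now - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        rate = len(self.timestamps) / self.window
        return rate > self.capacity * self.warn_fraction
```

Warning before the ceiling is reached, rather than at it, gives operations staff time to throttle order flow or fail over before data is lost.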

The technology mindset

While technology is a major consideration in algorithmic and automated trading, the related risk implications currently appear to take a back seat. For example, the networking focus is usually on matters such as physical proximity to the market in order to minimise latency; by contrast, the risk of network failure due to overload receives far less attention.

Another consistently neglected point is the long-term operational risk of systems infrastructure failing to keep up with the business. At risk management conferences it is relatively commonplace to hear banks citing issues such as employee theft/fraud, improper business practices, client suitability for product and pricing risk. What you never hear cited is infrastructure lagging business requirements, yet this is one of the major reasons why losses occur.

A classic example of this is the failure to upgrade internal network capacity to allow for increased traffic from high frequency/algorithmic trading activity. While considerable investment may be made in external connectivity to execution venues, it is not uncommon to find the same LAN segment being used to service both critical trading applications and Web browsing, printing etc.

The bigger picture

One reason for the failure to plan for the operational risks of auto/algo trading is that operational risk as a concept hasn't really made much progress since 2000. The operational risk management industry may have a mandate, but in terms of technology it has self-assessment tools, which are little more than glorified questionnaires. The loss data output from risk events ends up being dumped unscrubbed into standard relational databases.

The lack of progress isn't perhaps very surprising, in that there are only two reasons market participants invest in risk management technology:

  • Someone - usually a regulator - tells them to
  • They have realised that there is a way to make an improved return using it

In the majority of cases it is the first reason that predominates - but in the case of algorithmic trading, regulators have not as yet taken a specific interest. In broader terms, algorithmic trading is potentially covered by both Sarbanes-Oxley and Basel II. For example, in the case of Sarbanes-Oxley there is a requirement to create and test controls to cover "substantial risk". However, apart from the question of whether a particular organisation might deem that its algorithmic trading constitutes a "substantial risk", there is a rather more fundamental problem. A significant number of large buyside institutional investors are subject to neither Basel II (if they are not banks) nor Sarbanes-Oxley. They are therefore under no (even indirect) regulatory compulsion to do anything about the operational risks of algorithmic trading.

For the time being, this may not be a major problem, as these organisations have not yet rushed headlong into algorithmic trading. (Indeed, some might argue that the prospect of assuming additional operational risk by doing so has acted as a deterrent.) So far, the bulk of algorithmic trading activity has come from the sellside, executing their own and, to a lesser extent, their clients' trades. Since these firms are typically those best able to afford the cost of operational risk events, this should to some extent - and only for the time being - ameliorate regulators' concerns.


At present, it looks as if the operational risk issues surrounding algorithmic trading will follow the well established pattern for capital markets - nobody will do much about them until a major event occurs. While proprietary sellside activity in this space continues to predominate, this may not be of much public concern. However, once buyside participation picks up (and the sellside has the strongest motivation for encouraging this), the situation will change radically. The prospect of a major fund manager sustaining substantial losses on retail client assets due to an operational failure in their algorithmic trading is exactly the sort of situation likely to trigger much greater regulatory interest. Whether or not this prospect will motivate a wholesale revision of operational risk procedures remains to be seen.