The Gateway to Algorithmic and Automated Trading

Cream Rising

Published in Automated Trader Magazine Issue 08 Q1 2008

BlackCat Trading Technologies' Matthew Breakwell and John Reeve explain their distinctive approach to building a quantitative trading platform and outline their plans for exploiting it.

Why did you start BlackCat and how would you describe your business model?

Breakwell: BlackCat was founded on a shared interest in quantitative trading that goes back to the late 1990s. For the last four years, we've been researching and developing the tools and technologies to support our trading, both in terms of strategy development tools and the trading platform and its supporting software. The aim now is to capitalise on the technology by establishing an investment management business to which BlackCat Trading Technologies will be the technical service provider.

What's your own financial markets background and how does this influence your role at BlackCat?

Breakwell: Despite our long-term interest in quantitative trading, neither of us has a professional financial markets background. While this has helped us to take a fresh approach to the technology used to support trading, it also meant we had to establish a lot of relationships from scratch. My experience is in managing businesses in their early growth stages, which is where we are now with BlackCat. My interest in trading emerged out of a Russian export business I was involved with and, following the 1998 Russian government bond default crisis, I gradually spent more time trading on a proprietary basis, developing a rules-based approach to trading but with discretionary execution. It soon became clear that execution needed to be automatic and the rules needed to be developed further. So it's been a natural progression to a full-time focus on quantitative trading.

"Our initial approach was to take the trading process and break it down, asking 'What do we need to automate efficiently?' at every stage."

Reeve: My background in radio communication design led to a number of investments in technology and related firms just ahead of the dotcom boom. Although I may have been rather naïve in how I made these investments - and consider myself lucky to have avoided the bust - their success got me hooked. I started to study quantitative techniques and market behaviour. Once I was making consistent returns, I decided to run quantitative models on a full-time basis and those models have grown up alongside the technology over the last four years. My background has proved extremely useful in developing trading technology because many aspects of radio system design also involve utilising technology to the edge of what's physically possible. In addition, I was already very experienced in analysing time series and performing simulations.

How would you describe BlackCat's trading style?

Breakwell: At its simplest, we're looking to identify and exploit inefficient market behaviour. A large proportion of market behaviour can be considered close to random, but within that there are stable periods of predictable price and other behaviour. Using proprietary tools, we look for trading opportunities across a range of asset classes and time frames, from weeks to minutes. Our trading strategies are designed to exploit the periods of inefficient behaviour our analysis identifies. This might include trend-following, mean-reversion, arbitrage or other strategies depending on the type of opportunity found. So far, we've traded equities and futures on LSE, Eurex and CME in relatively modest volumes, but this will expand in the next phase of the business plan. We're already capable of operating at very high speeds and very low latency, but until now have avoided higher frequency trading opportunities due to the high investment in infrastructure required to compete effectively.

How did you approach designing BlackCat's trading platform and supporting infrastructure from scratch?

Reeve: Our initial approach was to take the trading process and break it down, asking 'What do we need to automate efficiently?' at every stage. All the application software was developed in-house and designed to run on standard PC hardware, albeit custom-built to our particular specifications. This provides a great deal of flexibility in configuring systems to meet our needs in terms of hard-disk provision, CPU speed and system memory. For networking, we use standard gigabit Ethernet.

The software is written almost entirely in C# because we wanted to use a managed language rather than C++ to avoid the time and development problems of having to manage memory explicitly within the application. Coupled with a cleaner syntax, this makes C# applications easier to write and maintain and more reliable. We considered using Java, but it does not support C++-type pointers, which we use extensively in the trading platform where fast flexible data access is required.
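
For readers unfamiliar with the language detail, C# does support C++-style pointers inside 'unsafe' blocks (compiled with the /unsafe option). The sketch below is purely illustrative rather than BlackCat's code, using a hypothetical Tick struct to show the kind of direct buffer traversal this allows:

    // Illustrative only: direct pointer traversal of a tick buffer in C#.
    public struct Tick
    {
        public long TimestampMs;   // milliseconds since the Unix epoch
        public double Price;
        public int Size;
    }

    public static class TickScan
    {
        // Returns the highest price in the buffer using raw pointer arithmetic.
        public static unsafe double MaxPrice(Tick[] ticks)
        {
            double max = double.MinValue;
            fixed (Tick* start = ticks)            // pin the managed array
            {
                Tick* p = start;
                Tick* end = start + ticks.Length;
                while (p < end)
                {
                    if (p->Price > max) max = p->Price;
                    p++;
                }
            }
            return max;
        }
    }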

"… we put a lot of thought into how we should partition this functionality between the hardware, and then between applications and execution threads, …"

We have a middleware layer which enables us to collect and aggregate data from multiple venues into a single feed. Data normalisation is achieved in our proprietary feed handlers. The middleware layer also combines historical and real-time data into a single unified service which is then provided to the trading engines themselves.
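
As a rough illustration of that unified service idea (the interface and type names here are hypothetical, not BlackCat's), a single contract can hand a consumer stored history and then the live stream in exactly the same form, so strategy code never cares where the data came from:

    using System;
    using System.Collections.Generic;

    // Hypothetical tick type, normalised by the feed handlers upstream.
    public struct Tick
    {
        public long TimestampMs;
        public double Price;
        public int Size;
    }

    public interface IMarketDataService
    {
        // Replays stored ticks for an instrument over [from, to) ...
        IEnumerable<Tick> GetHistory(string symbol, DateTime from, DateTime to);

        // ... then raises the same Tick type for every live update that follows.
        event Action<string, Tick> TickReceived;

        void Subscribe(string symbol);
    }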

What principles do you follow to ensure your trading technology runs as efficiently as possible?

Reeve: To trade a broad range of strategy types across many exchange-traded products, we knew the software had to be flexible and extensible, but without sacrificing performance. Once we defined the functionality of all parts of the system - data management, analysis tools, trading engines, position management, reporting - we put a lot of thought into how we should partition this functionality between the hardware, and then between applications and execution threads, taking full account of the data requirements of each part of the trading platform. We then examined each part of the system to see how it could be implemented most efficiently on standard hardware. Although the software can run on standard installs without much extra work, we have disabled unnecessary functionality, services and GUI effects to remove unwanted CPU overhead. And over the last four years, we've continued to refine the technology in light of our trading experience.

How do you monitor and improve the performance of your trading models?

Reeve: By integrating our execution algorithms with our trading strategies, we can fine-tune execution on a strategy-by-strategy basis. Every order is tagged with a target price that is logged with the resulting execution, time stamped in milliseconds, to support performance monitoring via our in-house post-trade analytics application. Where we identify underperformance, we drill down and investigate using our extensive archive of tick data to find out whether the problem was just a glitch in the data feed or a bigger underlying issue. As well as refinements to execution algorithms, this has also led to new trading ideas.
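
A minimal sketch of that kind of per-order record, with hypothetical field names rather than BlackCat's actual schema, might look like this; slippage then falls straight out of the logged target and fill prices:

    // Hypothetical per-fill record: the order's target price is logged alongside
    // the execution, timestamped in milliseconds, for post-trade analysis.
    public sealed class ExecutionRecord
    {
        public string Symbol;
        public double TargetPrice;     // price the strategy wanted when the order was sent
        public double FillPrice;       // price actually achieved
        public int Quantity;           // signed: positive = buy, negative = sell
        public long FillTimeMs;        // execution timestamp, Unix milliseconds

        // Slippage in price terms, positive when the fill is worse than the target.
        public double Slippage() =>
            Quantity >= 0 ? FillPrice - TargetPrice : TargetPrice - FillPrice;
    }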

What are the key advantages of using proprietary technology compared with buying in?

Reeve: I think it really comes down to performance and flexibility. Assembling a platform from bought-in component applications typically runs into integration problems that can impair overall performance. Moreover, using a proprietary platform means we can customise and extend functionality quickly and easily; we don't run the risk of waiting on changes by multiple third-party vendors to support a single change in functionality. Also, optimising the performance of trading strategies and execution algorithms is much easier when you've developed the code and have a full understanding of every part of the software's behaviour.

"… optimising the performance of trading strategies and execution algorithms is much easier when you've developed the code …"

Breakwell: An end-to-end trading platform with all the functionality we've developed in-house, from database tools and storage upwards, would cost £350-400k in licence fees per annum if you were to build it from off-the-shelf components, even before you look at the cost of bolting it all together. We're now in a position to explore almost limitless trading opportunities, without that kind of capital outlay. What's out there on the market has limitations because the components must be assembled, rather than having evolved organically to support particular automated trading strategies.

Did you ever consider trying to use a standard order management system or execution management system?

Breakwell: By comparison with our platform, an OMS/EMS has very little data storage capability, limited data functionality and no real-time history retrieval. Also, position information stored within the EMS can be difficult to retrieve for use by the strategies. The vast majority of OMSs/EMSs were designed for a specific purpose within traditional fund management, not for supporting automated trading strategies. As such, vendors are trying to build out strategy capabilities on top of an existing structure, which results in serious limitations on the way data flows between the components. What we have is strategy-centric, not order- or position-centric.

Reeve: We follow the OMS/EMS debate very closely, but have concluded that they are neither as efficient nor as flexible as perhaps they should be, which is why we've broken up the functionality typically found in these applications and distributed it differently across our trading platform. Position data might be in the OMS/EMS when you really need it to be more closely associated with the strategy; execution algorithms that are implemented within the EMS might be more effectively integrated with the strategy code. Our execution algorithms are tightly integrated with the strategy within the same assemblies, i.e. the choice of execution is predetermined by the strategy, not an EMS.
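
One hypothetical way to express that strategy-centric arrangement in code (the types below are illustrative, not BlackCat's actual classes) is to have each strategy constructed with, and own, its execution algorithm, rather than handing orders to a separate EMS that decides how to work them:

    public sealed class Order
    {
        public string Symbol;
        public int Quantity;        // signed: positive = buy, negative = sell
        public double TargetPrice;
    }

    public interface IExecutionAlgo
    {
        void Work(Order parent);    // slice and route the parent order
    }

    public abstract class Strategy
    {
        private readonly IExecutionAlgo _execution;

        // The choice of execution is fixed by the strategy itself, not by an EMS.
        protected Strategy(IExecutionAlgo execution) => _execution = execution;

        protected void Send(Order order) => _execution.Work(order);

        public abstract void OnPrice(string symbol, double price);
    }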

Why design your own execution algorithms rather than source from an external provider?

Reeve: The algorithms on the market are pretty standard and all do the same kind of thing. We wanted something specific to our style of trading as well as specific to futures. Also, if a large number of firms are using the same algorithms, the order execution footprint will be magnified and that can have an impact on effectiveness. If you're using something a little more bespoke, it has less of a footprint. You don't necessarily want to be trading when everyone else is.

How have you been able to capture such large volumes of tick data?

Reeve: The overall platform design, i.e. the way the different elements communicate with each other, is very important. When we started, we realised that a good store of tick data was essential for the analytics. So we looked at the data we needed and the data retrieval patterns required by the trading application, then designed the database architecture to store data as efficiently as possible and retrieve it as fast as possible for the specific types of data access.

"For an active stock like Vodafone, the platform can read five years of tick history, compute the daily open, high, low and close, and return the results over the network in around a second."


Although it might not have the flexibility of a generic database it has very high performance in the areas where we need it. For an active stock like Vodafone, the platform can read five years of tick history, compute the daily open, high, low and close, and return the results over the network in around a second. The database stores about 20 billion ticks at present, which is added to at a rate of 150 million per day, considerably more on a busy day. The ticker plant uses a 2HGz Core2 CPU but can handle peak data rates of 500,000 per second. With wholesale datasets coming in from 13 exchanges, plus some interbank data, it generally runs at around a few per cent of CPU load. All the software runs on standard hardware and a standard version of Windows, but the machines are configured for our specific requirements. For example, the database machine has a lot of memory and hard disk storage, the ticker plant machine has relatively little memory but has a slightly faster processor. All areas of the code have been carefully fine-tuned to achieve performance. For example, we have modified some of the standard Microsoft controls and in one particular case the result was a hundred times faster.
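
The kind of query described, folding years of ticks into daily bars in a single pass, could be sketched as follows (the Tick and DailyBar types here are assumptions for illustration, not the actual BlackCat schema):

    using System.Collections.Generic;

    public struct Tick { public long TimestampMs; public double Price; }

    // Day is counted in whole days since the Unix epoch.
    public struct DailyBar { public long Day; public double Open, High, Low, Close; }

    public static class Bars
    {
        // Folds a time-ordered tick stream into daily open/high/low/close bars.
        public static IEnumerable<DailyBar> DailyOhlc(IEnumerable<Tick> ticks)
        {
            DailyBar bar = default(DailyBar);
            bool haveBar = false;
            foreach (var t in ticks)
            {
                long day = t.TimestampMs / 86400000L;   // milliseconds per day
                if (!haveBar || day != bar.Day)
                {
                    if (haveBar) yield return bar;      // emit the completed day
                    bar = new DailyBar { Day = day, Open = t.Price, High = t.Price,
                                         Low = t.Price, Close = t.Price };
                    haveBar = true;
                    continue;
                }
                if (t.Price > bar.High) bar.High = t.Price;
                if (t.Price < bar.Low)  bar.Low  = t.Price;
                bar.Close = t.Price;
            }
            if (haveBar) yield return bar;
        }
    }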

What are your connectivity arrangements with exchanges to support your trading strategies?

Reeve: We don't connect directly to any of the exchanges we trade on, and generally trade via our brokers over the Internet. We've analysed execution speeds over the Internet and I think the results would surprise a lot of people. A dedicated 100Mbps fibre line running from our offices to a nearby Level 3 data centre provides access to multiple services, including the Internet and our data feed service. While it might be considered unusual, we find using FIX over the Internet for trading works very well. Some people are comfortable with it and some recoil in horror, but as long as you have a reliable connection - i.e. a high-quality dedicated connection via a leased line - and back-up, there shouldn't be any cause for concern.

What supply-side changes/innovations would you welcome to support automated trading?

Reeve: Accurate statistics on latency. Many data providers and technology vendors trade on the promise of low latency, but if you ask them for hard numbers, most have little or no data to back this up. Latency is a stochastic variable with a probability distribution, a mean and a variance. As such, I'd expect data providers to have statistics available on a market-by-market basis. Instead we get told, "It's difficult to measure, so we don't really provide numbers", or better still, "We don't have latency!" As well as a general market feed from Comstock, we try to source our trade data from our brokers' trading systems; we want to be connected to the trading API if possible, rather than exchange data vendors' APIs.
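
The statistics being asked for are straightforward to produce once latency samples have been captured; a minimal sketch, assuming nothing more than an array of measured delays for one market, might be:

    using System;
    using System.Linq;

    public static class LatencyStats
    {
        // Summarises a sample of measured latencies (milliseconds) for one market:
        // mean, standard deviation and a 99th-percentile tail figure.
        public static (double Mean, double StdDev, double P99) Summarise(double[] latenciesMs)
        {
            double mean = latenciesMs.Average();
            double variance = latenciesMs.Select(x => (x - mean) * (x - mean)).Average();
            var sorted = latenciesMs.OrderBy(x => x).ToArray();
            double p99 = sorted[(int)Math.Ceiling(0.99 * sorted.Length) - 1];
            return (mean, Math.Sqrt(variance), p99);
        }
    }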

Genetic Algorithms

Genetic algorithms are so called because they tackle problems using a process similar to the controlled evolution of a species. The starting point is a set of random candidate solutions to a problem. To follow the Darwinian analogy, the problem is the environment in which a living species must thrive, while the candidate solutions to the problem represent the species itself. The genetic algorithm software forces the solutions to evolve and reproduce over many generations, with unsuccessful solutions being killed off. Those that represent the better solutions to the problem are then 'mated' in the expectation that the next generation will be even more effective. This process continues until the 'fittest' life form for the environment - or solution to the problem - evolves.
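
A bare-bones sketch of that loop (illustrative only, using an arbitrary real-valued encoding and a fitness function supplied by the caller) might look like this:

    using System;
    using System.Linq;

    public static class SimpleGa
    {
        static readonly Random Rng = new Random();

        public static double[] Evolve(int genes, int popSize, int generations,
                                      Func<double[], double> fitness)
        {
            // Start from a set of random candidate solutions.
            var pop = Enumerable.Range(0, popSize)
                .Select(_ => Enumerable.Range(0, genes).Select(__ => Rng.NextDouble()).ToArray())
                .ToList();

            for (int g = 0; g < generations; g++)
            {
                // Rank by fitness and keep the better half ('kill off' the rest).
                pop = pop.OrderByDescending(fitness).Take(popSize / 2).ToList();

                // 'Mate' survivors (crossover plus a small mutation) to refill the population.
                while (pop.Count < popSize)
                {
                    var a = pop[Rng.Next(popSize / 2)];
                    var b = pop[Rng.Next(popSize / 2)];
                    var child = a.Zip(b, (x, y) => Rng.NextDouble() < 0.5 ? x : y).ToArray();
                    int m = Rng.Next(genes);
                    child[m] = Math.Min(1.0, Math.Max(0.0, child[m] + (Rng.NextDouble() - 0.5) * 0.1));
                    pop.Add(child);
                }
            }
            return pop.OrderByDescending(fitness).First();
        }
    }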

Computer-driven reproductions of evolution were first attempted in the 1950s and 1960s and the application of genetic algorithms has since expanded into a wide range of spheres, typically related to timetabling, scheduling or optimisation problems, i.e. the optimisation of a function or a set of functions to specific criteria. In the financial markets, genetic algorithms have been used for solving a broad selection of problems, ranging from predicting movements in currency values to optimising equity portfolio composition.

What technology-based developments do you expect to yield greatest advances in automated trading?

Reeve: I think the benefits of complex event processing and event stream processing (CEP/ESP) have been over-hyped, while evolutionary algorithms may offer greater advantages in the future. We've developed code for single and multi-objective genetic optimisers and have found these very useful in solving problems. They have the potential to be applied to both alpha generation and execution algorithms and we expect to do further research into applying evolutionary algorithms to trading.

Genetic algorithms are very good at finding robust solutions because they're based on the evolutionary process, i.e. they create a range of solutions and the ones that are optimal thrive and the ones that don't work fall by the wayside. They're also good when used with noisy data. Even with a lot of different inputs, a genetic algorithm will still come up with a good solution reliably. An example of a multi-objective optimisation problem might be the construction of a portfolio with the highest possible returns and the lowest possible volatility. Clearly these are conflicting requirements that have a set of possible solutions rather than a single one. If you were to try to exhaustively search all combinations of portfolios that could be created from a universe of 2,000 stocks, it would be impossible to compute, but a multi-objective genetic algorithm could actually search the solution space and rank the set of optimal portfolios. That capability can be extended directly to trading to create more intelligent and adaptive trading algorithms.
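
The portfolio example hinges on Pareto dominance: one candidate 'beats' another only if it is no worse on both objectives and strictly better on at least one, and a multi-objective optimiser keeps the non-dominated set rather than a single winner. A minimal sketch of that test, with hypothetical field names:

    public struct Score
    {
        public double Return;       // objective 1: maximise
        public double Volatility;   // objective 2: minimise

        // True if this candidate is at least as good on both objectives and
        // strictly better on at least one - the Pareto-dominance test used to
        // rank the set of optimal portfolios.
        public bool Dominates(Score other) =>
            Return >= other.Return && Volatility <= other.Volatility &&
            (Return > other.Return || Volatility < other.Volatility);
    }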

CEP/ESP is a useful tool but doesn't have the benefits of evolutionary algorithms. It allows continuous queries in real time which can be used to implement simple algorithms, but that's as far as it goes, from a trading perspective. The problem with using rules-based strategies based on events derived from a fixed data window is that if the characteristics of the data change significantly, then trading performance can suffer. Markets are very changeable and to that extent all brittle rules-based systems must fail.

How will you move forward to leverage BlackCat's technology?

Breakwell: Now that the technology's in place, we're looking to build up the trading side of the business. The next phase is to establish an investment management business, creating the necessary infrastructure, finding an investment manager, building up a quant trading team to expand our trading activities, and taking on more capital. We believe the technology we've developed will provide us with a competitive edge as automated and algorithmic trading continues to expand. Quite separately, we are looking to establish another business to develop some aspects of the technology further from a commercial perspective.

This is a platform that can fully automate portfolios of hundreds of trading strategies, whereas a traditional fund manager would be limited to a few strategies, unless they had a very large team of traders. At the moment, it's a little like having a Ferrari and driving round country lanes on a Sunday afternoon; it's enjoyable, but it doesn't stretch it to its full potential. The next phase is to move to the race track and see what it can do.