In the days of limited computing and analytic power, 'same old' might not have been ideal, but it would often do. Today it definitely won't, as inefficiencies and opportunities are identified and exploited more quickly than ever. This throws up two major challenges:
• Multidisciplinary idea generation: greater agility is now needed in the development of new trading ideas, but in the interests of diversification this now needs to be accomplished across a wider range of disciplines, such as controls, filtering, statistics and artificial intelligence. This can easily result in massive fragmentation of development tools across disciplines, and a daunting learning curve for those thinking of exploring new techniques.
• Deployment timeline: rising time-to-market pressures also throw an unforgiving spotlight on the process by which algorithms (be they execution or alpha capture) actually make it into live market deployment. For many, the conventional methodology has been: 'Prototype in a high level language, deploy in a low level one.' Unfortunately, for some at least, that looks less and less feasible in an environment where opportunity windows continue to compress.
These challenges have gone hand in hand with a need - in terms of both personnel and business - to understand and develop increasingly analytical solutions in new areas of financial markets.
Execution algorithms and oversight
A classic example of a business area where this need for analytical capability has rapidly grown is in the development of execution algorithms. Once limited to relatively simple mechanisms such as TWAP or VWAP applied to single markets, these are now far more sophisticated. For example, liquidity sensing algorithms now need to be able to process multiple data streams from multiple venues (both lit and dark) to deliver optimal execution that also incorporates capabilities such as advanced anti-gaming.
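While execution logic has moved well beyond these benchmarks, the benchmark calculations themselves remain simple. A VWAP, for instance, is just the volume-weighted mean of trade prices - sketched below in Python purely for illustration (the article's context is MATLAB, where the calculation is equally terse):

```python
def vwap(trades):
    """Volume-weighted average price over an iterable of (price, volume) trades."""
    total_notional = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    if total_volume == 0:
        raise ValueError("no volume traded")
    return total_notional / total_volume

# Three illustrative prints: the result is pulled towards the heavily traded price
trades = [(100.0, 200), (100.5, 300), (99.5, 500)]
benchmark = vwap(trades)  # (20000 + 30150 + 49750) / 1000 = 99.9
```

A liquidity-sensing algorithm wraps logic of this simplicity in far harder problems: which venue's prints to trust, over what window, and how to avoid signalling its own intentions.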
This has driven a rapid growth in demand on the sellside for analytical tools that can quickly deliver and deploy this type of functionality. A similar situation applies to trading venues, which find themselves having to manipulate ever-larger data sets for tasks such as modelling load balancing or detecting trading abuses. At MathWorks we have seen firsthand evidence of this in a sharp increase in the use of MATLAB among non-investment bank broker-dealers and trading venues over the past year.
Conventional wisdom has it that the best possible performance (especially with these large data sets) can only be achieved with a compiled language, such as C or Java. However, depending upon the problem, that isn't always true - and certainly isn't true if any potential performance gain is outweighed by the time-to-market imperative.
Figure 1: Exponential curve fitting performance in MATLAB
For example, at MathWorks we recently encountered a trading desk aiming to implement a key trading analytic in real time: fitting an exponential curve of the form y = a·exp(b·x) to varying numbers of data points while achieving low-millisecond response times. Their initial approach assumed that code generation would deliver the fastest response times, but once they had benchmarked the MATLAB performance they pursued a component deployment route instead. (See Figure 1 for the results, culminating in 1 million points being fitted in less than a fifth of a second on a dual-core machine with 2GB RAM.)
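The desk's actual code isn't shown here, but a common fast route to this kind of fit is to linearise it: taking logs turns y = a·exp(b·x) into an ordinary linear least-squares problem, which reduces to one small matrix solve regardless of the number of points. A minimal sketch in Python/NumPy (an illustration of the technique, not the desk's MATLAB implementation):

```python
import numpy as np

def fit_exponential(x, y):
    """Fit y = a*exp(b*x) by linear least squares on log(y).

    Fast because log(y) = log(a) + b*x is linear in the unknowns,
    so the fit is a single least-squares solve. Requires y > 0.
    """
    x = np.asarray(x, dtype=float)
    logy = np.log(np.asarray(y, dtype=float))
    # Design matrix for log(y) = log(a) + b*x
    A = np.column_stack([np.ones_like(x), x])
    coeffs, *_ = np.linalg.lstsq(A, logy, rcond=None)
    log_a, b = coeffs
    return np.exp(log_a), b

# Recover known parameters from noise-free data, at the scale Figure 1 mentions
x = np.linspace(0.0, 5.0, 1_000_000)
y = 2.5 * np.exp(0.3 * x)
a, b = fit_exponential(x, y)
```

Note that log-domain least squares weights the points differently from a direct nonlinear fit; whether that matters depends on the noise in the data.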
The example above illustrates one component within a larger trading model, but there is also a more general trend away from the need to deploy complete trading models as compiled code. In part, this may be associated with the ever-greater separation between HFT and more analytical approaches. Much of the success, or otherwise, of HFT is now associated less with the speed of the analytics - which are not typically massively complex anyway - and more with the speed of the infrastructure, such as microwave technology.
The fact that a tool such as MATLAB, which has historically been seen as 'offline', now exhibits calculation performance as illustrated above is leading a growing number of non-HFT trading participants to use it in online mode for live trading. This radically compresses the deployment timeline by allowing just one set of code to be written, tested offline (backtested), tested online (broker live simulation environment), and then traded. Third party tools (such as IB-MATLAB) have emerged that further compress the process by providing streamlined connectivity to broker APIs, while earlier this year MathWorks released its first formal trading target (Trading Technologies' X-TRADER) in its Datafeed Toolbox.
A slightly different situation prevails in the case of larger enterprises. In some organisations, individual business units or trading groups are obliged to use only authorised financial libraries for certain activities to avoid unnecessary duplication of effort and/or as a risk management measure. In other organisations, the opposite applies: there is either a 'coding free-for-all' or there is a complete logjam, as limited low level programming resources are swamped with coding demands.
In both cases, if only compiled 3GL languages like C are used, there is often a resource squeeze when it comes to updating them or adding new functionality, which has an obvious knock-on effect on the timeline when deploying new trading algorithms or applications.
With this in mind, MathWorks is releasing a new product, the MATLAB Production Server, which aims to make component-based deployment of algorithms and analytics more straightforward and robust. Algorithm developers will use MATLAB to rapidly design, test and refine numerical analytics, before using the MATLAB Compiler to package MATLAB programs for deployment on the MATLAB Production Server. Application developers will integrate the .NET and Java client libraries included with the MATLAB Production Server into their enterprise applications or trading algorithms to make calls to packaged MATLAB components running on the MATLAB Production Server. Finally, system administrators can more easily maintain and support packaged MATLAB programs within a production environment through hot deployment, managed load balancing and backwards compatibility of the MATLAB versions.
With more data, more competition and shorter profitability cycles for trading strategies, another of today's challenges is simple processing horsepower. Even the fastest CPUs struggle when confronting large mathematical finance tasks. Fortunately, many of the processing tasks in the trading world - such as trading model parameter optimisation - are inherently parallel in nature, which makes them suitable for processing on a GPU. The difficulty is that low level GPU programming is not a trivial skill to master.
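As an illustration of why such tasks parallelise well, consider a toy parameter sweep for a moving-average crossover model: every grid point is evaluated independently, using only array operations, so the work can be farmed out across CPU cores or onto a GPU without changing the logic. A Python/NumPy sketch (the model, synthetic data and grid are all invented for illustration):

```python
import numpy as np

def ma_crossover_pnl(prices, fast, slow):
    """P&L of a long/flat moving-average crossover on a price series."""
    # Trailing moving averages via cumulative sums (vectorised, no Python loop)
    csum = np.cumsum(np.insert(prices, 0, 0.0))
    def ma(w):
        out = np.full(prices.shape, np.nan)
        out[w - 1:] = (csum[w:] - csum[:-w]) / w
        return out
    signal = (ma(fast) > ma(slow)).astype(float)   # 1 = long, 0 = flat
    returns = np.diff(prices) / prices[:-1]
    return np.sum(signal[:-1] * returns)           # yesterday's signal, today's return

# Synthetic price path with a slight upward drift
rng = np.random.default_rng(0)
prices = 100.0 * np.cumprod(1.0 + rng.normal(0.0002, 0.01, 2_000))

# Each grid point is independent - this dict comprehension could equally be a
# parallel map across processes, cluster workers or GPU kernels
grid = [(f, s) for f in (5, 10, 20) for s in (50, 100, 200) if f < s]
results = {params: ma_crossover_pnl(prices, *params) for params in grid}
best = max(results, key=results.get)
```

Because no grid point depends on any other, the speed-up from parallel hardware is close to linear in the number of workers, which is exactly the property that makes GPU offload attractive here.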
This is one reason GPU functionality was added to parallel computing tools in MATLAB, specifically the Parallel Computing Toolbox and MATLAB Distributed Computing Server. It operates via a special array type called GPUArray that supports a variety of functions such as Fast Fourier Transforms and matrix division. A key point is that the numerics are IEEE compliant with full single and double precision support.
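To give a flavour of this array-oriented style, the sketch below locates a dominant cycle in a synthetic series via an FFT, on the CPU with NumPy as a stand-in. The point of a drop-in GPU array type is that the same high-level call runs on the GPU once the input array lives there, with no kernel programming required (the example and its 8-cycles-per-unit signal are invented for illustration):

```python
import numpy as np

fs = 256                          # samples per unit of time
t = np.arange(1024) / fs
# Synthetic series: an 8-cycles-per-unit component buried in mild noise
rng = np.random.default_rng(3)
x = np.sin(2 * np.pi * 8 * t) + 0.1 * rng.normal(size=t.size)

# With a GPU array type, these same calls would execute on the GPU
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
```

The signal frequency falls exactly on an FFT bin here (frequency resolution fs/N = 0.25), so `dominant` recovers 8.0 despite the noise.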
However, an interesting point is that simply dumping all calculations onto GPUs may not deliver the best performance - a judicious combination that considers the real-world problem holistically can produce better results. A good demonstration of this arose at the MATLAB Computational Finance Conference in London earlier this year, where a customer described his initial expectation that his best-performing parallel solution would be entirely GPU-driven. On considering his overall application, algorithm and data sources, he realised that a hybrid solution offered advantages, as the benchmark in Figure 2 shows.
Figure 2: A user benchmark showing performance of a key simulation on GPUs, CPUs and hybrid GPU/CPU systems.
Getting up the curve
In an environment of continual change, a growing number of traders who have been using the same techniques successfully for many years are discovering the harsh reality of 'same old won't do', especially in markets where liquidity has declined. They therefore realise that they must learn new tricks if they are to survive as professional traders. They are aware of the opportunities of taking a more mathematically rigorous approach to trading, but in many cases lack the necessary skills and education, and do not have the time to remedy this. The software tools they need may be readily available as open source, but that alone is of little use. For instance, in areas such as econometrics it is extremely easy to misapply tests and draw incorrect inferences, with deleterious consequences for P&L.
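A classic instance of the danger is spurious regression: two completely independent random walks will routinely show an impressive-looking correlation in levels, which disappears once the series are differenced. A short Python illustration on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two completely independent random walks ("prices")
a = np.cumsum(rng.normal(size=5000))
b = np.cumsum(rng.normal(size=5000))

# Correlation of the levels frequently looks "significant" despite there
# being no relationship at all between the two series
corr_levels = np.corrcoef(a, b)[0, 1]

# Correlation of the differences (the actual innovations) is near zero
corr_diffs = np.corrcoef(np.diff(a), np.diff(b))[0, 1]
```

Standard significance tests assume stationary data, so applying them naively to the levels of trending series produces exactly the kind of incorrect inference described above; working in differences, or testing formally for cointegration, avoids the trap.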
This is one of the reasons why the MathWorks dedicated financial products development team has trebled in size over the last four years. In conjunction with this, we offer extensive training programmes, free seminars/webinars, comprehensive technical support and a vast collaborative user community. With this level of assistance, those without a quantitative background can make a painless transition to a more methodical and rigorous way of trading.
Conclusion: one stop
As the pressure for shorter model and application timelines increases, a growing number of quants and traders are becoming aware of the need to diversify not just by market and timeframe, but also by trading model. Furthermore, they realise that true model diversification needs to involve multiple disciplines. The snag here is application proliferation, which raises multiple issues around training, licensing, support and so on.
However, because MATLAB combines both the specific and the generic, this problem does not apply. Users can traverse a vast analytical space within one programming environment to achieve extensive diversification. Furthermore, this approach also makes it easy to combine multiple and diverse techniques within one interface and model, as Figure 3 illustrates.
Figure 3: A real-time trading application example in MATLAB, demonstrating the application of evolutionary algorithms, regression trees, neural networks and cointegration functionality to visualise, test and calibrate intraday trading strategies.
The presentations from the MATLAB Computational Finance Conference are available to view on demand at www.mathworks.co.uk/matlabcfc
For more information on MathWorks Algorithmic Trading capabilities, see http://www.mathworks.co.uk/discovery/algorithmic-trading.html
Neural networks have recently been experiencing something of a resurgence. Aly Kassam, founder of consultants Quantitative Support Services, examines why and how this is happening.
Neural networks have often been regarded as the Cinderella of finance. Their application seems to go in phases, with periods of bullish excitement followed by disillusionment and then a further period of obscurity.
In many cases, this is due to straightforward misapplication - users assuming that they can use neural nets for a task for which they are unsuited. A classic example of this has been the frequent attempts to use them for point prediction, such as the next closing price. This was commonplace in the early 1990s, with users developing extremely large and complex multi-layered neural network models as prediction algorithms. It didn't work.
More recently, attitudes and approaches to neural nets have improved, following the general trend across finance towards inference-based rather than predictive approaches, with the explosion of interest in Bayesian statistics being a classic example.
In the case of neural networks, this has resulted in their use for tasks such as classification, which play to their strengths. Users are now employing them to determine the state of a market from multiple variables that a human trader would be unable to assimilate. This information might then feed a decision such as which type of trading model to deploy, or which model parameters to use, given the current environment.
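As a concrete sketch of this classification use, the toy example below trains a small one-hidden-layer network, written from scratch in Python/NumPy purely for illustration, to label a 'market state' from two inputs. The features and regimes are invented; a real application would use a dedicated toolbox rather than hand-rolled gradients:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "market state" data: two features (say, realised volatility and trend
# strength) drawn from two well-separated regimes. Purely illustrative.
n = 200
X = np.vstack([rng.normal(-1.0, 0.5, (n // 2, 2)),
               rng.normal(+1.0, 0.5, (n // 2, 2))])
y = np.hstack([np.zeros(n // 2), np.ones(n // 2)])

# One-hidden-layer network: 2 inputs -> 8 tanh units -> 1 sigmoid output
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(500):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2).ravel()
    # Backward pass: gradient of mean binary cross-entropy
    dz2 = (p - y)[:, None] / n
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = dz2 @ W2.T * (1.0 - h ** 2)
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Classify the training regimes: near-perfect on this separable toy data
pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2).ravel() > 0.5
accuracy = (pred == y).mean()
```

The output here is a regime label rather than a price forecast - the inference-style use the text describes, as opposed to the point-prediction misuse of the early 1990s.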
MathWorks' Neural Network Toolbox is now seeing growing use in financial markets for this type of problem. Again, documentation and accessibility have been key to this, as have its object-oriented features.