Managing Director, Mondo Visione
"The problem is not technology itself, rather the realities of political and financial constraints."
Defining the scope of a 21st century market surveillance system remains elusive for regulators and traders alike, even as the shape of things to come emerges from the wreckage of the financial crisis. Not only are firms grappling with the technological demands of seeking alpha in an increasingly fast, data-driven game, but regulators too are harnessing new tools to maintain orderly markets. Anna Reitman reports.
Market surveillance has come a long way since spreadsheets were used to programme trading. In 2010, Mondo Visione, a publishing hub for information about exchanges and trading, began running surveillance seminars in London. At that time, how to monitor the super speed of high frequency trading was only just entering the debate.
"Trading has been going faster for as long as I can remember," said Herbie Skeete, Mondo Visione's managing director. "HFT is just using trading technology, it doesn't matter whether you are on a trading floor or using a machine, you have to obey the rules."
For regulators, it's a question of the kinds of systems necessary to keep up, but also about developing them with intelligence to cope with the dynamic nature of rules.
"The way people do things can change all the time, so you have to have systems that are flexible, and self-learning," he added.
Skeete's experience with developing exactly these kinds of systems began at Reuters in the early 90s, when he pushed for a user-driven approach to the company's surveillance engine. He also realised that systems used to detect fraud in other areas, credit for example, could be applied to financial market surveillance as well.
"Before, it was people like retired policemen looking for illegal trading patterns. Now there are more and more sophisticated trading systems. The problem is looking across markets and the sheer amount of data you need to look at to see if there is anything odd going on."
In terms of technological innovations in the space, Skeete points to Software AG's use of complex event processing. Speaking at the Spring Surveillance Seminar, Theo Hildyard, product marketing manager at Software AG, said that CEP is "extremely good at taking streams of data, detecting patterns within that data and acting upon them. In order to conduct effective market surveillance, a system needs to be able to do exactly that and perform at extreme scale as well as work with historical data and real-time data."
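Hildyard's description boils down to a detector that consumes a stream of events, matches patterns over a recent window, and acts when a pattern appears. A minimal sketch of that idea in Python (illustrative only; the class and field names are invented here, and this is not Software AG's engine):

```python
from collections import deque

class WindowedPatternDetector:
    """Toy CEP-style detector: keeps a sliding window of recent trade events
    and fires a callback when a simple pattern appears -- here, a burst of
    trades above a volume threshold within the window."""

    def __init__(self, window_size, burst_count, min_volume, on_alert):
        self.window = deque(maxlen=window_size)   # most recent events only
        self.burst_count = burst_count            # large trades needed to trigger
        self.min_volume = min_volume              # what counts as a "large" trade
        self.on_alert = on_alert                  # action to take on a match

    def on_event(self, event):
        """Process one streaming event; replaying historical data uses the
        same method, which is what lets one engine serve both purposes."""
        self.window.append(event)
        large = [e for e in self.window if e["volume"] >= self.min_volume]
        if len(large) >= self.burst_count:
            self.on_alert(list(large))
            self.window.clear()                   # reset so one burst fires once

alerts = []
detector = WindowedPatternDetector(window_size=5, burst_count=3,
                                   min_volume=10_000, on_alert=alerts.append)

stream = [
    {"symbol": "XYZ", "volume": 500},
    {"symbol": "XYZ", "volume": 12_000},
    {"symbol": "XYZ", "volume": 15_000},
    {"symbol": "XYZ", "volume": 11_000},   # third large trade in window
]
for event in stream:
    detector.on_event(event)

print(len(alerts))   # -> 1
```

A production engine differs in scale, not shape: the same match-and-act loop runs over millions of events per second, across venues, with far richer pattern languages.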
Budgets and bureaucracy
The problem is not, however, technology itself, explained Skeete, rather the realities of political and financial constraints. "It is not that difficult to build an effective surveillance system that can look at vast quantities of data, look across asset classes, markets…but it costs a lot of money to do it."
Little wonder, then, that IT budgets for regulators have been increasing. For 2015, the Securities and Exchange Commission requested $177 million for its Office of Information Technology, up $14.5 million from 2014, to "support a number of key IT initiatives, including enhancements to the system for receiving tips, complaints and referrals, improvements to IT security, and infrastructure upgrades to achieve efficiencies in business operations and reduce long-term costs".
In its budget request, the Commodity Futures Trading Commission said that "the surveillance mission activity is by far the most technology intensive of all of the regulatory activities" representing 19 percent of the total requested resources, or $53.5 million, including $20 million in information technology.
The UK's Financial Conduct Authority increased its Information Systems budget request to £88.2 million ($147.2mn) for 2014/15, up just over 15%. The European Securities and Markets Authority's IT work programme for 2014 is budgeted at €7.38 million ($9.9mn), up 34% from 2013.
Martha Winata, product manager at Cinnober, said that the problem has more to do with the many different jurisdictions, authorities with overlapping or unclear spheres of control, and the myriad rules that apply.
For example, tracking a suspicious trade from regulator to exchange to firm, even within the same jurisdiction, requires information sharing, which Winata does not see as being particularly effective in the current landscape.
"It is not a technical barrier or limitation [on the technology]. [It is] the inability for multiple parties to cooperate on the same level to detect the same kind of patterns and activities related to surveillance and also being able to communicate in the same granularity," she said. "I don't think that issue gets solved any time soon at all."
For brokers and exchanges meanwhile, surveillance budgets may be competing with business interests, said Michael Grecoff, head of sales for Cinnober's market surveillance unit, and a former financial regulator in Canada.
Global Head, SMARTS, Nasdaq OMX
"It is still early days for automated trading in terms of what the patterns are."
"There is a reason for a broker or exchange to spend more on business (versus) regulatory reasons. But (surveillance) should never be an afterthought," he said.
Cinnober is a tier one provider of real-time surveillance technology to exchanges, regulators and clearing houses. Some of the firm's biggest clients include the Stock Exchange of Thailand and Deutsche Börse, as well as Qatar Exchange and the derivatives side of the London Stock Exchange.
Exchanges, noted Grecoff, do tend to have larger budgets on a per market basis for surveillance compared to large banks, which can have potentially 50 or 100 markets around the world to monitor.
Some of the most bureaucratic issues arise in terms of sharing identified patterns of suspicious activity with different, sometimes competing, markets - a reality that has led to a variety of initiatives.
MillenniumIT, which provides trading technology to the London, Singapore, and Hong Kong Stock Exchanges, along with over 30 other global venues, recently released the "Algo Alert Library" as an add-on to its existing surveillance system. The free tool aims to spare customers the costly routine upgrades associated with adapting to market trends and changing regulations.
The library currently contains 20 alerts for suspicious trading activity such as pump and dump, flash orders, driving prices up or down, among others. Fuard Ahamed, project lead for MillenniumIT, said the goal is to include more alerts as new regulations provide clarity on other kinds of activities, particularly those associated with HFT.
"The implementation is centrally managed but once the alert is downloaded it begins its own life in the customers' systems," he said.
Different exchanges can share their own alerts through a portal - so if the LSE detects a particular pattern in the cash equities segment, any other market participant or exchange using MillenniumIT surveillance can have access to the resulting algorithm as well. For now, the firm is looking to promote uptake of the tool but as its use grows, data analysis could provide major insight into market practices across jurisdictions.
James Lam, founder and president of consulting firm James Lam and Associates, as well as a member of E*TRADE's risk committee and the former chief risk officer at Fidelity, said that cloud-based collaborative reporting from many perspectives could be extremely useful in making sense of enterprise risk as a whole.
"Just think about the data environment we operate in. How do we create a map in terms of what is going on with respect to investment risk, market and liquidity risk, credit counterparty risk but also operational and regulatory compliance risk?" said Lam.
If all this seems purely in the realm of regulators, it's worth noting that the same principles apply to alpha-seeking buy side firms. In the past, companies looked at VaR and tracking error, but only from an analytic perspective. Lam said firms also need to integrate forward-looking alpha-at-risk analysis with backward-looking performance attribution analysis.
"Whether that is stock selection, or concentration in any specific industry, or asset allocation - what are the sources of alpha and what are the key risks that could drive variability in the alpha?" Lam said. "Buy side firms need to integrate those two perspectives as a feedback loop."
Lam added that looking back across major market disasters - such as Knight Capital's software malfunction, the sub-prime mortgage crisis, or even back to the mid-90s when Barings collapsed - visibility was a big part of the problem.
"It is really hard to get a good sense of what the IT risks are, because you can't see it," he said. "It was difficult for boards and management to really see what the risks and leverage of those vehicles (mortgage-backed securities) were because they were highly structured and usually off balance sheet."
On a far more granular level, alpha risk could be something like a dropped network packet, explained Jacob Loveless, CEO of Lucera, a spin-out from Cantor Fitzgerald's former HFT unit. Lucera builds trading infrastructure across the whole stack and makes it accessible to clients. The firm recently began providing electronic trading capabilities to the FX market with its matching engine, LumeFX.
Suzy Moat and Tobias Preis
Loveless explained that if a downstream system is under stress, it will drop a message intentionally and request a re-transmit. If a firm has real-time measurements on that, it will know about the failure long before an audit trail and be able to react intelligently - whether that means pausing and applying brakes or changing the amount of trades being sent out.
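The kind of real-time measurement Loveless describes can be sketched as a rolling retransmit-rate check that tells the sending side to apply the brakes before the downstream system fails outright. All names and thresholds below are assumptions for illustration, not Lucera's API:

```python
from collections import deque

class RetransmitMonitor:
    """Toy sketch: track the recent fraction of messages that the downstream
    system asked to have re-transmitted, and signal the sender to throttle
    when that fraction rises above a tolerated level."""

    def __init__(self, window=100, max_ratio=0.05):
        self.events = deque(maxlen=window)   # 1 = retransmit requested, 0 = delivered
        self.max_ratio = max_ratio           # tolerated fraction of drops

    def record_send(self, retransmitted):
        self.events.append(1 if retransmitted else 0)

    def should_throttle(self):
        """True when the downstream system is visibly under stress."""
        if not self.events:
            return False
        return sum(self.events) / len(self.events) > self.max_ratio

monitor = RetransmitMonitor(window=100, max_ratio=0.05)
for _ in range(95):
    monitor.record_send(retransmitted=False)   # healthy traffic
for _ in range(10):
    monitor.record_send(retransmitted=True)    # a burst of drops

print(monitor.should_throttle())   # -> True
```

The point is the timing: this signal is available the instant the retransmit request arrives, long before any end-of-day audit trail would reveal the problem.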
"Automated systems mean operational risk and anything you can do to reduce operational risk is a big win," he said. "Knight Capital more than anybody else has shown what is the cost of risk and as we get to more of an electronic world people aren't trading one asset, people are trading multiple assets and every time you hook up another venue you are introducing operational risk and adding complexity."
Project Lead, Millennium IT
"The implementation is centrally managed but once the alert is downloaded it begins its own life in the customers' systems."
Electronification and speed
In the 1990s, Australian firm SMARTS recognised that the emerging trading landscape was electronic, and bet on surveillance moving the same way. In 2007, it launched a broker product, and in 2010 Nasdaq OMX acquired the group. Rob Lang, global head of SMARTS at Nasdaq, said that the market needs to be looked at from three perspectives: those of brokers, exchanges and regulators.
"It has long been our claim that the market is fairer under that kind of an approach," he said. "Cross market surveillance has been a major trend over the last few years, and while we pioneered it with our broker product, it became equally important for regulators and exchanges."
In terms of automated trading however, Lang said "the jury is still out" but an important first step is to identify exactly what is and isn't automated trading in order to properly identify triggers. He added that regulators are taking those first steps by insisting on identifying computer trading through unique account names or applying a flag on the order.
"It is still early days for automated trading in terms of what the patterns are, and looking at what the different consequences of trading are," he said. "With recent focus on HFT, if machines and algorithms are doing the trading, if it is happening fast in the trading engine, we can keep up on the surveillance engine. We have been doing it for 20 years now, the speed doesn't matter."
Measuring market meddling
Meddling in markets deserves some monitoring as well. Mondo Visione's Herbie Skeete pointed to the "Market Quality Dashboard" as a good example of technology being used to keep an eye on the impact of the rules themselves.
Developed by Capital Markets Cooperative Research Centre, it is designed to allow market participants to quantify the economic impact of market design changes on market quality - defined by reference to the near universal mandate of regulators, which seeks to ensure that markets are fair and efficient.
Users can analyse, for example, how a reduction in tick sizes a few years ago impacts on the price paid for demanding liquidity today.
The concerns, he added, have more to do with fears of more flash crashes, and what matters most in that regard is to have an early warning system to avoid the situation in the first place.
One of the ways SMARTS monitors the market is to watch for what the development team calls "impending doom" alerts, meaning identifying an order book situation where any mistake carries a disproportionate amount of risk for a period of time.
"For some customers, there are so many untraded orders in the book at such a depth that if someone made a mistake it would dump a lot of orders. So an alert is generated so they can at least know we are in a risky situation," he said.
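The resting-depth check Lang describes can be sketched roughly as follows: sum the notional value of untraded orders sitting near the last traded price, and flag when a single mistake could sweep a disproportionate amount of it. The thresholds and book structure here are illustrative assumptions, not SMARTS parameters:

```python
def resting_depth_alert(order_book, last_price, band=0.01, max_notional=5_000_000):
    """Toy 'impending doom'-style check: sum the notional value of untraded
    orders resting within a price band around the last trade, and flag when
    an erroneous order could sweep an outsized amount of that depth."""
    lo, hi = last_price * (1 - band), last_price * (1 + band)
    at_risk = sum(price * qty
                  for side in ("bids", "asks")
                  for price, qty in order_book[side]
                  if lo <= price <= hi)               # only orders near the touch
    return at_risk > max_notional, at_risk

# Hypothetical book: (price, quantity) pairs resting on each side.
book = {
    "bids": [(99.9, 20_000), (99.5, 30_000)],
    "asks": [(100.1, 25_000), (101.5, 10_000)],   # 101.5 sits outside the 1% band
}
alert, exposure = resting_depth_alert(book, last_price=100.0)
print(alert, exposure)
```

Here roughly $7.5 million of resting orders sit within 1% of the last trade, so the alert fires: the book is in the "risky situation" Lang describes, where one fat-fingered order would trade through a lot of depth at once.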
Future challenges, he added, lie in dealing with increasing levels of unstructured data as the trend moves towards social and electronic communications and voice data. Meanwhile, one hurdle is identifying the same individual across many different databases. And it's not just about collecting the data: there is intense interest in being able to mine it, which is why big data technologies are so relevant to surveillance.
The SMARTS database is SaaS-based and designed for processing order books, organised in a way that makes it possible to replay the market and designed so that there is no performance degradation as more data is added. Some customers, Lang explained, have 10 years or more of data stored in the system. "This is one place where our customers have an incredible wealth of information and they want to start using it for more than just surveillance," he said.
In other words, what's useful for regulation and compliance is just as useful for trading firms looking to find actionable signals in the world of unstructured data.
"Automated systems mean operational risk and anything you can do to reduce operational risk is a big win."
Of needles and haystacks
A swathe of firms analysing social media or trends on search engines has sprung up, offering a variety of behavioural sentiment analysis services; established players such as Bloomberg and Thomson Reuters are notable among them. Other firms, such as Deltix and RavenPack, are teaming up to provide sentiment analysis services for quantitative trading strategies. Not surprisingly, risk management firms are paying close attention; OptiRisk Systems, for example, recently hosted a conference bringing together some of the leading practitioners and researchers in the field.
Tobias Preis, associate professor of Behavioural Science and Finance at Warwick Business School, presented his findings on how online behaviour relates to market behaviour. A project started in 2009 focused initially on Google searches for the names of S&P500 companies.
As it turned out, the relationship was strong and specific - how often people are looking for a company name was strongly positively correlated with the liquidity of that company's stock in the same week. In 2011, Preis joined forces with Suzy Moat, assistant professor of Behavioural Science at Warwick Business School, and began to investigate whether price movement direction could also be predicted.
Using a virtual portfolio, Preis and Moat's team ran a strategy based on keywords such as "debt". If searches for such a term increased in a given week compared with previous weeks, the team hypothesised a pessimistic mood, so they sold the underlying market and held the position for one week. Conversely, if search volume fell compared with previous weeks, they opened a long position at the closing price at the beginning of the following week and held it for one week.
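On synthetic data, the weekly rule Preis and Moat describe can be sketched as follows. The real study used Google Trends search volumes and index closes; the numbers below are made up purely for illustration:

```python
def trade_on_search_volume(volumes, prices, lookback=3):
    """Each week, compare search volume to the average of the previous
    `lookback` weeks: rising interest -> sell (pessimism), falling
    interest -> buy. Each position is held for one week; returns the
    cumulative return of the virtual portfolio (1.0 = flat)."""
    cumulative = 1.0
    for t in range(lookback, len(prices) - 1):
        baseline = sum(volumes[t - lookback:t]) / lookback
        weekly_move = prices[t + 1] / prices[t]      # next week's gross return
        if volumes[t] > baseline:
            cumulative *= 2 - weekly_move            # short: gain when price falls
        else:
            cumulative *= weekly_move                # long: gain when price rises
    return cumulative

# Hypothetical weekly search volumes for a term like "debt", and index closes.
volumes = [10, 11, 10, 18, 9, 17, 8]
prices  = [100, 101, 102, 99, 103, 100, 104]
print(round(trade_on_search_volume(volumes, prices), 4))
```

The baseline comparison is the whole trick: the signal is not the search volume itself but its change relative to the recent past, which is read as a shift in collective mood.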
Speaking to Automated Trader, Preis said the "very striking pattern" was that the return was higher the more financially relevant the term was - as measured by how often those words appear in the Financial Times compared to the Internet as a whole, which the team called the "Financial Times Index". So words like debt, crisis, or financial market, yielded more return than generic terms. The term "debt" came out on top, increasing the value of the virtual portfolio by 326%.
Moat and Preis then extended the Google-based search term strategy to Wikipedia, using the number of views a particular Wikipedia article received, and found a strong correlation there as well. This provided additional support for the initial hypothesis that increases in attempts to gather financial information are linked to subsequent stock market losses in the coming week.
Still, for some reason words like "color" ended up correlating strongly with higher returns too. "It's important to bear in mind that there is a certain level of noise in the Google signal. For this reason, we look at the distribution of returns across these different keywords, and find that the higher the financial relevance of a word, according to our Financial Times Index, the greater the return our trading strategy would lead to," Moat said.
The project has received its share of criticism, but the theory is not coming from a purely academic perspective.
In 2007, Preis founded the hedge fund Artemis Capital Asset Management, which is applying ideas emerging from the complexity science communities. "We are really excited about new ways to quantify human behaviour on large scales, given the ever increasing amount of data our daily interactions with technology are creating. A range of our algorithms apply some of the insights we have gained around search volume behaviour and human online activity in general," he said.
The translation from research to real money, however, comes with a number of caveats. For one, firms wanting to trade a simple strategy like this will have to be far cleverer than just following the published keywords. "Simple strategies are profitable but slightly weakening, reflecting that this relationship might be being exploited by a number of financial institutions," he said.
Moreover, words that are meaningful to market behaviour change over time, so any real-world portfolio would have to adapt. Some practitioners that Preis and Moat have worked with have used their published work as the starting point for some "clever solutions", he added. A recent study in collaboration with Chester Curme, research fellow in Data Science at Warwick Business School, may provide further inspiration, by demonstrating that data on topics with less obvious relations to the financial markets - politics, for example - can also be successfully exploited.
Regulators too are invested in improving economic and financial forecasts, while assessing risk. Preis recently spoke at a big data workshop hosted by the European Central Bank, and noted that they are currently in the process of developing collaborations with big data stakeholders such as Google.
"The ECB, and also other central banks across Europe, are extremely interested in forecasting models which utilise records of internet activity. A range of studies have shown that there is potential to create quicker estimates of key indicators, such as unemployment rates and house prices, by drawing on online data," he said.