Chris Smith, Head of Post-Trade Services, Trax
The Hollywood legend Spencer Tracy once made a wry remark about keeping his profession in perspective: "Acting is not an important job in the scheme of things. Plumbing is."
The importance of plumbing is also not lost on sophisticated trading firms in today's post-crisis markets. A plethora of regulation-driven infrastructural changes in recent years has dramatically changed the way markets function, requiring huge investments in new reporting systems, placing far greater emphasis on the traditionally less glamorous back office and offering both buy-side and sell-side players opportunities to compete in entirely new ways.
Firms seeking to work efficiently, comply with regulatory rules and introduce even greater levels of automation have no choice but to make major system decisions based on this new infrastructure. What's more, financial executives say a great deal more change will be needed if the new plumbing is to deliver what regulators intended in the OTC derivatives space it was designed to address.
'A completely different industry'
Chris Smith, head of post-trade services at MarketAxess subsidiary Trax, is a 30-year market veteran who can remember what it was like when markets were far, far simpler.
"I've been around a long time and I was looking back at some things I was doing in the early 90s and it's amazing how the volume of exchanges was so much lower and the infrastructure so much simpler. In that time, I was working with a large investment manager, running what was known as the middle office and trying to cope with the amount of paper even then. And then you scroll forward to 2015 and look at the volume, the complexity, the different instruments and asset classes that firms cope with - it's a completely different industry."
Executives noted three important times in recent decades when market infrastructure changes came to the fore. The first of those began in the late 1980s, when electronification began to gain momentum. The second came with the market harmonisation efforts around 2006-2007, with the first MiFID regulation changing the landscape for European equity trading. The final period is the post-crisis era, during which international regulators have sought to transform the nearly $700 trillion OTC derivatives market into something much closer to exchange-based trading while at the same time creating a whole new level of trade reporting.
[Chart omitted. Source: Association for Financial Markets in Europe]
But post-trade and market infrastructure experts say that many firms have not grasped the nettle when it comes to adapting to the changes that have been wrought.
"I think we are still in the learning phase of what the regulators are trying to do," Smith said, noting the level of complexity of what has been introduced.
"What you're seeing is that complexity has not really come through yet in the sophistication of the systems put in place to deal with the regulatory change. In other words, you've put in place what you think is the bare minimum for your firm to meet that regulation. Now as an industry what we have to do is take a step back and say, 'What were we trying to achieve and what tweaks, what changes, what modifications can we make to have a better outcome?'"
If the market is to achieve that 'better outcome', firms will need to focus on some issues that go beyond the walls of individual entities. Market watchers say that first and foremost, regulators and industry bodies will need to make greater progress in establishing standards. Beyond that, the industry will need to become increasingly collaborative. Finally, firms will need to take a more holistic view of what regulators are trying to do and what the new infrastructure requires.
Gavin Slater, head of product development at infrastructure specialists Stream Financial, said that while the new market infrastructure as it currently stands needs improvements, it should not be a question of going back to the drawing board.
Gavin Slater, Head of Product Development, Stream Financial
"It's definitely not a case of starting from scratch," Slater said. "There are some really good things that have come about because of this and one of the key things is this move towards standardisation and towards specifying what the standard fields are that need to be reported."
For instance, work to create a standard library of legal entity identifiers (LEIs) holds the promise of allowing much greater automation. "That is very important work and that work will stand us in very good stead going forward," Slater said, adding that without the regulatory pressure the industry would probably not have been able to achieve such gains.
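The standardisation gain here is concrete: an ISO 17442 LEI is 20 alphanumeric characters whose last two are check digits computed with ISO 7064 MOD 97-10, so any system can validate an identifier before it ever reaches a repository. A minimal sketch in Python (the sample identifier in the comment is simply one with a valid checksum):

```python
def lei_is_valid(lei: str) -> bool:
    """Validate an ISO 17442 Legal Entity Identifier.

    A LEI is 20 alphanumeric characters; the last two are check digits
    verified with ISO 7064 MOD 97-10: map letters to 10-35, keep digits,
    and the resulting number modulo 97 must equal 1.
    """
    lei = lei.strip().upper()
    if len(lei) != 20 or not lei.isalnum():
        return False
    # int(c, 36) maps '0'-'9' to 0-9 and 'A'-'Z' to 10-35.
    digits = "".join(str(int(c, 36)) for c in lei)
    return int(digits) % 97 == 1

# e.g. lei_is_valid("HWUPKR0MPOU8FGXBT394") -> True (valid checksum)
```

Because the check digits travel with the identifier, a mistyped LEI can be rejected at the point of entry rather than surfacing later as a reconciliation break.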
The financial industry has a long tradition of regulator-bashing, but executives such as Slater said market participants should not lose sight of the tangible benefits on offer. "Are we reaping the benefits yet? No. That's part of the problem," he said.
Steve Grob, head of group strategy for Fidessa, said he has always been "pro-competition" and that allowing market-based choices works extremely well for venues and trading, giving the marketplace a range of trading experiences to choose from. But he added: "When you're trying to glue it all together at the back end that is absolutely where you want one standardised approach, particularly if your objective is to minimise risk. This would probably be one area where the regulators would be entitled to say, 'We want it done this way.'"
In fact, while automation in the front office has become commonplace, the back office still has a long way to go. As in the past, FIX Trading Community is expected to play a leading role.
"FIX in the front office has been around for many years. Automating the order process is not unusual anymore; in fact, it's unusual if you don't automate the order process. Automating the whole post-trade, pre-settlement, regulatory reporting area, that's new. That's something the industry is just working on. We have to start looking at standards," Smith of Trax said.
Steve Grob, Head of Group Strategy, Fidessa
Grob noted the crucial role clearing houses play in the new market architecture and said hashing out standards in this area will be vital. For instance, different clearing houses use different risk models.
"In a world of interoperability, it's hard enough for a clearing house to do margin offsets within its own walls; imagine doing that with other CCPs that are operating on other margin methodologies and risk methodologies. I think there's a lot of complexity that still needs to be worked through," Grob said.
Margin offsetting is one of the prime areas where infrastructural changes can make a big difference.
"I think specifically within derivatives it's interesting because it's all about margin. Clearing and margin is the name of the game," Grob said. "Because the cost of capital is so high as a result of Basel III, it means that firms, buy- and sell-side, are trying to find ways that they can optimise the capital that they commit to trading."
As a result clearing houses are working to offset margins between long and short positions as well as between different contracts such as swap futures and traditional futures.
"You're seeing all the clearing houses having to solve for how they can do margin offset within their own silos and then potentially with competitors' silos," Grob said.
Automating systems in this regard poses some new challenges.
Grob gave the hypothetical example of a market player considering the interest rate curve and wanting to protect exposures from a particular start and end date. "You will now be able to see a number of economically equivalent but non-fungible contracts that can achieve that objective for you."
That spectrum includes futures, swap futures, constant maturity futures, standardised OTC swaps and traditional OTC products.
"You might want to select the contract not necessarily based on the best price you can get but the ones that might offer the greatest margin efficiency," Grob added.
"And from a technology perspective, that's a really interesting problem to solve for, because you're kind of reversing the work flow. Typically the work flow goes uni-directional from front to back, but what you're trying to do here is extract information from the back end clearing in terms of relative positions and to represent that in the front end. And that's quite a tricky thing to do technically because it's not the traditional way you would build for that."
Standards have become such a major issue mainly because of new regulatory requirements. One of the biggest changes to market infrastructure occurred because of the G20-mandated rule that called for OTC derivatives trades to be reported to newly created trade repositories.
[Chart omitted. Source: JWG Group]
The idea, Slater said, was that regulators would have quick access to information to work out how to respond in times of crisis.
"So the concept of course is very good. The reality is somewhat different," he said. "If you look at the problems being experienced at the moment, there is still a very low percentage of trades between the trade repositories that are being effectively reconciled. Part of that is due to different standards being applied in different regimes."
For example, the US has excluded exchange-traded deals from reporting, whereas European legislation calls for exchange-traded transactions to be reported. "So there are inconsistencies between the legislation and that's also exacerbated by problems like the fact that some of the trade repositories are just unable to handle the data because of technical problems," Slater added.
DTCC, for instance, was reported last June as saying that only 30% of inter-depository OTC derivatives were being paired successfully, prompting the Futures Industry Association to call on European regulators to issue better guidance. By year-end, DTCC had teamed up with vendor TriOptima on a method to improve reconciliation of trades reported to its repository.
Smith of Trax said standards always end up playing catch-up with market developments, and here the industry needs to work together.
The T+1 report that goes to European regulators today is typically produced in batch files, but these come in numerous formats, some of which have been around for many years. "We are starting to move that into a real-time reporting capability and for that we're going to need to start developing an XML schema and if possible, I think the industry would like to coalesce around a standard - possibly based on FIX," Smith said.
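Whatever schema the industry settles on, a FIX-style report is mechanically simple: tag=value pairs delimited by SOH, framed by a BodyLength (9) field and a mod-256 CheckSum (10). The sketch below uses standard FIX 4.4 Trade Capture Report tags with entirely dummy values; it makes no claim about Trax's actual format:

```python
SOH = "\x01"  # FIX field delimiter

def fix_message(pairs) -> str:
    """Assemble a FIX tag=value message, computing BodyLength (tag 9)
    and CheckSum (tag 10, sum of bytes mod 256) per the FIX spec."""
    body = SOH.join(f"{tag}={value}" for tag, value in pairs) + SOH
    head = f"8=FIX.4.4{SOH}9={len(body)}{SOH}"
    checksum = sum((head + body).encode()) % 256
    return f"{head}{body}10={checksum:03d}{SOH}"

# Illustrative Trade Capture Report (MsgType AE); all values are dummies.
msg = fix_message([
    ("35", "AE"),             # MsgType: Trade Capture Report
    ("571", "TRX-0001"),      # TradeReportID (made up)
    ("55", "XS0000000001"),   # Symbol / ISIN-style identifier (made up)
    ("32", "1000000"),        # LastQty
    ("31", "99.45"),          # LastPx
    ("75", "20150610"),       # TradeDate
])
```

The point of a shared framing like this is that both reporting parties and the repository can parse and verify the same message byte-for-byte, which is what batch files in proprietary formats cannot offer.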
But industry collaboration goes beyond the question of setting standards.
As Smith tells it, each firm ends up understanding regulatory objectives in its own way.
"But what we actually need are the key service providers to come together and come up with a better solution that benefits the industry, because we know what the regulators are trying to ultimately achieve. They're trying to get quality information, they're trying to get understandable information, information that will allow them to fulfill what they're setting out to do, which is to run efficient markets."
Trax and its parent company MarketAxess are talking with a number of large banks about the challenges of MiFID II and the changes it will bring. "The challenges of the information that the regulators require gets very complex and ... coming together as an industry is resonating with the banks we're talking with," Smith said.
Smith notes that in the world of OTC trading, where some trades are done over the phone or via chat systems, there is potentially less agreement over what has been bought and sold than in exchange-based front-office activity, which is essentially electronic. To help firms catch mistakes before trade reporting takes place, one solution is for the industry to converge on best practice for which fields to focus on.
In other words, trading controls are not just about the approximately half a dozen fields that are captured in a trading system but can involve the 30-50 fields that are needed to settle and/or report a transaction. "And the best way of doing that is to make sure that your view of those 30-50 fields is the same as your counterparty. If you've got that agreement between the two of you then you're off to the races and the quality of the data that flows through the systems is going to be higher quality," Smith said.
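A pre-reporting control along the lines Smith describes is essentially a field-by-field comparison of two views of the same trade, flagging any disagreement before the report goes out. A sketch, with hypothetical field names and dummy values:

```python
def match_trade(ours: dict, theirs: dict, required_fields) -> list:
    """Compare two counterparties' views of the same trade field by
    field and return the discrepancies to resolve before reporting."""
    issues = []
    for field in required_fields:
        a, b = ours.get(field), theirs.get(field)
        if a is None or b is None:
            issues.append(f"{field}: missing on one side")
        elif a != b:
            issues.append(f"{field}: {a!r} != {b!r}")
    return issues

# A subset of the 30-50 fields needed to settle and report (illustrative).
REQUIRED = ["isin", "quantity", "price", "trade_date",
            "settlement_date", "counterparty_lei"]

ours = {"isin": "XS0000000001", "quantity": 5_000_000, "price": 101.25,
        "trade_date": "2015-06-10", "settlement_date": "2015-06-12",
        "counterparty_lei": "AAAA00000000000000A1"}
theirs = dict(ours, settlement_date="2015-06-15")  # one disputed field

issues = match_trade(ours, theirs, REQUIRED)
```

An empty result means the two firms agree on every required field; anything else is caught while the trade can still be corrected, rather than surfacing as a repository reconciliation break.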
Just as a firm may need to widen its perspective on what fields to focus on for its trading controls, it similarly may need to take a broader perspective about what is actually happening on the regulatory front. Large banks tend to look at one regulatory issue at a time rather than identifying core themes behind multiple legislative changes.
"They're looking at it regulation by regulation and saying what do I need to build to address this regulation? And then you get the technology guys going off and building something for that regulation. And then five desks over, there's a separate technology team building something for another regulation which is fundamentally doing a similar thing. So that's why they're not reaping the benefits," Slater of Stream said.
Most of the regulation, he said, is based on the idea that banks are able to look across their entire portfolio in an organisation-agnostic way, which means that the geography or business line that books a deal does not matter. Market watchdogs want to be able to see data holistically across an organisation and to be able to aggregate it based on whatever questions they're asking.
Put another way, if a firm is looking to understand an exposure to a particular product or an industry or a country, it needs to know how it can pick up all that information scattered across different legal entities, geographies and businesses and group it in some way quickly.
"And the problem is, we don't know what the next question (from regulators) is. So what they're saying is that they're using this legislation as a way of pushing banks towards ensuring that they can look at all of their trade information and be able to aggregate it very quickly up any dimension," Slater said.
[Chart omitted. Source: Strategy&, part of the PwC network]
That brings the discussion back to standardisation because that in turn allows access to information at the lowest level of granularity.
Slater said firms try to look at regulatory requirements and systemic questions from a strategic perspective, but not necessarily from a holistic one. "The problem is a lot of them are still using a traditional centralised model," he said. "We believe that information should be standardised, it should be stored at the lowest level of granularity, in a way which allows you to aggregate it very quickly."
But he added that putting all of that information in one central physical location is difficult because it takes it out of the hands of the people who understand the actual exposures best.
"And that's the key thing. It's not just being able to aggregate the information and look at it across a particular dimension, but usually the next step is you have a question on that. And if you've got a question, you've got to go ask the people who own that exposure. And they can go back to the individuals who own the positions and have that conversation with them about what does this exposure mean, how quickly can we get out of it and how quickly can we respond. And that's why it's important to have that data back at source with the original owners rather than in a centralised location."
Slater said data management will therefore be a key issue when considering the new architecture.
Tony King, UK Operations, Violin Memory
Tony King, who manages UK operations for Violin Memory, sees no let-up in the volume of financial data that needs to move around as a result of market infrastructure changes. "The complexity of what people want to do with that data is increasing all the time, and the speed that they want to do that is increasing all the time," King said.
"There are pressures coming at the storage of data from every angle," he said. "One side of it is reporting into the business to make sensible decisions, and the other side is providing information to the regulators who have increasing power and can come at you at any point of any day and ask for the information - you've got to be able to turn it round fast to meet the rules, whatever the rules might become. Otherwise you can tie up hundreds of people just keeping the regulator happy."
Slater added: "You need to look at the information at the lowest level of granularity. Even if it's scattered over many different systems, it's still a lot of data."
The new architecture may sound like a lot of cost and aggravation. But technologists often like to adopt the glass-half-full perspective. As such, they see plenty of scope for greater and more sophisticated automation, and by extension, for more cost savings.
"Automation is at the heart of this because what's happening is the regulations are basically pushing towards people looking at things in near real time. And where this is going to end up, a lot of the regulatory rules are actually going to want you to apply them pre-trade," Slater said. "The regulator is not going to be asking you after the event. They're going to be asking you to include these regulatory rules as a pre-trade check."
For example, the so-called Volcker rule designed to prevent too-big-to-fail banking institutions from engaging in risky proprietary trading will need to be addressed systemically.
"That sort of rule needs to be baked into part of the regulatory process," Slater said.
Trading firms will thus need to look at how they adapt to the new architecture across their own organisations.
"One of the things that I'm pushing very heavily is the idea that all the regulatory stuff has historically been treated as a back-office responsibility and what we're seeing now is the regulatory rules basically moving from the back office to the front. And that requires a vastly different infrastructure," Slater said.
The move to real time, and eventually to pre-trade, means checks cannot be done via a back-office infrastructure. "What you need is to do it the opposite way round, a complete 180 degree shift in the way things work in regulatory reporting."
[Chart omitted. Source: Association for Financial Markets in Europe]
Regulatory rules will need to be coded as a set of metadata, with logic that can be pushed from a central place. It all adds up to a more federated type of infrastructure where rules are embedded into various front office trading systems and run live. "So it's a complete shift and moves you much more towards the exception-based processing approach," Slater said.
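Rules-as-metadata with exception-based processing might look like the following sketch: limits are plain data that can be pushed from a central place to each trading system, and the pre-trade check returns only the exceptions. Rule ids, fields and limits are all invented:

```python
# Regulatory rules expressed as metadata, not hard-coded logic. A central
# function can distribute this list to every front-office system.
RULES = [
    {"id": "max-notional", "field": "notional", "op": "<=", "limit": 50_000_000},
    {"id": "approved-venue", "field": "venue", "op": "in", "limit": {"MTF-A", "SEF-B"}},
]

OPS = {
    "<=": lambda value, limit: value <= limit,
    "in": lambda value, limit: value in limit,
}

def pre_trade_check(order: dict, rules=RULES) -> list:
    """Evaluate every rule against an order pre-trade; return the ids of
    breached rules. An empty list means the order passes."""
    return [r["id"] for r in rules
            if not OPS[r["op"]](order[r["field"]], r["limit"])]

exceptions = pre_trade_check({"notional": 75_000_000, "venue": "MTF-A"})
```

Because rules are data, tightening a limit means republishing the metadata, not rebuilding each system, and the downstream regulatory function only ever sees the exception list.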
That movement of regulatory activities into a front-office environment will mean traditional downstream regulatory functions become focused on monitoring for exceptions.
"What that also does is save a huge amount of cost because the regulatory function then just becomes a much smaller but higher value-add department looking at what those exceptions are and going to speak to the people closer to where the data is," Slater said.
Fidessa's Grob suggested the mindset shift required under the new architecture extends beyond questions of front office and back office.
"What you need to be able to do in this brave new world of derivatives is master a concept of virtual fungibility," he said.
For instance, in the equities world there is the concept of one stock that can be traded on one of multiple different venues. A similar goal is needed for derivatives, whereby a firm can internally deem different derivatives contracts as virtually the same thing, and present them in one integrated way to a dealer's desk.
"All that stuff is done at a level of abstraction away from what instrument type it is. But I think firms that are using technology that was only ever written for futures or only written for swaps are going to struggle because that stuff was never baked into the product in the first place," Grob said.
"While this stuff looks like a lot of complexity, which at a level it is, I think there's also great opportunity for those firms that can harness the changes that are happening. The ones that can embrace it are going to win out in the end."