The Gateway to Algorithmic and Automated Trading

Corrupt data fears in wake of fix scandals

First Published 12th August 2014

Hefty fines for the manipulation of benchmark rates mean that the industry should be prepared to justify its price-setting mechanisms.

Ian Salmon, ITRS

"The buck stops with the market makers that make a price."

A major overhaul in markets using traditional benchmark rates could have ripple effects across the industry, and experts are warning that ensuring data integrity may become a caveat emptor reality as regulatory actions bite.

In its report, Reforming Major Interest Rate Benchmarks, the Financial Stability Board called for widespread support for benchmark rates that reduce the chances of manipulation and are "anchored in observable transactions wherever feasible" as well as "robust in the face of market dislocation".

Commenting on the report, US-based consultancy firm Finadium said that in seeking alternatives, the FSB recognised the difficulty, or near impossibility, in simply changing the formulas determining derivatives cash flows on existing trades. In other words, "this is really about shifting the market for new trades and not legacy deals".

DTCC's GCF Repo product traded on NYSE Liffe is a major contender, but liquidity remains an issue. "GCF repo is observable, executable, collateralised as well as being centrally cleared - although markets don't extend very far out, making it hard to establish forward rates for discounting," Finadium wrote.

In the FX markets meanwhile, a recent Reuters report stated that asset managers are willing to pay more for currency fixing services in response to a recent FSB consultation.

Marshall Bailey, president of ACI - the Financial Markets Association, said that it is not the fix that is broken, rather that the manner in which it is used by some that warrants scrutiny.

"Technology advances can bring tremendous opportunity for the fixing orders to evolve into true transaction cost analysis tools that will benefit fund managers and the clients for whom they manage assets," he said.

Senior leaders, he added, are realising however that internal processes may not be sufficient, and that best practices in the industry must be recognised as such by a wide range of participants.

"In all cases in markets, ultimately it comes down to the behaviour of individual market participants, and the ability of their supervisors to enforce high standards through effective oversight and governance."

Precious metals markets have not escaped unscathed. The FCA fined Barclays £26 million for failings surrounding the London gold fix, and a former trader was banned and fined for inappropriate conduct.

Since then, CME Group and Thomson Reuters are set to operate an electronic silver benchmark, a move largely seen as heralding a major shift for the other metals including gold.

Most fines meted out so far have been against the manipulation of these recognised industry benchmark fixes, but other regulators are introducing guidelines for handling contributed data.

"The buck stops with the market makers that make a price," said Ian Salmon, global head of business development at ITRS. "Lots of them are becoming aware of the fact that they have to make sure that their contributed data - being used for responding to RFQs, contributing a price to a limit order book, or in a SEF - is accurate at the market maker end. Nobody is doing the checking for them."

SEF awareness

As a decentralised marketplace, swap execution facilities are tackling some of the biggest issues of contributed data corruption. Transparency on the outgoing results can be tricky, with prices arriving from numerous data feeds into an algorithmic engine for calculation.

ITRS works with a number of tier one banks and interdealer brokers to provide services validating the integrity of market feed data. In a recent whitepaper, the firm outlined the ways in which OTC markets are prone to large amounts of erroneous or corrupted data, which manifests in a number of ways: for example, the appearance of empty fields in quote feeds generated by contributed data, or clearly outlying data values, where a contributed quote, rate or other figure is published at a wide variance from normal market activity.
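The two failure modes described above, empty fields and outlying values, can be sketched as simple validation checks. This is a hypothetical illustration, not ITRS's actual methodology: the field names, the 5% tolerance, and the use of a recent-median benchmark are all assumptions made for the example.

```python
# Hypothetical sketch of two contributed-data checks: empty fields in a
# quote, and values at a wide variance from recent market activity.
# Field names and the tolerance are illustrative assumptions only.
from statistics import median

REQUIRED_FIELDS = ("contributor", "bid", "ask")  # assumed quote schema

def validate_quote(quote, recent_mids, tolerance=0.05):
    """Return a list of issues found in a contributed quote.

    quote       -- dict carrying the contributed data fields
    recent_mids -- recent mid prices representing normal market activity
    tolerance   -- max fractional deviation from the recent median
    """
    issues = []
    # Check 1: empty or missing fields in the contributed data.
    for field in REQUIRED_FIELDS:
        if quote.get(field) in (None, ""):
            issues.append(f"empty field: {field}")
    # Check 2: clearly outlying values versus recent market activity.
    bid, ask = quote.get("bid"), quote.get("ask")
    if bid and ask and recent_mids:
        mid = (bid + ask) / 2
        benchmark = median(recent_mids)
        if abs(mid - benchmark) / benchmark > tolerance:
            issues.append(f"outlier: mid {mid} vs recent median {benchmark}")
    return issues  # non-empty results would be logged for the audit trail

# A quote roughly 10% away from recent mids is flagged as an outlier.
recent = [100.0, 100.2, 99.9, 100.1]
print(validate_quote({"contributor": "Bank A", "bid": 109.0, "ask": 111.0}, recent))
```

Returning the issues rather than discarding the quote outright matches the point Salmon makes next: flagged data needs to be logged and auditable, not silently dropped.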

"When these potential issues are found, people have got to have a way of logging them and auditing them (if) the regulator comes knocking on the door," Salmon said.

"Good operational governance, and ensuring that the market data you are consuming and then calculating from is correct - those underlying principles apply to a lot of markets but particularly those that are changing rapidly at the moment," he added.