As the volume of trade and transaction reporting rises, data management presents both challenges and strategic opportunities for market participants, the recent Investment Implementation Summit heard.
Chitra Shanker, Executive Manager IT Service Delivery at Colonial First State Global Asset Management, said managing the sheer volume of data to be reported across regulatory regimes is a considerable challenge, with data availability a particular stumbling block.
Shanker said this was particularly noticeable in the context of MiFID II compliance, and that lessons should be drawn from an Australian perspective.
“[A lack of available] data is one big obstacle we face at the moment. [As part of our MiFID II compliance] we have to report on costs and charges, which is similar to RG 97, and the availability of the data [the European Securities and Markets Authority] requires – trade timing and the other information needed to calculate the implicit and explicit costs, etcetera – is a big hurdle. Sometimes it is not available, and especially when they need historical information it is a struggle to provide that level of information,” Shanker said.
The sheer size of the data sets to be collected presented another challenge. Under MiFID II transaction reporting requirements, the number of reportable fields has increased from 23 to over 60, the number of asset classes covered has broadened, and the buy-side is no longer exempt.
“From an operational perspective, just thinking about how our trading systems are going to cope with the amount of data that they need to provide for us to report and how our systems that are actually reporting are going to cope with the amount of static data information, it has a huge impact. For example, dealing with the legal entity identifiers – going through every single broker and every single counterparty to collect those LEIs – was a huge exercise and that is ongoing and very costly,” Shanker said.
Meanwhile, upgrading legacy systems to process and store the new volumes of data is another challenge, one that requires significant investment.
“I’ve worked with firms that had several hundred individual systems that needed to be tied together in order to get data to flow in the right place, and that is clearly not sustainable. […] It is very expensive to consolidate all these systems and buy a much better all-encompassing one, but if we are talking sustainability, legacy systems are a real handicap,” James Paull, Manager, Governance, Regulation and Conduct at Deloitte, said.
Is it worth it?
If market participants struggled to manage the data requirements, European regulators faced similar challenges, with conference attendees recounting that they had at times been asked to stop providing data to allow systems to catch up. This led some to question whether the benefits of obtaining the data justified the significant costs to industry.
“I am a huge proponent of market transparency; I believe it will ultimately increase fairness and eventually that will flow through the chain to the benefit of the end investor, but I’m slightly more healthily sceptical about the value of regulatory reporting. I’m yet to see anything brilliantly snazzy being done with these data points. I’m sure it is coming, but it takes such an amount of computing power to translate those daily data points into either behavioural changes in the industry, or policy changes, or enforcement outcomes, that I think it will take longer to show a benefit to our industry,” Paull said.
Oliver Harvey, Senior Executive Leader Market Infrastructure at the Australian Securities & Investments Commission (ASIC), said that from a technical perspective, ASIC is increasingly gaining more strategic insights from the data it collects, adding value to market supervision and its wider work. However, he said a default approach that assumes more data and transparency is always the right answer needs to be sensitive to cost-benefit considerations.
“There is very much a deep analytical edge to the potential that the data has for us and we are starting to claw away at that. We have got the standardised data feeds that we are getting in the exchange traded context [and] in the non-exchange traded context or in the world of less structured data, we are starting to get additional insights into that space too, which is really encouraging,” Harvey said.
“It’s about really getting to the analytical space where we can start to shape behaviours, be very targeted about the problems that we are seeing, identify emerging risks before they actually crystallise – and those are expectations that we have imposed upon ourselves internally. We also speak with other regulatory agencies quite frequently about how we can actually deliver those genuine benefits to the market more broadly – and how we demonstrate to the market that the longer-term benefits of providing the data to us outweigh the shorter-term costs,” Harvey said.
The cost-benefit analysis, he said, features heavily in ASIC’s considerations when looking at new data collection.
“The default is often ‘yes, please send us more information’, but as regulators we need to really challenge ourselves about what it is that we want it for, in what form, to what use we are going to put it and what is the benefit that we can point to [in order to be able] to demonstrate that this is a successful exercise. That is a real structured internal process which we are committed to assessing ourselves against,” Harvey said.
ASIC is currently consulting on methods to facilitate the recurrent collection of aggregate and granular financial services data through a pilot program which – among other things – considers these very questions. Harvey said cost savings are one possible benefit of recurrent data collection.
“[Recurrent data collection] means we have to engage with industry less in terms of ad hoc questions and requests for additional data, which can be very taxing. We know people are subject to an immense amount of data provision and expectation, so to the extent that having that data there alleviates some of that burden, this is a clear potential benefit to the industry at large,” Harvey said.
He also noted concerns around duplication of data sets across different regulators, which is a particular source of frustration for those reporting globally.
“The other critical thing is the coordination between regulators. For example, there are a number of regulators like ASIC and [the Australian Prudential Regulation Authority] that may have common interests in certain areas and across similar information sets. Are we duplicating data sets, are we sharing data sets appropriately, are there insights that we are hoping to gain from the activities and work that others are doing in the regulatory sector that may limit the impost on the market but show the same benefits and outcomes for consumers more broadly, and regulators as well?” Harvey said.
He said ASIC encourages industry perspectives – particularly where duplication of data sets and other inefficiencies are concerned. “We do very much like discussion on those perspectives so that we can refine the benefits that we potentially see from the data more broadly,” Harvey said.
Thinking globally, acting locally
Shanker said that there were considerable concerns about the possibility of similar regulations being introduced in Australia, without those being adapted to local circumstances. “One of the biggest worries we have is that the MiFID II regulations will be copied and adopted here as they were introduced in Europe. You can assimilate it locally, but it has to be tailored to local needs to ensure that it is easier to implement and it makes more sense,” she said.
Harvey told the conference that ASIC is monitoring the evolving regulations in Europe closely and is acutely aware of the challenges facing market participants. He assured market participants that ASIC is committed to the principle of “thinking globally and acting locally” and any adjustments to local regulations would be made in a thoughtful and methodical way.
“We continue to keenly observe the way in which the European experience is playing out and to identify the extent to which some of those expectations are ones that we may or may not be interested in developing more deeply here, either in a similar form or with a nuanced Australian take on them. The European experience gives us a benchmark, and so the question becomes ‘if not, why not?’, which is a helpful reference point to have as we move towards thinking about the application of those kinds of considerations in this market,” Harvey said.
A strategic view
As reporting requirements increase, market participants should think beyond operational compliance requirements and make a strategic assessment of the likely future impact of transparency on the market, Deloitte’s Paull said.
He said looking beyond the operational aspect of transparency towards the strategic considerations was what would divide winners from losers.
“On the operational side you have the actual implementation, so that is the data plumbing, […] setting up the pipework to ensure that the right data, from the right systems, flows in the correct way, to end up where it needs to be, by when. […] Every firm will have the capability to deliver that. [It] just means you are compliant.”
“Where I think the true value lies in tackling increased transparency is in having the strategic foresight to see how the transparency requirements will impact markets, your customers, your products and your business. We know from MiFID II transparency requirements, for example, that there may be impacts on bond market liquidity, and we know that there might be impacts felt in best execution; so running a separate workstream, one that sits alongside the data plumbing work and thinks about those potential strategic impacts and proactively and pre-emptively tries to solve some of them, is where the real winners will emerge,” Paull said.