Quality data is increasingly seen as key to asset management, but building a solid data governance framework is easier said than done.
According to Brad Farrell, Manager, Data and Analytics Solutions, at Colonial First State Global Asset Management, quality data in asset management is like the oil that fuels a race car.
“I like to think of the analogy of a race car. Our people are our drivers and you can have the best race car driver in the world. Our systems and our processes are the engine to that car and you could have the best engine in the world. But the data is the oil that feeds that engine and without good quality, consistent data that engine could easily fail and become very inefficient,” Farrell told a data management and performance analytics seminar hosted by Eagle Investment Systems in Sydney last week.
Developing a framework that stores, generates and maintains good quality data is, however, easier said than done. “I think there is a lot of misconception about how easy that can be,” Farrell said.
Among the challenges encountered by Colonial First State Global Asset Management, Farrell said, are the complexities of integrating legacy data systems with new systems. Similarly, ensuring that solutions are implemented consistently across all global operations can be complicated.
“Having a consistent approach to how those solutions are delivered is very difficult, especially when you want to do something quickly. It does cause some challenges [when] someone acts independently to someone else and produces a result which, when you think about it strategically, you would have done differently if you had your time again,” Farrell said.
Alexis Walker, Director, Asset Servicing Australia, at BNY Mellon, said this is a common feature in asset management firms that operate across different regions. “[It is very common to see organisations where] individuals have organically built systems and operating models that are different from one region to the next. You end up with this spaghetti diagram of data and you don’t know really where the data is being used and what set you really need,” Walker said.
At a more granular level, inconsistencies across data sources for individual securities are another challenge. “If a security comes in with a different identifier or code or it is not recognised, these kind of data issues [are a challenge]. Speaking from an asset management perspective, if you could outsource that to somebody and they could clean your data, that would be easiest,” Farrell said.
Towards a solid data governance framework
Data governance frameworks are the key to generating and maintaining data that is trusted and fit for purpose, and most asset managers in Australia are currently putting them in place – with varying degrees of sophistication.
According to John DiBiase, Managing Director at Shoreline Consulting, asset management firms across Australia employ a variety of approaches to data management.
“For organisations that are just starting on the journey, data management is operationally focused, so they are looking to provide efficient processes to feed data into investment management processes, client reporting, investment risk analytics and so that is really the focus. A lot of organisations are also talking about using data to generate better business insights [such as] better investment performance outcomes but, some pockets aside, most organisations don’t have the capability to do that yet,” DiBiase said.
Walker said it is critical that asset managers take time to understand and analyse their data requirements before embarking on multi-year execution projects.
“Taking the time upfront to use data lineage tools and other tools out there to really understand what your current problem set is and what you really need for the end of your journey – what is really being used versus all the stuff that people bolted on in a rush as they were in a hurry to support their investment managers with new strategies – is very important,” Walker said.
Secondly, it’s imperative to ensure there is business buy-in, that the required change is not made more complicated than necessary, and that roles and responsibilities are clearly established.
“It can be hard to get people to do things differently and, if you are going to make it complicated as well as different, you are going to struggle to get that implemented, so keep it simple – at least initially,” Farrell said, adding that executive buy-in is equally important to ensure that the organisation stays on track.
Third, it’s important to tackle these projects incrementally. At CFSGAM, metadata is currently a core focus, a project Farrell said could take considerable time.
“It is quite daunting to start with. Whenever you are faced with a challenge like that it is easy to not do it because it is so big, but one thing I would recommend is making sure you do start doing some of it. Rank the importance of that governance process, which might be ‘I need to have metadata, I need to have quality’, at least have some building blocks in place that you can build on,” Farrell said.
Towards a common approach
While every asset manager is different and solutions that work for one manager may not work for another, there are commonalities.
“CFSGAM is a global company, we have global challenges […] and the solutions that suit us as a company might not make sense for others. However, while one size doesn’t fit all, certain activities should be fitting all if you want to be successful in data management. So as a firm, things like data quality, metadata, data governance, all these things are important to a firm to own and develop in your own right, no matter what model you go with,” Farrell said.
To help asset managers on this trajectory, there is a broad range of vendor models available – and while every solution is different, there are some common threads.
“Our clients are always looking for at least software as a service – preferably a managed service where there are some operational aspects done by the vendor. Data as a service is certainly a hot topic at the moment and most of the custodians with third party administrators have some kind of an offer in that space, which could range from outsourcing part of the middle office function to effectively outsourcing the majority of the data supply chain,” DiBiase said.
He said that organisations with limited capability typically centralise their data management capabilities, while larger organisations that employ data specialists across different types of asset classes are de-centralising and focusing on more specific solutions.
“That is the definition of ‘fit for purpose data’. It is about the end use of the data and so if it is being used differently then you need to cater for that,” DiBiase said.