by Marijke van Hooren
The cost of financial data, and more precisely of poor data, has become a burning industry issue. Data environments are large and complex, and when a financial data error is introduced at entry, it is easily propagated throughout the firm.
The further such faulty data travels downstream, the more complex and costly it is to fix. The question is: how did we get here? Martijn Groot, Director of Product Management at Euroclear, the pre-eminent provider of post-trade services, explains the business logic behind Euroclear’s suite of data management services.
“We got here because every department approached managing and consuming reference data independently, and often differently. Data management is fragmented across locations and/or product lines, making it difficult to track and improve data quality. If companies centralize data management, they can significantly cut sourcing costs while increasing the quality of the data for consumption. New, tighter regulations have increased reporting complexity, and the conventional response is to throw more technology and software products at the problem to meet reporting obligations. Or firms pass the hot potato and make it someone else’s problem, outsourcing reference data management to an external party to run cheaply. But does outsourcing guarantee quality at low cost and adequate regulatory reporting? Industry opinion differs widely.”
Complexity will increase even further
In the long term, Groot expects that, unless firms pool some of their data management effort by moving to a utility model, the Total Cost of Ownership (TCO) will rise. “Change requests are inevitable, complexity will increase even further, and adding to the technology and data stack makes cost control even harder. And a one-off drop in costs from Business Process Outsourcing (BPO) can be eroded by new requirements.” Groot adds that the lack of industry-wide standards and best practices for data management also contributes to its spiraling costs.
Euroclear estimates that the industry currently spends some $22 billion per year on market data, of which $10 billion goes to reference data. High-quality data makes the entire process work better and helps firms cope with regulatory change. Regulations impacting different parts of the financial services industry, such as Basel III, EMIR and Solvency II, impose data integrity requirements or introduce new data classification and reporting obligations. “Some banks estimate the cost of trade failures caused by inadequate data at as much as $400 million annually. Estimates put the ratio of direct data costs to indirect, error-fixing costs at approximately 1:5. On top of this, banks need to save 10% (€40 billion) over the next five years to achieve a 55% cost-to-income ratio.” This is where the central data utility (CDU) comes in.
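The figures above lend themselves to a quick back-of-the-envelope check. The sketch below is illustrative only: it uses the article's own numbers (the 1:5 direct-to-indirect ratio and the 10% savings target) to derive the indirect costs and cost base they imply; nothing here comes from Euroclear beyond those stated figures.

```python
# Back-of-the-envelope check of the figures quoted in the article
# (illustrative only; all inputs are the article's own numbers).

direct_ratio, indirect_ratio = 1, 5   # direct : indirect (error-fixing) data costs

market_data_spend = 22e9              # annual industry spend on market data, USD
reference_data_spend = 10e9           # of which reference data, USD

# At a 1:5 ratio, every dollar of direct reference data cost implies
# five dollars of indirect error-fixing cost.
implied_indirect = reference_data_spend * indirect_ratio / direct_ratio

# If a 10% saving equals EUR 40 billion, the underlying cost base
# must be EUR 400 billion.
savings_target = 40e9                 # EUR, over the next five years
savings_fraction = 0.10
implied_cost_base = savings_target / savings_fraction

print(f"Implied indirect (error-fixing) cost: ${implied_indirect / 1e9:.0f} bn")
print(f"Implied bank cost base: EUR {implied_cost_base / 1e9:.0f} bn")
```

On these assumptions, the $10 billion spent on reference data implies roughly $50 billion a year in downstream error-fixing, which is the scale of waste a shared utility is meant to attack.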
An efficient central data utility should be based on four principles, says Groot, if it is to deliver cleansed, usable data.
1. Governance – multiple lines of business feed priorities to a transparently operated CDU.
2. Planning – co-ordinate, maintain oversight, and leverage resources with flexibility.
3. Implementation – a utility service is in a constant state of implementation; co-ordination ensures a smooth roll-out with speed, quality and productivity.
4. Best practices – develop, track and embed best practices; examine processes, procedures and documentation to ensure they are appropriate; and prove them against an SLA.
He also lists a number of benefits of mutualisation (sharing common components) in data management: more accurate, harmonized and timelier data, sourcing flexibility, lower costs, a higher-quality client-specific service, and the removal of project execution risk. In short: “The CDU helps the industry to significantly reduce data management risks and costs. It centralizes information from all your data sources, shares industry best practices and guarantees you receive high-quality data based on your specific requirements, continuously monitored against pre-agreed KPIs.”
Diagram: Data Utility Services Framework