How Banks Can Improve Data Management as Financial Regulations Expand

We cover four practical ways to improve the quality of your organization’s data.

Following the financial crisis of 2007-2008, regulators sought to improve transparency in financial markets by introducing new customer reporting requirements. At the core of compliance with expanding regulations, from MiFID II to Basel III, is reliable, accurate data. Yet years later, data quality still falls short, posing ongoing challenges for financial institutions (FIs) and regulators alike as they pursue better transaction monitoring and enhanced credit and operational risk reporting.

Here we consider how regulations, including MiFID II reporting requirements, have changed the data landscape, as well as four practical ways banks can tackle the challenges of data quality management.

The Data Deluge

A huge gap remains between regulatory expectations and the actual quality of data within firms and external data sources. MiFID II reporting requirements, for example, oblige firms to ensure that legal entity identifier (LEI) data is transparent, accurate and up to date in order to conduct compliant transactions. Yet one year on from MiFID II taking effect, issues with data quality and accessibility remain evident across the market.

A recent ICMA survey highlighted that data transparency has not improved following the sweeping regulatory changes: 73% of respondents believe that less than 10% of the data gathered is usable. Further, market uncertainty, including Brexit, is delaying the publication of market data promised by regulators.

Combine this situation with the ongoing pressures associated with 5MLD, BCBS 239, FATCA, and Basel II and III, to name but a few, and it’s clear that firms are experiencing a data deluge.

Here are four strategies to help with the enormous task of data management in banks.

How to Improve Your Data Quality Management

1. Start with clean data upfront

Siloed data, out-of-date information and a lack of consistency are just some of the data quality issues financial institutions deal with. All three pose significant risk, as it’s not possible to make sound business decisions based on inaccurate information.

Putting in place a process for cleansing, matching and enriching data is the best way to start the change regulators require. Starting with a clean slate of deduplicated, accurate data lays the foundation for stronger compliance capabilities going forward.
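
For teams that want a concrete starting point, the minimal Python sketch below (using the pandas library) illustrates the idea: standardize names and country codes, then drop duplicate records. The field names and sample rows are assumptions for illustration only, not a prescribed schema.

```python
import pandas as pd

# Hypothetical raw entity records pulled from two internal systems (illustrative only)
raw = pd.DataFrame([
    {"entity_name": "Acme Holdings Ltd.", "country": "GB"},
    {"entity_name": " ACME HOLDINGS LTD", "country": "gb"},
    {"entity_name": "Borealis Bank AG", "country": "DE"},
])

def standardize(df: pd.DataFrame) -> pd.DataFrame:
    """Apply basic cleansing rules: trim whitespace, normalize case and punctuation."""
    out = df.copy()
    out["entity_name"] = (
        out["entity_name"]
        .str.strip()
        .str.upper()
        .str.replace(r"[.,]", "", regex=True)
    )
    out["country"] = out["country"].str.strip().str.upper()
    return out

# Cleanse first, then deduplicate on the standardized fields
clean = standardize(raw)
deduped = clean.drop_duplicates(subset=["entity_name", "country"])
print(deduped)
```

Real cleansing and matching rules will be far richer than this, but the sequence (standardize, then deduplicate, then enrich) is the part that matters.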

2. Use a unique numbering system

Most organizations house entity information in ten or more databases. Information silos like these are extremely common, as different departments use their own processes and systems to maintain records. Acquisitions compound the problem further.

Cross-referencing data is a powerful tool for ensuring a full picture of each customer and for identifying potential business risk. By leveraging a unique numbering system, organizations can cross-reference entity data across sources to pull together disparate enterprise resource planning (ERP), contract lifecycle management (CLM), and front- and back-office systems.
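
As a simple illustration of the approach, the Python sketch below joins hypothetical extracts from two siloed systems through a shared unique key. The system names, field names and placeholder identifiers are assumptions made for the example, not real LEIs or products.

```python
import pandas as pd

# Hypothetical extracts from two siloed systems; entity_key values are placeholders, not real LEIs
crm_records = pd.DataFrame([
    {"crm_id": "C-001", "entity_name": "ACME HOLDINGS LTD", "entity_key": "LEI-PLACEHOLDER-001"},
    {"crm_id": "C-002", "entity_name": "BOREALIS BANK AG", "entity_key": "LEI-PLACEHOLDER-002"},
])
contract_records = pd.DataFrame([
    {"contract_id": "K-9001", "entity_key": "LEI-PLACEHOLDER-001", "open_contracts": 14},
    {"contract_id": "K-9002", "entity_key": "LEI-PLACEHOLDER-002", "open_contracts": 3},
])

# Because both systems carry the same unique entity key, a simple join produces
# a single cross-referenced view of each customer.
single_view = crm_records.merge(contract_records, on="entity_key")
print(single_view[["entity_name", "crm_id", "contract_id", "open_contracts"]])
```

The unique key can be the LEI itself or an internal identifier; what matters is that every system maps its local records back to the same number.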

3. Create a flexible “golden copy”

Financial institutions also deal with data from multiple external sources, from data vendors and exchanges to credit rating agencies. Keeping a “golden copy” of the information gathered from these sources supports compliance efforts with a combined, cleansed and error-free view of entity data.

A golden copy should be flexible enough to accommodate different use cases and systems. A key piece of data quality management is understanding what data is integral to the business and what data can change.
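
One common way to assemble a golden copy is survivorship logic: for each attribute, keep the value from the most trusted source that supplies it. The short Python sketch below shows that idea using assumed source names and an illustrative precedence order.

```python
# Minimal survivorship sketch: build a golden record per entity by taking each
# attribute from the most trusted source that supplies it. Source names and the
# precedence order below are assumptions for illustration.
SOURCE_PRECEDENCE = ["regulatory_register", "data_vendor", "internal_crm"]

records = {
    "regulatory_register": {"legal_name": "ACME HOLDINGS LTD", "status": "ACTIVE"},
    "data_vendor": {"legal_name": "Acme Holdings Ltd", "credit_rating": "BBB+"},
    "internal_crm": {"legal_name": "Acme", "relationship_manager": "J. Smith"},
}

def build_golden_record(records_by_source: dict) -> dict:
    golden = {}
    # Walk sources from least to most trusted so more trusted values overwrite the rest
    for source in reversed(SOURCE_PRECEDENCE):
        golden.update(records_by_source.get(source, {}))
    return golden

print(build_golden_record(records))
# {'legal_name': 'ACME HOLDINGS LTD', 'relationship_manager': 'J. Smith',
#  'credit_rating': 'BBB+', 'status': 'ACTIVE'}
```

Keeping the precedence rules in configuration rather than in code is one way to give the golden copy the flexibility different use cases and systems demand.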

4. Conduct regular data maintenance

At the typical bank, data decays at a rate of 25% a year. This is due partly to manual errors during data entry and editing, and partly to the pace of change in the business landscape: mergers and acquisitions, office relocations and more put records out of date at lightning speed.

To fight data decay, the golden copy of data must be regularly revisited and updated. Without ongoing maintenance, important changes in entity status can easily go unnoticed, and data errors can creep back in across the organization.
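
As a minimal illustration, the Python sketch below flags records whose last verification falls outside a review window, assuming each golden-copy record carries a last_verified date; the 12-month window is an arbitrary example, not a regulatory threshold.

```python
from datetime import date, timedelta

# Hypothetical golden-copy records carrying a last_verified date (illustrative only)
golden_copy = [
    {"entity": "ACME HOLDINGS LTD", "status": "ACTIVE", "last_verified": date(2018, 3, 1)},
    {"entity": "BOREALIS BANK AG", "status": "ACTIVE", "last_verified": date(2019, 1, 15)},
]

REVIEW_WINDOW = timedelta(days=365)  # example threshold, not a regulatory figure

def records_due_for_review(records, today=None):
    """Return records whose last verification falls outside the review window."""
    today = today or date.today()
    return [r for r in records if today - r["last_verified"] > REVIEW_WINDOW]

for record in records_due_for_review(golden_copy, today=date(2019, 6, 1)):
    print(f"Review needed: {record['entity']} (last verified {record['last_verified']})")
```

Running a check like this on a schedule, and routing stale records to data stewards for re-verification, keeps the golden copy from quietly drifting out of date.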

Maintaining accurate, timely and comprehensive data records is no easy task, and it never has been. Opus Data Solutions helps you navigate regulatory change by improving data entry, cleansing data and creating a single data viewpoint. If you’re ready to make your data more powerful, request a demo here.

Kelvin Dickenson
President
Follow Kelvin on Twitter, @kelvindickenson