Why is data management crucial for a successful Financial Services organization?
In the wake of the global financial crisis, large financial institutions continue to face myriad internal and external pressures, with regulatory concerns remaining a primary driver of the data agenda. Regulators are raising the bar on reporting and analysis requirements, with greater demands for granularity, flexibility, and speed, while investors and analysts are demanding more granular credit data to answer questions about credit quality, loan performance, and more.
Listed below are some of the internal and external pressures that financial institutions are facing as they try to respond to regulatory mandates:
- Non-standard data definitions and multiple sources of the same data
- Resource intensive manual adjustments and data manipulations
- Inconsistent data management and governance across functional and business units
- No clearly defined data ownership and accountability
- A need to use data to better understand their customers (a 360-degree view of the customer) and products
- Volcker Rule – The Volcker reporting requirements are unlike those of any previous regulation: the data is required to prove that banking entities are not engaged in proprietary trading or covered transactions and do not hold ownership interests in hedge funds or private equity funds
- Dodd-Frank Act (DFA) – Beyond the Volcker Rule, other DFA requirements such as client clearing and living wills are forcing financial institutions to get their data in order so they can keep up with the ever-expanding regulatory scope
- CCAR – Regulators have been stringent about the quality of the data submitted for their analysis and stress testing, and large banks have had their capital plans rejected
- Basel II/III – Increased capital requirements place a renewed priority on the quality of underlying data
- Foreign Account Tax Compliance Act (FATCA) – With FATCA going live shortly, the data captured on counterparties and clients needs to be enhanced and IRS reporting needs to be developed
Common causes of data issues
The causes of data issues fall broadly into the following areas:
- Fragmented business processes
- Expedited projects to adopt new business/technology trends, such as enterprise mobility, without proper planning
- Lack of governance processes
- Lack of enterprise level planning
- Incorrect or incomplete data entry
- Inexperience or lack of business knowledge
- Limited understanding of how data flows through the organization
- Siloed IT systems
- Systems designed without a complete understanding of business processes
- Data migration and conversion effort
- Limited validations at data entry
- Business transformations and legacy modernizations
- Organic growth – more departments, more products, more channels
- Mergers and Acquisitions
How has the industry responded?
With all the effort being put into regulatory compliance, financial institutions now recognize that it is the quality of their data that drives the magnitude of the compliance effort. As such, focus is shifting toward data quality and governance initiatives, and organizations are investing additional resources in their data agenda. Example initiatives include:
- Development of enterprise level, centralized data quality and data governance frameworks
- Formulation of data quality standards and policies (e.g. data quality dimensions, testing frameworks, etc.)
- Identification and documentation of common definitions for Key Data Elements (KDEs), metrics, and thresholds, and incorporating these as part of business as usual
- Creation of enterprise data governance functions for executing independent data testing and mandating data quality attestation
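To make the KDE idea concrete, a team might keep a machine-readable registry of element definitions, each with an owner, a quality metric, and a pass threshold. A minimal sketch follows; the elements, owners, and numbers are hypothetical, not taken from any specific institution's framework:

```python
# Hypothetical Key Data Element (KDE) registry: each entry records the agreed
# definition, the owning function, the metric used to test the element, and
# the minimum acceptable score (the threshold).
KDE_REGISTRY = {
    "counterparty_id": {
        "definition": "Unique legal-entity identifier for the counterparty",
        "owner": "Credit Risk",
        "metric": "completeness",
        "threshold": 0.99,
    },
    "trade_notional": {
        "definition": "Notional amount of the trade in reporting currency",
        "owner": "Finance",
        "metric": "validity",
        "threshold": 0.95,
    },
}

def breaches(scores):
    """Return the KDEs whose measured score falls below the agreed threshold."""
    return sorted(
        kde for kde, score in scores.items()
        if score < KDE_REGISTRY[kde]["threshold"]
    )
```

With measured scores of 0.97 for `counterparty_id` and 0.96 for `trade_notional`, only the former breaches its 0.99 threshold, so `breaches(...)` returns `["counterparty_id"]`. Keeping definitions and thresholds in one registry is what lets testing and attestation become part of business as usual rather than a one-off exercise.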
Data Quality Framework
A Data Quality (DQ) framework examines the completeness, validity, consistency, timeliness, and accuracy of enterprise data as it moves from source to reporting. It also defines data enrichment and enhancement strategies, with well-defined data control points, to address any data quality issues.
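The dimensions above can be illustrated with a minimal sketch. The record layout, rules, and staleness window below are illustrative assumptions, not part of any particular DQ framework:

```python
from datetime import date

# Hypothetical loan records as they might arrive from a source system.
records = [
    {"loan_id": "L1", "balance": 250000.0, "currency": "USD", "as_of": date(2024, 1, 31)},
    {"loan_id": "L2", "balance": None,     "currency": "USD", "as_of": date(2024, 1, 31)},
    {"loan_id": "L3", "balance": -500.0,   "currency": "usd", "as_of": date(2023, 6, 30)},
]

def completeness(recs, field):
    """Completeness: share of records where the field is populated."""
    return sum(r[field] is not None for r in recs) / len(recs)

def validity(recs):
    """Validity: balances, when present, must be non-negative."""
    return [r["loan_id"] for r in recs if r["balance"] is not None and r["balance"] < 0]

def consistency(recs):
    """Consistency: currency codes should follow one convention (upper case here)."""
    return [r["loan_id"] for r in recs if r["currency"] != r["currency"].upper()]

def timeliness(recs, reporting_date, max_age_days=90):
    """Timeliness: flag records older than the allowed staleness window."""
    return [r["loan_id"] for r in recs
            if (reporting_date - r["as_of"]).days > max_age_days]
```

Here `completeness(records, "balance")` is 2/3 because L2's balance is missing, while L3 fails the validity, consistency, and timeliness checks. Accuracy is the harder dimension: it requires comparing values against an authoritative source, which is exactly where well-defined data control points come in.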
Data Governance Framework
A Data Governance (DG) framework establishes a foundation for the successful management of data assets. It defines data management roles and responsibilities, establishes standards and policies and defines processes and procedures in order to better manage the enterprise data and build confidence in the accuracy and reliability of the data.