“They’re All Critical” – A Methodology to Prioritize Data Quality Issues

Faced with unprecedented regulatory scrutiny, Financial Services firms have made data quality a top priority. Most large banks now have a Chief Data Officer (CDO) as a C-level function and an entire CDO organization dedicated to achieving higher levels of data quality.

The relatively easy part of the job for the data quality team is finding data anomalies, which are ubiquitous in the early stages of the data maturity cycle. The hard part is that every business unit feels its issues are the most critical. That being the case, how can an organization prioritize these issues, which can number in the hundreds, and develop a plan of attack that delivers on the most critical ones first?

In our experience, a prioritization methodology that has proven useful is QCD. QCD stands for "Quality, Cost and Delivery," and the idea is to define measures and associated scoring criteria along each of these dimensions.

For example, when measuring the impact of a data issue in a Capital Markets Finance environment, the Quality measures may be along the following lines:

  • Adjustments with P&L impacts – 5 (high score)
  • Balance sheet adjustments – 3 (medium score)
  • Adjustments on trades with low notional value – 2 (low score)

Cost refers to the effort currently involved in remediating the data quality issue manually. While manual remediation is never the desired state, an issue that can be remediated in a few minutes will score lower (less impact) than an issue that takes days each month to correct.

The Delivery measure takes into account how late in the Financial Close calendar adjustments must be made to resolve the problem. If adjustments can only be made at Month End plus 9 days, for example, that would receive a high impact score.

Quality, Cost and Delivery scores are then added together to produce a combined prioritization score. Once this process is complete, the highest-impact data quality issues have risen to the top of the list and can be prioritized for remediation ahead of the others.

What makes QCD powerful is that it lets organizations follow a structured data quality methodology while retaining the flexibility to assign weightings that make the most sense for the particular issues each organization faces.
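To make the scoring concrete, here is a minimal sketch in Python of how a weighted QCD score might be computed and used to rank issues. The issue names, score values, and field names are illustrative assumptions, not figures from any real prioritization exercise.

```python
# Hypothetical QCD scoring sketch: each issue carries a Quality, Cost and
# Delivery score, which are combined (optionally weighted) and used to rank
# issues for remediation. All data below is illustrative.

def qcd_score(issue, weights=(1.0, 1.0, 1.0)):
    """Combine Quality, Cost and Delivery scores into one priority score."""
    wq, wc, wd = weights
    return wq * issue["quality"] + wc * issue["cost"] + wd * issue["delivery"]

issues = [
    {"name": "Low-notional trade adjustment", "quality": 2, "cost": 1, "delivery": 2},
    {"name": "P&L-impacting adjustment", "quality": 5, "cost": 4, "delivery": 5},
    {"name": "Balance sheet adjustment", "quality": 3, "cost": 3, "delivery": 4},
]

# Highest combined score = highest remediation priority.
ranked = sorted(issues, key=qcd_score, reverse=True)
for issue in ranked:
    print(f"{issue['name']}: {qcd_score(issue)}")
```

The `weights` tuple is where an organization would encode its own view of which dimension matters most; with equal weights the ranking reduces to the simple sum described above.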

With every business unit clamoring that their data issues are the most critical, QCD provides a way to cut through the noise with an objective and transparent measurement and prioritization approach.

About Author

Bruce Klein

Bruce is a Partner within the firm's Financial Services practice. Bruce has significant experience assisting Financial Services companies with strategic project delivery, large-scale transformation activity and regulatory compliance.
