Building or Maintaining a Resilient Data Quality Foundation with Limited and Remote Resources

Now, more than ever, teams are being asked to do more with less: fewer resources and the added complexity of distributed, remote work. However, the critical need for a strong and resilient data quality foundation has not changed. This foundation is a must-have to ensure that all downstream reports, metrics/KPIs, and analytics are accurate and inform the right decisions.

For example, suppose you have been asked to provide a detailed cash forecast to inform upcoming leadership decisions, but you’re not sure which data source or table is trusted and commonly used. It’s also unclear how best to extract and validate the data you need. Your go-to reporting colleague is furloughed or hard to reach, yet you need to be confident in the forecast you’re providing.

Here, we have pulled together a few tips to add clarity, resiliency, and continuous improvement to your data quality foundation.

Clarity

  • If data definitions, sources, and owners are not already clearly identified and understood for your organization’s critical data elements, start with that documentation. Cloud-based data catalogs use machine learning to automate initial documentation of sources and data lineage, and let data users easily add information (e.g., verified data sources). If you’re starting from scratch, quickly document and agree on this important information, even if it’s simply captured in Excel on a shared drive or website to start.
  • A report inventory, and/or a tag marking the “source of truth” report for each key metric, also helps users quickly find the correct version and avoid confusion or reconciliation between competing copies.
  • Clarity is equally important when it comes to data quality roles, responsibilities, and handoffs in your organization. Documentation and a shared understanding of these roles and handoffs are critical for a trusted data foundation.
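The documentation above need not be elaborate to be useful. As a minimal sketch, here is what a shared-drive data dictionary might look like as a small CSV maintained in code; the element names, sources, and owners are hypothetical placeholders, not a prescribed schema:

```python
import csv
import io

# Hypothetical critical data elements -- names, definitions, sources, and
# owners below are illustrative only.
data_dictionary = [
    {"element": "net_cash_position",
     "definition": "End-of-day cash across all bank accounts",
     "source": "treasury_ledger", "owner": "Treasury Ops", "verified": "yes"},
    {"element": "dso_days",
     "definition": "Days sales outstanding, trailing 90 days",
     "source": "ar_subledger", "owner": "Accounts Receivable", "verified": "no"},
]

# Write the dictionary as CSV text, as it would sit on a shared drive.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=list(data_dictionary[0].keys()))
writer.writeheader()
writer.writerows(data_dictionary)

# A report builder can then filter for verified sources before trusting them.
rows = list(csv.DictReader(io.StringIO(buffer.getvalue())))
trusted = [r["element"] for r in rows if r["verified"] == "yes"]
print(trusted)
```

Even this lightweight structure answers the three questions that matter most under time pressure: what the element means, where it comes from, and who to ask.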


Resiliency/Ease of Use

  • Consider using a self-service data prep and quality tool so that report users are not as reliant on other teams for mission-critical data quality needs. Alteryx is one example of a self-service application that enables repeatable data quality workflows, reports, monitoring, and exception reporting. You can also schedule and automate the refresh of data in an existing format, like a Power BI or Tableau dashboard. If you have not used these tools before, it’s worth a few quick conversations with colleagues to find out whether your company already has access to them or to similar ones. You may be surprised to find licenses or supporting tools already available in another part of your organization.
  • Using a tool with documented rules and a transparent, easy-to-understand data process workflow makes it easy for someone new to the organization or process to understand the current state and, as needed, modify the workflow without starting from scratch. These workflows can also be saved and easily run or refreshed, either on demand or on an automated schedule (e.g., weekly or monthly).
  • Performance variance or threshold alerts can also be easily set, modified, or re-routed to different recipients as business needs change.
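To make the threshold-alert idea concrete, here is a minimal sketch of the kind of rule-based exception check such tools automate. The figures and the two-standard-deviation rule are illustrative assumptions, not a recommended threshold; a scheduled workflow would re-run a check like this and route flagged rows to the appropriate recipients:

```python
import statistics

# Hypothetical daily cash figures; in practice these would come from the
# trusted source identified in your data documentation.
daily_cash = [1020.0, 980.5, 1005.2, 990.1, 2400.0, 1010.3]

mean = statistics.mean(daily_cash)
stdev = statistics.stdev(daily_cash)

# Flag any value more than two standard deviations from the mean -- a
# simple variance threshold that can be tuned as business needs change.
exceptions = [(i, v) for i, v in enumerate(daily_cash)
              if abs(v - mean) > 2 * stdev]

for index, value in exceptions:
    print(f"Day {index}: value {value} breaches the 2-sigma threshold")
```

The point is not the specific rule but that it is written down, repeatable, and easy for a newcomer to read and adjust.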


Continuous Improvement

Continue to re-evaluate your data needs and priorities for meaningful KPIs and reports, and ensure that these KPIs and reports are incorporated into relevant review meetings and decision-making.

  • If you started with a quick fix in Excel, evaluate tools and applications that will further support your processes and related operational resiliency.
  • Ensure that there is an effective feedback loop to capture issues (e.g., data quality problems) and iterate on improvements.
  • Define a cadence and approach for other updates, such as updating roles and responsibilities, as those change in the organization.



Returning to our example: by increasing clarity on data sources and definitions and gaining access to self-service data prep and reporting tools, you were able to deliver an accurate and timely cash forecast. However, it was a scramble, and you’re not sure you have everything you need to efficiently and accurately support the next reporting or analysis request.

Building that foundation and confidence requires a more comprehensive approach to data strategy, governance, and quality assurance. Once resources are available to focus on this important initiative, we recommend establishing an enterprise-wide approach to data governance and quality that is clear, resilient, and incorporates feedback loops for continuous improvement.


Our latest guidebook provides leaders with a roadmap to enhance resiliency plans, simplify operations, address new financial requirements, and more. To download, please click the link below.

Download Now
