Datactics has implemented its Self-Service Data Quality reporting solution at a Tier 1 global bank. The bank is using the software to monitor over 10 million underlying records, which requires hundreds of millions of data points to be processed. The bank is also building predictive analytics on top of the Self-Service Data Quality framework to monitor data quality problems and identify which data areas should be prioritised for improvement.
The US bank’s chief data officer (CDO) assessed several data quality products before selecting Datactics, running a proof of concept and working with the vendor to develop a CDO dashboard within eight weeks. The system went live in January. The bank was previously sampling data and using some automation to check data quality, but was not able to monitor its entire universe of data in a timely way.
The deployment of Self-Service Data Quality is based on the bank’s data warehouse of over 10 million records, and uses five dimensions of the Enterprise Data Management Council’s Data Management Capability Assessment Model (DCAM) – completeness, conformity, accuracy, duplication and consistency – to ensure data quality and allow bank staff to monitor data against these metrics. As well as reporting on data quality and ensuring data delivered to business applications is accurate and fit for regulatory reporting in line with the data quality demands of regulations such as BCBS 239, Self-Service Data Quality will be used to scrub and aggregate information for automatic submission to the SEC and US Federal Reserve.
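To illustrate what monitoring against those five dimensions can look like in practice, the sketch below applies one hypothetical rule per dimension to a toy record set. The field names and rules are assumptions for illustration only, not Datactics' implementation.

```python
# Illustrative only: one simple rule per DCAM-style dimension named in the
# article (completeness, conformity, accuracy, consistency, duplication),
# applied to hypothetical trade-like records.
import re

records = [
    {"id": "A1", "country": "US", "notional": 1_000_000, "currency": "USD"},
    {"id": "A2", "country": "US", "notional": None,      "currency": "USD"},
    {"id": "A2", "country": "us", "notional": 250_000,   "currency": "USD"},
    {"id": "A4", "country": "GB", "notional": -50,       "currency": "GBP"},
]

def completeness(r):  # every mandatory field is populated
    return all(r[f] is not None for f in ("id", "country", "notional", "currency"))

def conformity(r):    # country code matches an ISO-style two-letter pattern
    return bool(re.fullmatch(r"[A-Z]{2}", r["country"]))

def accuracy(r):      # notional must be a positive amount
    return r["notional"] is not None and r["notional"] > 0

def consistency(r):   # assumed rule: US-domiciled records denominated in USD
    return r["country"] != "US" or r["currency"] == "USD"

ids_seen = set()
def duplication(r):   # flag repeated record identifiers
    dup = r["id"] in ids_seen
    ids_seen.add(r["id"])
    return not dup

checks = [completeness, conformity, accuracy, consistency, duplication]
report = {}
for idx, r in enumerate(records):
    failures = [c.__name__ for c in checks if not c(r)]
    report[idx] = failures
    print(r["id"], "PASS" if not failures else f"FAIL: {failures}")
```

At scale, the same idea is run as rule sets over the full record universe, with pass/fail rates per dimension rolled up into dashboard metrics rather than printed per record.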