Whitepaper SSDQ – As featured in the recent A-Team webinar, we’ve been strong advocates of a self-service approach to data quality (SSDQ), especially when it comes to regulated data types and wide-ranging demands on a firm’s data assets.
This whitepaper, authored by our CTO Alex Brown, goes deeper into the reasons why this approach is so much in demand and explores the functionality a fully self-service environment needs in order to equip business users with rapid access to high-quality data.
- The Changing Landscape of Data Quality

There has been growing demand for ever-higher data quality in recent years. Highly regulated sectors, such as banking, have faced a wave of financial regulations, including BCBS 239, MiFID and FATCA, among many others, stipulating or implying exacting standards for data and data processes. Meanwhile, a growing number of firms are becoming more Data and Analytics (D&A) driven, taking inspiration from Google and Facebook to monetize their data assets. This increased focus on D&A has been accelerated by easier, lower-cost access to artificial intelligence (AI), machine learning (ML) and business intelligence (BI) visualization technologies. However, as the hype around these technologies wanes, a pragmatic realization is setting in: unless there is a foundation of good-quality, reliable data, insights derived from AI and analytics may not be actionable. With AI and ML becoming more of a commodity, and a level playing field, the differentiator is the data and the quality of that data… To read more, see the above whitepaper.
Alex Brown is Datactics' Chief Technology Officer. He is a former Head of DART Development at Vela (formerly SR Labs) and Market Data Technical Consultant at NYSE Euronext, and has over 15 years' experience in software development and technical innovation.