Best Practices For Creating a Data Quality Framework

Alex Brown, Datactics CTO on Best Practices for Creating a Data Quality Framework

Chief Technology Officer Alex Brown featured as a panellist in Data Management Insight’s webinar discussing best practices for creating a data quality framework within your organisation.

What’s the problem? A-Team Insight outlines it as follows: “Bad data affects time, cost, customer service, cripples decision making and reduces firms’ ability to comply with regulations. With so much at stake, how can financial services organisations improve the accuracy, completeness and timeliness of their data? What approaches and technologies are available to ensure data quality meets regulatory requirements as well as their own data quality objectives?

This webinar discusses how to establish a business focus on data quality, how to develop metrics as well as experiences of rolling out data quality enterprise wide. It will examine fixing data quality problems in real time and how dashboards and data quality remediation tools can help. Lastly, it will explore new approaches to improving data quality using AI, Machine Learning, NLP and text analytics tools and techniques”.

The topics covered included:

  • Limitations associated with an ad-hoc approach
  • Where to start, the lessons learned and how to roll out a comprehensive data quality solution
  • How to establish a business focus on data quality and developing effective data quality metrics
  • Using new and emerging technologies to improve data quality and automate data quality processes
  • Best practices for creating a Data Quality Framework

We caught up with Alex to ask him a few questions on how he thought the webinar had gone, whether it had changed or backed up his views, and where we can hear from him next…

“Firstly, I thought the webinar was extremely well-run, with an audience of well over 300 tuning in on the day.

The biggest takeaway for me was that it confirmed a lot of the narrative we’re hearing about the middle way between two models of data quality management – a centralised, highly-controlled but slow model of IT owning and running all data processes, and the “Wild West” where everyone does their own thing in an agile but disconnected way. Both sides have benefits and pitfalls, and the webinar really brought out a lot of those themes in a set of useful practical examples.

Next up from me will be a whitepaper on this subject which we’ll be releasing really soon; there’ll be more blogs from me over at Datactics.com; and finally, I’m also looking forward to the Virtual Data Management Summit, as CEO Stuart Harvey’s got some interesting insight into DataOps to share”.

Missed the webinar? Not to worry, you can listen to the full recording here: https://a-teaminsight.com/webinars/data-quality-the-latest-approaches-processes-and-technologies-helping-firms-improve-their-data-quality/?brand=dmi

Alex Brown is Datactics’ Chief Technology Officer. He is a former Head of DART Development at Vela (formerly SR Labs) and Market Data Technical Consultant at NYSE Euronext, and has over 15 years’ experience in software development and technical innovation.

Click here for the latest news from Datactics, or find us on LinkedIn, Twitter or Facebook.