Data quality is always a key issue when implementing new regulations in the financial services sector. The 2008 financial crisis undermined investor confidence, putting questions about the transparency of financial information front and centre.

What is good data quality?

There is no unanimous definition of data quality, but I really like this definition by Informatica. According to them, “Data quality refers to the overall utility of a dataset(s) as a function of its ability to be easily processed and analyzed for other uses, usually by a database, data warehouse, or data analytics system. Quality data is useful data. To be of high quality, data must be consistent and unambiguous.”

Since then, regulators have advocated for high-quality data without taking into account the actual state of that data in the real world, which is usually far below the required standard. Most of the time, the gap between reality and the stipulations of the law is such that it requires:

– data cleansing, if the situation is dire,

– raising employee awareness of data-entry issues in open source software applications, and

– implementing data governance, if it is not already a long-term initiative.
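To make the data-cleansing step concrete, here is a minimal sketch of automated data-quality checks on client records. The field names, controlled vocabulary, and record layout are all hypothetical assumptions for illustration, not part of any particular regulation.

```python
# A minimal sketch of record-level data-quality checks.
# All field names and reference values below are hypothetical.
from datetime import datetime

REQUIRED_FIELDS = ("client_id", "country", "opened_on")
KNOWN_COUNTRIES = {"FR", "DE", "US", "GB"}  # illustrative controlled vocabulary

def check_record(record):
    """Return a list of data-quality issues found in one record."""
    issues = []
    # Completeness: every required field must be present and non-empty.
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            issues.append(f"missing {field}")
    # Validity: dates must parse unambiguously (ISO 8601 assumed here).
    opened = record.get("opened_on")
    if opened:
        try:
            datetime.strptime(opened, "%Y-%m-%d")
        except ValueError:
            issues.append("opened_on is not an ISO date")
    # Consistency: country codes must come from the controlled vocabulary.
    if record.get("country") and record["country"] not in KNOWN_COUNTRIES:
        issues.append(f"unknown country {record['country']}")
    return issues

records = [
    {"client_id": "C001", "country": "FR", "opened_on": "2015-03-01"},
    {"client_id": "", "country": "XX", "opened_on": "01/03/2015"},
]
report = {r["client_id"] or "<blank>": check_record(r) for r in records}
```

Checks like these can run on every load into the data warehouse, so that quality problems are caught at the point of entry rather than discovered during a regulatory audit.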

If organizations want to come to terms with upcoming regulatory issues, they will have to better anticipate the work needed to improve the quality of their data and to establish solid data governance through their Business Intelligence (BI) program, incorporating regulatory data alongside the traditional marketing data. The upcoming regulations will require firms in the financial services sector to trace their chain of information in order to identify the data relevant to regulatory risk management and compliance.
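Tracing the chain of information amounts to recording, for each derived figure, which upstream fields feed it. The following is a minimal sketch of such lineage tracing; every system and field name here is a hypothetical example, not an actual regulatory schema.

```python
# A minimal sketch of data-lineage tracing: each derived field lists
# its direct upstream sources. All names below are hypothetical.
LINEAGE = {
    "report.capital_ratio": ["dwh.capital", "dwh.risk_weighted_assets"],
    "dwh.capital": ["core_banking.equity"],
    "dwh.risk_weighted_assets": ["risk_engine.rwa"],
}

def trace(field, seen=None):
    """Return every upstream field that feeds the given field."""
    seen = seen if seen is not None else set()
    for source in LINEAGE.get(field, []):
        if source not in seen:
            seen.add(source)
            trace(source, seen)  # walk transitively up the chain
    return seen

upstream = trace("report.capital_ratio")
```

With such a map in place, a firm can answer the auditor's question "where does this number come from?" for any figure in a regulatory report.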

Finally, I would say that it is difficult to manage regulatory compliance without being able to trust the data. Because banks and insurance companies are obligated to comply with these laws, they have a golden opportunity:

– to deepen their knowledge of their clients,

– to improve their profitability, and

– to enhance their decision-making.

It is also important to note that regulations such as Basel now impose quality standards, namely the “Principles for effective risk data aggregation and risk reporting” (BCBS 239) (http://www.bis.org/publ/bcbs239.pdf).

Feel free to contact me and share your experience on data quality with me as you proceed with the implementation of your regulatory projects!
