Data governance: Greatest compliance challenge and emerging core competency
April 2026
Susannah Hammond, senior regulatory intelligence expert at CUBE, considers how robust data governance is fast becoming a defining capability for financial services firms
Effective data governance is an emerging core competency and is also seen as the greatest compliance challenge for financial services firms worldwide in the coming year. To reap the potentially significant benefits, firms should begin planning how to build, resource, and execute their strategic approaches to data governance.
Data governance addresses the management of data collection, processing, accuracy, retrieval, consistency, and security. It encompasses personal responsibility for information management throughout the data lifecycle and the processes, policies, standards, methodologies, security protocols and metrics used to ensure effective data management and confidentiality. It also covers where, how, and on what basis firms store data.
A data governance framework is no one-size-fits-all solution and can take many forms. It must cover the whole data lifecycle, including creation, use, communication, unmodified/original retention, retrieval, and destruction. Effective governance requires a detailed understanding of the data a firm needs to operate, the basis for collecting and using that data, processing protocols, and storage methods and locations.
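One way to make the lifecycle coverage described above concrete is to record it per data asset. The sketch below is purely illustrative, not a prescribed model; the class, field names, and lifecycle labels are assumptions drawn from the stages named in the text.

```python
from dataclasses import dataclass, field

# Lifecycle stages named in the text: creation, use, communication,
# unmodified/original retention, retrieval, and destruction.
LIFECYCLE_STAGES = (
    "creation", "use", "communication",
    "retention", "retrieval", "destruction",
)

@dataclass
class DataAsset:
    """One entry in a hypothetical data-governance catalogue."""
    name: str
    lawful_basis: str       # basis for collecting and using the data
    storage_location: str   # where and how the data is held
    owner: str              # personal responsibility for the asset
    controls: dict = field(default_factory=dict)  # stage -> documented control

    def uncovered_stages(self) -> list:
        """Lifecycle stages with no documented control."""
        return [s for s in LIFECYCLE_STAGES if s not in self.controls]

asset = DataAsset(
    name="client-onboarding-records",
    lawful_basis="contract",
    storage_location="eu-data-centre",
    owner="head-of-operations",
    controls={"creation": "KYC intake policy", "retention": "7y archive"},
)
print(asset.uncovered_stages())
```

A catalogue built from entries like this gives a firm a simple, auditable answer to which stages of the lifecycle each data asset still lacks a documented control for.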
Data is the new oil
Data is seen as the ‘new oil’. While management information has always been important, it is now, in the fast-evolving technological age, the lifeblood of firms, bringing both challenges and opportunities. The potential benefits of using artificial intelligence have sharpened the need for consistently high-quality data. As AI adoption accelerates, the quality of training data becomes even more critical. The adage ‘garbage in, garbage out’ is pertinent for assessing the output of AI tools.
Done well, AI tools can be usefully deployed in many aspects of compliance and risk management. The relentlessly huge volumes of regulatory output from politicians, supervisors and policymakers all need to be tracked, with horizon-scanning that covers everything, from relevant speeches and enforcement actions, to consultations and extraterritorial changes. Firms can use effective data inputs for AI tools to focus their horizon-scanning and conduct initial impact assessments for potential rule changes and evolving regulatory expectations.
The same is true for assessing the effectiveness and efficacy of control frameworks. Again, the output quality from any AI tool will be critically dependent on the completeness and accuracy of its inputs.
Challenges
There is no shortage of challenges to achieving effective data governance, which is why it is seen as the greatest compliance challenge of the coming year. Firms face several internal dilemmas about the need for a well-resourced, wholesale data audit to assess what data the firm possesses, on what basis it was retained, and where and how it is stored. Ideally, boards should support and oversee data audits, given that many firms continue to operate legacy, often fragmented, systems.
Firms should also be aware of external factors that challenge effective data governance. Two of the top, interlinked problems are cyber resilience and third-party outsourcing.
Cyber resilience is a key focus of supranational policymakers, who are concerned that severe attacks, breaches and/or outages can affect financial stability. As a matter of course, firms are required to protect customer data and proprietary information.
Firms need to develop, implement, and test a suite of practical strategies to manage cyber risk and increase resilience through a combination of technology-based approaches, flexible system design, and organisational structure.
Cyber resilience is a deliberately broad-brush concept that encompasses cybersecurity processes, practices, and technology used to protect networks, computers, programmes, and data from unauthorised access, destruction, and retrieval. It is the ability to anticipate, withstand, recover from, and adapt to adverse network conditions, cyberattacks and/or system breaches.
Cybersecurity tends to focus on defence, while cyber resilience seeks to ensure the post-failure continuity of critical functions until systems are restored. Good IT infrastructure practice includes sound security design, but firms still require effective strategies to prevent, withstand and recover from cyber incidents.
The use of third parties and material outsourcing arrangements requires careful and detailed management. The golden rule for successful outsourcing is that while activities can be moved to another group, company or third party, the skills to manage those activities must be retained in-house. This may be less obvious in an intra-group outsourcing scenario, but it is essential for separate legal entities with separate licences. Equally, when a branch or other structure is involved, firms must consider the efficacy of their outsourcing arrangements and the skills, governance, and local responsibilities of the branch.
Firms should embed oversight and management of outsourced third-party arrangements into their overarching risk management approach.
Specific elements for testing may include:
- The need for upfront due diligence on the outsourcer, even when it is a group company, together with a detailed written agreement specifying all aspects of the outsourced arrangements. Among other things, the detailed agreement should cover practical measures for exiting the arrangement, including the complete and accurate return of all data in its original format.
- The ability to visit off-site locations in person. Where relevant, every effort should be made to conduct at least an annual site visit to all major or material outsourcers to assess the level, timeliness and quality of their information flows and/or reporting.
- Practical ramifications stemming from data protection legislation, particularly as it relates to security, localisation requirements, and international data transfers.
- Resilience of outsourcing partners. While most firms conduct comprehensive due diligence at the start of an outsourcing relationship, it is less common to conduct continuous checks to ensure the outsourcer remains effective. All firms should have comprehensive, tested contingency plans to deal with the failure of a third-party provider.
- The inclusion of outsourced arrangements in any ‘dawn-raid’ policy. Firms must remember that regulators, law enforcement agencies, and other authorities may seek access to data and information managed by outsourcing arrangements.
- The right to be informed before any firm data or activity is further outsourced by the third party to another party. This should be covered in the outsourcing contract. Too many firms have found that their data was transferred away from their original outsourcing partner to numerous other entities, thereby increasing the potential for loss, contagion, reputational damage, and concentration risk.
- The inclusion of outsourced arrangements in recovery and resolution plans. This is particularly pertinent for firms required to create a living will, but it will also be critical for all business continuity and disaster recovery plans.
- The maintenance of in-house skills and expertise to oversee outsourced activities.
- As a matter of course, any review conducted on outsourced activities should be reported to the board as part of the firm’s overall risk reporting.
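The testing elements listed above lend themselves to being tracked per outsourcing arrangement. A minimal sketch of such tracking follows; the check names and review intervals are illustrative assumptions, not regulatory requirements, apart from the annual site visit mentioned in the text.

```python
from datetime import date, timedelta

# Checks drawn from the list above; intervals are illustrative only,
# except the site visit, which the text says should be at least annual.
OUTSOURCING_CHECKS = {
    "due_diligence": timedelta(days=365),
    "site_visit": timedelta(days=365),
    "data_protection_review": timedelta(days=365),
    "contingency_plan_test": timedelta(days=180),
    "sub_outsourcing_register": timedelta(days=90),
}

def overdue_checks(last_completed: dict, today: date) -> list:
    """Return checks never performed or past their review interval."""
    overdue = []
    for check, interval in OUTSOURCING_CHECKS.items():
        done = last_completed.get(check)
        if done is None or today - done > interval:
            overdue.append(check)
    return overdue

history = {
    "due_diligence": date(2025, 6, 1),
    "site_visit": date(2024, 3, 15),
}
print(overdue_checks(history, date(2026, 4, 1)))
```

The output of a routine like this is exactly the kind of review result that, as noted above, should flow into the board's overall risk reporting.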
The failure, whether deliberate or unintended, of a third-party outsourcing arrangement is a matter of increasing systemic concern, with notable focus on the concentration risk posed by the small number of cloud and other data service providers.
The Dutch Authority for the Financial Markets and De Nederlandsche Bank published a joint report in November 2025, warning of the increasing systemic risk arising from financial sector reliance on a limited number of non-European IT service providers. It cautions that such dependency amplifies the risk of concentration and systemic disruption, where a failure at a single provider could affect large segments of the financial sector.
“In the current bleak geopolitical climate, there is a risk that state actors will abuse the dependence on digital services as a means of political leverage or as an instrument in a trade conflict,” the Dutch report said.
Firms and their IT suppliers are aware of the risks and are taking measures to mitigate them. Still, the extent to which the solutions provide effective protection against malicious influence from non-European actors “remains to be seen”.
Opportunities
Effective data governance is worth the strategic investment. It enables a firm to treat its data as a key asset rather than simply a custodial liability. Clear line of sight to, and management of, all data is the foundation on which a firm can evidence compliance, accurately report internally and externally, and gain unparalleled insight into emerging risks. Data-driven, evidence-based risk management for regulatory change and control infrastructure is fundamental to the identification and mitigation of personal liability by senior managers.
