Data interoperability as a competitive edge
21 Jan 2026
Tahlia Kraefft explores how the capability to seamlessly integrate, distribute, and use data across fragmented systems in asset servicing is no longer merely a mechanism for operational efficiency but a driver of innovation and a strategic advantage
Image: ronstikW/stock.adobe.com
Data interoperability has evolved from back office operational plumbing into a competitive differentiator for asset servicers, allowing firms to make quicker decisions, lower costs, and improve client experiences. With asset servicing volumes rising steeply, up 25 per cent year-over-year (YoY), and 67 per cent of errors stemming from data problems, the capacity to seamlessly connect diverse legacy and new digital systems through APIs and the cloud is no longer just a matter of business agility but central to a firm’s survival and strategic advantage.
This shift comes against a backdrop of providers moving from siloed legacy systems to API-driven, modular architectures that enable operational alpha and real-time client services.
From operational plumbing to strategic edge
Data interoperability has moved decisively from an IT department requirement to a distinguishing commercial edge, driven by the growth of complex multi-provider ecosystems, demand for real-time data insights, and stringent regulatory requirements. Asset servicing faces strong pressure from margin compression, the swift rise of alternative assets, and the need to provide personalised client experiences. Consequently, switching costs, onboarding speed, and data usability have shifted from operational concerns into front-line considerations that affect alpha, cost, and client experience.
Robin Hasson, head of reconciliations solutions, Smartstream, describes interoperability as not just an operational upgrade but a strategic game-changer: “[It is] no longer a technical checkbox; it’s the foundation for growth. By moving from managing data to trusting it, asset servicers can accelerate innovation, reduce operational drag, and deliver richer insights to clients. In a market where speed and certainty matter, interoperability isn’t just a differentiator — it’s a catalyst for sustainable success.
“Interoperability shifts the value proposition for asset servicers. Success no longer comes from simply holding data in custody, it comes from how effectively you move that data and integrate it into a client’s broader ecosystem. This change requires a pivot in investment, product design, and governance.”
Madhu Ramu, head of product for software and lending solutions at S&P Global Market Intelligence, says the providers that approach data as a high-performance product and prioritise transparency and ease of access will be the winners: “Investment priorities are moving away from maintaining closed systems and toward building open, scalable back-ends. True interoperability requires more than just a modern interface.
“It requires an underlying architecture that can handle massive volumes without a proportional increase in headcount. For a service provider, the goal is to improve unit economics by automating the lifecycle of a trade or a corporate action so that data flows without friction. Success is now measured by the ability to scale volume without scaling costs.”
Clément Miglietti, chief product officer and chief technology officer at NeoXam, describes interoperability as a competitive capability that separates leaders from laggards: “Investments in modern data platforms, open interfaces, and governed data processes are prioritised over incremental system patches. Data governance is no longer a back office compliance function; it is the commercial foundation for trust in every number an asset servicer delivers.
“The ability to demonstrate quality, lineage, and control underpins trust with clients and regulators alike. Servicers that embed these capabilities into their technology and operational model are better positioned to adapt to evolving asset classes, regulatory change, and the integration demands of large institutional clients.”
Shift in how asset managers select service providers
Asset managers now choose providers on different criteria, based on ease of integration, API maturity, and data transparency. This marks a shift from relationship-led selection to architecture-led selection. The sector is looking for time-to-integrate metrics, API catalogues and documentation, self-service data access, and event-driven rather than batch data delivery. Asset managers favour providers that plug in rather than require custom builds, and they are becoming intolerant of proprietary formats and manual data handling.
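To make the event-driven versus batch distinction concrete, the short Python sketch below contrasts loading a nightly position file with applying position events as they arrive. It is purely illustrative: the accounts, fields, and data are hypothetical and do not reflect any particular provider's feed.

```python
# Hypothetical sketch: batch file polling versus event-driven delivery of positions.
# All file contents, accounts, and field names are illustrative, not any provider's real API.
import csv
import io
from typing import Iterable, Iterator

SAMPLE_BATCH = """account,isin,quantity,as_of
ACC-001,US0378331005,1200,2026-01-20
ACC-001,GB0002634946,500,2026-01-20
"""


def load_nightly_batch(raw_csv: str) -> list[dict]:
    """Batch model: positions arrive once a day as a flat file and must be
    re-parsed and reconciled in full before downstream systems can use them."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def position_event_stream() -> Iterator[dict]:
    """Stand-in for an event feed (for example a message queue subscription):
    each change is published as it happens, with its own timestamp."""
    yield {"account": "ACC-001", "isin": "US0378331005",
           "quantity": 1250, "event_time": "2026-01-21T09:14:03Z"}
    yield {"account": "ACC-001", "isin": "GB0002634946",
           "quantity": 480, "event_time": "2026-01-21T09:15:11Z"}


def apply_events(positions: dict, events: Iterable[dict]) -> dict:
    """Event model: downstream views are updated incrementally, so consumers
    see near real-time positions instead of waiting for the next batch."""
    for event in events:
        positions[(event["account"], event["isin"])] = event["quantity"]
    return positions


if __name__ == "__main__":
    batch = load_nightly_batch(SAMPLE_BATCH)
    positions = {(row["account"], row["isin"]): int(row["quantity"]) for row in batch}
    positions = apply_events(positions, position_event_stream())
    print(positions)
```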
Hasson of Smartstream comments: “Asset managers increasingly treat providers as nodes in a data supply chain. Selection hinges on whether a provider can produce, persist, and exchange rich, structured data that flows end-to-end across cash, positions, instructions and corporate actions – this is to minimise manual reconciliation and latency.”
Standards like ISO 20022 are being assessed less as “messaging formats” and more as semantic data models that enable consistent identifiers, attributes, and status propagation across the event lifecycle.
Hasson goes on to explain that interoperable providers deliver considerably lower error rates and require less manual intervention during market spikes. He comments: “As a result, selection discussions now focus on measurable benefits — error reduction, faster decision cycles, and a shift from manual remediation to automated prevention. ISO 20022 adoption and open data strategies are seen as structural solutions. Regulatory pressures and multi-market integration further drive preference for providers with native standards support and proven interoperability tools, making standards maturity a key differentiator in requests for proposals (RFPs) and renewals.”
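The following Python sketch illustrates what treating a standard as a semantic data model, rather than a message format, can look like in practice: one canonical corporate-action object carrying consistent identifiers and a status that propagates through the lifecycle. The field names and statuses are simplified stand-ins, not the actual ISO 20022 elements.

```python
# Illustrative sketch of a canonical corporate-action event with consistent
# identifiers and status propagation. Field names are hypothetical, not ISO 20022 elements.
from dataclasses import dataclass, field
from enum import Enum


class EventStatus(Enum):
    ANNOUNCED = "announced"
    ELECTION_OPEN = "election_open"
    CONFIRMED = "confirmed"
    PAID = "paid"


@dataclass
class CorporateActionEvent:
    corp_action_ref: str          # one identifier reused by every downstream system
    isin: str                     # consistent instrument identifier
    event_type: str               # e.g. "cash_and_stock_merger"
    status: EventStatus = EventStatus.ANNOUNCED
    history: list[EventStatus] = field(default_factory=list)

    def advance(self, new_status: EventStatus) -> None:
        """Status propagation: the same object (and its audit trail) is updated,
        instead of each system re-deriving state from its own copy of a file."""
        self.history.append(self.status)
        self.status = new_status


if __name__ == "__main__":
    event = CorporateActionEvent("CA-2026-0001", "US0378331005", "cash_and_stock_merger")
    event.advance(EventStatus.ELECTION_OPEN)
    event.advance(EventStatus.CONFIRMED)
    print(event.status.value, [s.value for s in event.history])
```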
Miglietti of NeoXam reflects: “Asset managers are no longer buying isolated services; they’re buying into data ecosystems. Interoperability has become a core test of whether a provider can deliver consistent, auditable information across the entire operating model – not just a single output. Firms recognise that fragmented legacy systems and point-to-point integrations create risk, delay reporting and limit adaptability. What matters increasingly is whether a provider supports a unified data lifecycle — acquisition, enrichment, consolidation, and distribution — so that the same controlled dataset can be reused across compliance, performance, and risk workflows.”
Ramu spells out that to understand how selection criteria are changing, you must look at the reasons why there is such deep, ongoing frustration with the black box model. He says: “For decades, the industry has been defined by proprietary systems that act like walled gardens. This lack of connectivity creates a ‘sticky’ relationship for all the wrong reasons. Asset managers are often still held hostage by their providers because the operational pain and systemic risk of moving data are simply too high. Switching providers is not just a business decision, it is an operational nightmare that many firms avoid, even when service levels are failing them.
“That dynamic is fundamentally shifting. We are seeing a significant ‘pendulum swing’. For years, the trend was toward total outsourcing to a single service provider. However, many asset managers have found that this creates severe limitations in data availability and quality. Different asset class teams often require greater control and more granular data than a single, broad provider can support. This is why we see firms selectively moving critical processes back in-house. They are seeking out infrastructure that acknowledges this reality, prioritising a hybrid model where a manager can outsource commodity functions but keep the high-alpha, data-intensive processes under their own roof.
“When a firm looks at a provider today, they are no longer just buying a standalone service, they are evaluating a node in their broader network. Managers now prioritise ‘velocity of data’ and the ability to maintain a hybrid operational model. They want a provider who acts as an extension of their own infrastructure, not a replacement for it.”
He offers an example to highlight the complexity of global regulatory compliance, explaining that a trade and transaction reporting system such as Cappitech cannot operate in a vacuum. It needs a seamless, automated flow of data from front-office execution platforms and back-office accounting systems to ensure that every reportable field is accurate and submitted within stringent regulatory deadlines. If these systems do not communicate with each other, Ramu says, the firm faces not only operational friction but substantial regulatory risk.
Ramu continues: “If a provider’s data is trapped behind a proprietary wall, they are no longer an asset, they are a source of systemic operational risk. Business leaders now demand that their technology spend enables growth and allows for selective control. They are choosing partners who help them scale their internal capabilities, not partners who keep them dependent on a closed loop.”
Gus Sekhon, head of product at FINBOURNE Technology, states that data interoperability is increasingly core to how asset managers judge providers: “Firms can no longer afford fragmented data architectures that rely on bespoke integrations and manual processes. These legacy set-ups create friction, limit growth, and hamper portfolio diversification.
“Asset managers are prioritising platforms that can seamlessly integrate a vast array of data sources, making it accessible across the entire organisation. They want data to move cleanly between teams and workflows, with a real-time view that supports day-to-day decision-making and client service.
“With asset managers leaning more on advanced analytics and expanding into new asset classes, they risk being constrained by siloed systems. They’re looking for platforms that can support a more data-intensive operation.”
Role of open-data architecture in facilitating interoperability
Open-data architecture acts as a foundational framework, allowing seamless data exchange, conversion, and usage across different systems and organisations. By using open standards and separating compute from storage, it removes vendor lock-in, reduces data silos, and supports real-time, efficient data integration, a prerequisite for analytics, modern business intelligence, and AI.
Miglietti describes open-data architecture as the structural foundation that turns interoperability from an aspiration into a tangible reality. He comments: “A centralised data backbone with modular components for validation, enrichment, and distribution enables firms to ingest reference, market and operational data from diverse sources once — then feed it downstream to reporting, analytics, risk, and oversight tools without repeated reconciliation.
“This design breaks down internal silos and replaces brittle point-to-point connections with governable, reusable data flows, enabling consistency and transparency even as volumes grow and requirements change. The result is a more resilient, extensible operating model where business consumers can trust and act on the same dataset.”
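A minimal Python sketch of the ‘ingest once, distribute many times’ pattern Miglietti describes is shown below. The consumer names, reference data, and fields are hypothetical, but the shape is the point: validate and enrich a record once, then fan the same governed record out to several downstream consumers.

```python
# Hedged sketch of a centralised backbone: one validation/enrichment step feeds
# several downstream consumers from the same record. All names and data are made up.
from typing import Callable


def validate(record: dict) -> dict:
    # Validation happens once, at ingestion, rather than in every consuming system.
    if not record.get("isin"):
        raise ValueError("missing instrument identifier")
    return record


def enrich(record: dict, reference_data: dict) -> dict:
    # Enrichment against a single reference source, so every consumer
    # sees the same issuer name and currency.
    return {**record, **reference_data.get(record["isin"], {})}


def distribute(record: dict, consumers: list[Callable[[dict], None]]) -> None:
    # Reporting, risk, and performance all read the same governed dataset.
    for consumer in consumers:
        consumer(record)


if __name__ == "__main__":
    reference = {"US0378331005": {"issuer": "Apple Inc", "currency": "USD"}}
    raw = {"isin": "US0378331005", "quantity": 1000}
    golden = enrich(validate(raw), reference)
    distribute(golden, [
        lambda r: print("reporting:", r),
        lambda r: print("risk:", r),
    ])
```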
Ramu explains that open-data architecture is moving beyond the simple act of shifting files from one place to another and suggests we are entering a new era of ‘context interoperability.’ He remarks: “For years, the industry focused solely on plumbing: how to move raw data from Point A to Point B. But raw data is often useless without a map to explain what it means. This is why we are shifting toward a Model Context Protocol (MCP) approach.
“Think of this as the difference between receiving a static PDF of a bank statement versus having a live, interconnected feed that understands your entire portfolio. To make this work for a complex event like a cross-border merger and acquisition, we rely on three interconnected concepts: Taxonomy, RDF, and Ontology.
“Taxonomy serves as the filing system. It is how we categorise information so an AI can find it. Instead of seeing a generic ‘corporate action,’ the taxonomy immediately classifies it as a ‘Cash-and-Stock Merger.’ This categorisation tells the system exactly which workflow to trigger. From there, the Resource Description Framework (RDF) provides the grammar. In a standard spreadsheet, a system does not actually ‘know’ the relationship between a parent company and its subsidiary. RDF defines the data as a web of relationships. It tells the system that the ‘Target Company’ is being acquired by the ‘Acquiror,’ and that this specific ‘Shareholder’ holds ‘Tax Lots’ purchased at different dates and prices. Finally, Ontology provides the shared dictionary and logic. It ensures that if one system calls a payment ‘Cash in Lieu’ and another calls it a ‘Fractional Share Liquidation,’ the AI recognises they are the same thing. More importantly, the ontology holds the rules: it explains how a specific tax jurisdiction treats the cash portion of the merger versus the stock portion.
“By combining these, we can use the Model Context Protocol to create a standardised ‘handshake.’ It allows a manager’s AI agents to talk directly to the provider’s data and tools. Instead of just delivering a file about the merger, we are sharing the ‘context.’ We provide the classification, the relationships, and the logic that explains how that merger affects the cost basis and the resulting portfolio position. This eliminates the need for custom coding because every system is finally using the same map. At S&P Global, we see this as a shift from ‘data sharing’ to ‘context sharing.’
“We eliminate the reconciliation burden because the AI, the accounting system, and the reporting platform are all querying the same ‘Golden Record’ through the same machine-readable logic. This allows managers to use AI to react to market shifts in real time, because the infrastructure finally provides the context-aware intelligence that modern asset servicing requires.”
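The toy Python example below gives a flavour of the taxonomy, relationship, and ontology layers Ramu describes, using plain triples and dictionaries rather than real RDF tooling or the Model Context Protocol itself. Every entity, synonym, and tax rule in it is invented for illustration.

```python
# Toy illustration of taxonomy / RDF-style relationships / ontology,
# using plain Python structures. All entities and rules are hypothetical.

# Taxonomy: classify the raw event so the right workflow is triggered.
TAXONOMY = {"CA-2026-0001": "cash_and_stock_merger"}

# RDF-style triples: data expressed as (subject, predicate, object) relationships.
TRIPLES = [
    ("TargetCo", "acquired_by", "AcquirorCo"),
    ("Shareholder-42", "holds", "TaxLot-7"),
    ("TaxLot-7", "instrument", "TargetCo"),
    ("CA-2026-0001", "affects", "TargetCo"),
]

# Ontology: a shared dictionary so differently named concepts resolve to one term,
# plus a (simplified, invented) jurisdiction rule attached to the shared concept.
SYNONYMS = {"cash_in_lieu": "fractional_cash",
            "fractional_share_liquidation": "fractional_cash"}
TAX_RULES = {("fractional_cash", "US"): "taxable as capital gain"}


def related(subject: str, predicate: str) -> list[str]:
    """Walk the relationship graph instead of joining flat files."""
    return [o for s, p, o in TRIPLES if s == subject and p == predicate]


if __name__ == "__main__":
    event_id = "CA-2026-0001"
    print("workflow:", TAXONOMY[event_id])
    target = related(event_id, "affects")[0]
    print("acquiror:", related(target, "acquired_by")[0])
    concept = SYNONYMS["cash_in_lieu"]
    print("US treatment of", concept + ":", TAX_RULES[(concept, "US")])
```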
Hasson of Smartstream notes: “Open data architecture ensures standardised objects are treated as first-class citizens in the storage and processing layers — schemas, catalogues, and lineage — not only at the interfaces. This allows consistent ‘create once, enrich once, consume consistently’ patterns across the event lifecycle, which unlocks real-time reconciliation, status propagation, and analytics improvements.
“Open architectures mandate published APIs, portable formats for unstructured and structured data, and failure-aware interoperation. This reduces provider lock-in, simplifies cloud migrations, and protects integrity during outages or counterparty delays.
“And with standardised semantics, governance controls — lineage, audit, and accuracy — become programmatically enforceable and measurable, supporting both regulatory commitments and continuous operational improvement.”
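As a rough illustration of lineage becoming programmatically enforceable, the Python sketch below attaches an ordered lineage trail to a dataset and fails fast if a mandated step is missing. The schema, step names, and governance rule are hypothetical.

```python
# Minimal sketch of lineage as a first-class, programmatically checkable property
# of a dataset, rather than documentation held outside the data. Names are invented.
from dataclasses import dataclass, field


@dataclass
class Dataset:
    name: str
    rows: list[dict]
    lineage: list[str] = field(default_factory=list)   # ordered record of transformations


def transform(ds: Dataset, step: str, fn) -> Dataset:
    """Every transformation appends to the lineage trail as it runs."""
    return Dataset(ds.name, [fn(r) for r in ds.rows], ds.lineage + [step])


def enforce_lineage(ds: Dataset, required_steps: list[str]) -> None:
    """Governance as code: fail fast if a mandated step has not been applied."""
    missing = [s for s in required_steps if s not in ds.lineage]
    if missing:
        raise RuntimeError(f"{ds.name} missing governed steps: {missing}")


if __name__ == "__main__":
    raw = Dataset("positions", [{"isin": "US0378331005", "qty": "1000"}],
                  ["ingest:custodian_feed"])
    typed = transform(raw, "validate:schema_v1", lambda r: {**r, "qty": int(r["qty"])})
    enforce_lineage(typed, ["ingest:custodian_feed", "validate:schema_v1"])
    print(typed.lineage)
```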
Importance of interoperability standards
Implementing interoperability frameworks such as ISO 20022 and the Financial Industry Business Ontology (FIBO) is key to ensuring smooth data exchange between diverse systems.
Adhering to these standards reduces integration friction, interpretation risk, and vendor lock-in. Open architectures are built to move beyond bilateral connections and enable a multi-vendor ecosystem in which participants can collaborate, shifting from a rigid system to a flexible, modular environment that uses open standards, such as APIs, to allow interoperability.
Miglietti states: “Standardisation around common schemas and messaging frameworks is transforming RFPs from generic output checks into deeper inquiries about data governance. Rather than simply supporting multiple file formats, providers are now asked to articulate how they map and normalise data across domains — and how they adapt to evolving industry standards while preserving data integrity downstream. Standards such as ISO 20022 and conceptual vocabularies act as anchors for semantic clarity, and firms are looking for providers whose platforms can ingest, harmonise and distribute data consistently in line with these structures.”
Hasson comments: “Requirements now often specify canonical data classes and relationships rather than only interface endpoints. Vendors are asked to demonstrate lossless transformation and ingestion of data between systems and touch points. RFP sections on integration increasingly mandate open/published APIs, consistent versioning, and semantic stability to avoid brittle, format-specific mappings. Internal policies emphasise selecting providers that prevent lock-in by adhering to open semantics and ensuring portability.”
Ramu reasons that while the word ‘standardisation’ sounds like a dry, back office topic, it is a key lever to unlock artificial intelligence, remarking: “Standards like ISO 20022 and FIBO, alongside frameworks like the Data Management Capability Assessment Model (DCAM), are finally eliminating the ‘translation tax’ that has plagued this industry for years. This is not a new conversation; the industry has been talking about these standards for decades. However, we have finally moved from theoretical discussion to massive, practical success stories that are changing the way firms buy technology.
“Today, clients are pushing for a standardised implementation model within their data platforms to promote ‘explainability’. They want to see exactly how a data result was achieved and understand the logic behind the transformation. This is where FIBO and DCAM become critical. They provide the schema ontology and the governance controls that ensure data quality across multiple providers.
“It is no longer enough to just deliver a number; you have to provide the completeness of the data definition at the asset class or transaction level. This allows a manager to trace the lineage of a specific trade or transaction back to its source with total confidence.
“A prime example is the global migration to ISO 20022 for Corporate Actions and securities processing. We have seen incredible success in the US, where the DTCC’s transition to ISO 20022 messaging for corporate actions fundamentally improved the speed and accuracy of the entire market.
“Now, that same momentum is hitting Europe. Euroclear’s adoption of these standards for corporate actions is a massive step toward global harmonisation.
“Unlike old legacy formats, ISO 20022 allows for incredibly rich, structured data that captures the full complexity of an event.
“When a provider sends a corporate action notification, it is an ontological package that understands the specific tax implications, the election deadlines, and the resulting impact on the portfolio.
“When we talk about AI, this distinction is critical. AI is only as good as the data it consumes. If you feed a large language model ‘flat’ or unstructured data, you get hallucination and unreliable outputs. But if you feed it ontological data from a shared schema that understands the inherent relationships between an issuer and a complex corporate event, you get true intelligence.
“Standardisation allows managers to move from a ‘trust but verify’ model to one where the data is inherently reliable because it follows a globally recognised, AI-ready schema. It turns a massive pile of data into a strategic knowledge graph.”
As data interoperability becomes the minimum entry criterion for providers, quality, consistency, and usability will be the discerning factors. Ecosystem participation will become more important than traditional bilateral data integration, shifting the focus from two-party connections to collaboration within larger shared networks.
In an industry where the ability to integrate will be as crucial as the standalone service, providers that treat data interoperability as a core business capability, not a mere compliance activity, will set the standard for the next era of asset servicing.
