20 Mar 2024

Streamlining corporate actions

Klea Neza reviews the importance of data quality and automation within corporate actions

Corporate actions processing has long been a primary source of operational risk and potential loss for service providers operating in the post-trade sector. Industry working groups have focused heavily on steps to promote standardisation, to improve data quality and to drive automation across the trade lifecycle. This is fundamental to improving information flows between issuer and investor and helping custodians to ensure that shareholders are able to exercise their rights, and receive due entitlement, as equity owners.

Historically, these tasks have been manual and time-consuming. Good-quality data and automation are key to processing corporate actions efficiently, whether those events are dividends, mergers and acquisitions, stock splits, rights issues or spin-offs. Yet firms face dilemmas as they work to improve the efficiency of their corporate actions processing, and these choices affect both shareholders and the businesses themselves.
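
To make those event types concrete, here is a minimal Python sketch of how they might be modelled in a processing system. It is purely illustrative: the class and field names are hypothetical, not any vendor's data model.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum


class EventType(Enum):
    """Corporate action event types named above."""
    CASH_DIVIDEND = "cash_dividend"
    MERGER = "merger"
    STOCK_SPLIT = "stock_split"
    RIGHTS_ISSUE = "rights_issue"
    SPIN_OFF = "spin_off"


@dataclass
class CorporateActionEvent:
    """A deliberately simplified, hypothetical event record."""
    event_id: str
    event_type: EventType
    security_id: str                       # e.g. an ISIN
    announcement_date: date
    election_deadline: date | None = None  # only voluntary events carry one
```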

Importance of data quality

Good data quality allows businesses to make informed, crucial decisions reliably, but deficiencies can result in operational errors and increase the risk of missed shareholder entitlements. The data used must serve the outcome it is intended for.

The effect of poor data quality can be exemplified through a variety of operational issues. Experian, a multinational data analytics and consumer credit reporting company, disclosed in its 2022 Global Data Management Research Report (last updated in January 2024) that: “85 per cent of organisations indicate that poor-quality contact data for customers negatively impacts their operational processes and efficiency, and in turn, hinders the chances of being flexible and agile.”

The report further stated that poor-quality data has had a ripple effect, resulting in the waste of 42 per cent of resources and additional costs, as well as significant damage to the reliability of and trust in analytics.

Mistakes often occur at the data entry phase, creating the need for manual work to rectify them. As such, investing in good data quality up front can help firms avoid the losses that unreliable information produces.
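
One common mitigation is to validate records at the point of entry rather than repair them downstream. The sketch below, which assumes the simplified record shape above, illustrates the idea; the field names and rules are assumptions, not a production rule set.

```python
from datetime import date


def validate_event(record: dict) -> list[str]:
    """Return the data-quality problems found in a raw event record.

    Illustrative only: real validation rules are far richer, but the
    principle is to reject bad data at entry rather than fix it later.
    """
    problems = []
    for field in ("event_id", "event_type", "security_id", "announcement_date"):
        if not record.get(field):
            problems.append(f"missing required field: {field}")

    # A deadline that precedes the announcement is almost certainly an entry error.
    deadline = record.get("election_deadline")
    announced = record.get("announcement_date")
    if isinstance(deadline, date) and isinstance(announced, date) and deadline < announced:
        problems.append("election_deadline precedes announcement_date")
    return problems
```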

Yogita Mehta, commercial product director for corporate actions at SIX, commented: “When the data in hand is of poor quality, this in turn makes the process labour intensive and arduous. Paired with a limited adoption of existing standards across the corporate actions industry, this means legacy technology systems can no longer keep up with the level of data needing to be processed.”

Similarly, Adam Cottingham, head of asset servicing at SmartStream Technologies, highlighted how good data quality has allowed the business to prosper, enhancing straight-through processing (STP) rates and operational efficiency.

He stated: “Operational efficiency is predicated on data quality both for enabling STP and operational exception management. A combination of the status of a process, the critical date for its deadline and the business attribute or group are determinants of STP and exceptions management.”

If there are discrepancies in these items owing to poor data quality, then STP and exception management will be impaired.
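
Those three determinants suggest a simple triage rule. The following sketch is an assumption about how such logic might look, not SmartStream's implementation; the statuses and thresholds are invented for illustration.

```python
from datetime import date, timedelta


def route_to_exception_queue(process_status: str,
                             critical_date: date,
                             attributes_complete: bool,
                             today: date,
                             warning_window: timedelta = timedelta(days=2)) -> bool:
    """Decide whether an event drops out of STP into an exception queue.

    Mirrors the three determinants quoted above: process status, the
    critical date for the deadline, and the business attributes.
    """
    if process_status != "matched":      # announcements unconfirmed or in conflict
        return True
    if not attributes_complete:          # key business attributes are missing
        return True
    if critical_date - today <= warning_window:  # deadline is at risk
        return True
    return False                         # safe to process straight through
```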

Cottingham then underlined that “accounting accuracy is determined by data quality, to support the accrual of outcomes and calculation of actuals along with tax, and their provision into profit and loss reporting, net asset value, investment book of record and accounting updates.”

His third point affirmed the importance of grounded data quality in the decision-making process, and the necessity of including all eligible position holders in an event lifecycle. “[They must be] communicated with, called to action on time and provided with accurate business information,” he said.

Impact on shareholders and business

Mike Wood, general manager for asset servicing at Broadridge, underlined how inaccurate data can lead to inefficiencies, delays and potential losses.

He also stressed how issues can occur due to the sheer number of announcements organisations are consolidating from different sources and in different formats.

Wood warned: “There are plenty of horror stories of firms electing on the ‘wrong’ option number due to the different presentation between custodians.

“Missed election deadlines, or clients not being informed of a voluntary corporate action event, are other examples where stakeholders and shareholders are negatively impacted by errors caused by poor data quality.”

Misprocessed corporate actions resulting from poor data quality can put firms at a competitive disadvantage, leaving them subject to claims from their clients and eating into development resources that could otherwise go towards innovation or product development.

Cottingham explained: “Stakeholders and shareholders rely on accurate data to make informed decisions about investments and strategies. When data quality is poor, decisions may be based on flawed assumptions, leading to poor outcomes and financial implications.

“Any technology solution should aim to satisfy regulatory barriers by keeping up to date with market standards as well as evolving the solution’s flexibility and configuration to produce accurate and timely data.”

Delivering further advances in corporate action processing efficiency

There are a number of ways that firms can improve their data quality to ensure reliable and successful corporate actions.

Broadridge advances its corporate actions processing efficiency through rigorous data quality controls and, in parallel, steps to promote automation.

Broadridge’s Wood commented: “Automation is also key to reducing manual intervention, streamlining workflows and minimising risk. Continued investment in workflow tools and automated processes that remain compliant with both industry and regulatory changes is vital for organisations planning for the future.”

Historically speaking, monitoring corporate actions has been a highly labour-intensive process, costing the back-office departments of asset managers and investment banks significant time and money, said SIX’s Mehta.

The priority lies in making firms less reliant on manual processing, she explained, removing fragmentation and establishing a standardised approach to interpreting complex data sets.

She continued: “The importance of automation cannot be underplayed. Embracing the right technology to improve operational efficiency, evaluate different data sources and highlight exceptions will alleviate some of the burden and risk. This will in turn free up people’s time to work more closely on the complex, value-add elements of their job.”

Development in the pipeline

Good data quality and automation remain at the forefront of businesses’ thinking on corporate actions, as firms weigh which investments will most reliably produce material improvements.

Broadridge continues to invest heavily in the capabilities and integrations supported within its global corporate actions and income processing solutions.

Broadridge’s Wood explained: “This currently includes the utilisation of AI capabilities to support enquiries, providing data insights and automating the resolution of specific exception scenarios that arise in the corporate actions lifecycle.”

He continued: “We have also invested in our connectivity with market infrastructure providers such as DTC and CREST to provide automation in business-critical areas, such as elections management, and developed partnerships with data providers and industry utility providers (in respect of claims management and tax reclaims) to ensure our clients can benefit from the efficiencies and improved client service these can offer.”

As well as automation within corporate actions, SmartStream focuses on operational prudence, capable technology and data availability to create an ecosystem where data quality and timeliness enable effective business decision-making.

SmartStream’s Cottingham said: “Updating the Event Master and its eligible positions in real-time is necessary to achieve data quality as the network of external announcements and internal positions evolves through the event lifecycle.

“Establishing this process on best practices defined in industry working groups — underpinned by established standards like ISO 15022 and newer standards like ISO 20022, along with their various interpretations across counterparties — creates a framework for continuous improvement that will assure data quality is future proofed.”
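
To illustrate what those standards buy in practice: ISO 15022 and ISO 20022 corporate action messages carry a four-letter event indicator, the CAEV code, which lets announcements from different counterparties be normalised into a single internal taxonomy. The partial mapping below is a sketch; only a few common codes are shown, and the internal labels are hypothetical.

```python
# Partial map from ISO 15022/20022 CAEV event indicators to an internal
# taxonomy. Real mappings cover the full code list; this one is illustrative.
CAEV_TO_INTERNAL = {
    "DVCA": "cash_dividend",
    "SPLF": "stock_split",
    "MRGR": "merger",
    "SOFF": "spin_off",
}


def normalise_event_code(caev: str) -> str:
    """Translate a counterparty's CAEV code into the internal taxonomy.

    Unknown codes are escalated rather than guessed at, since silent
    misclassification is precisely the data-quality risk at issue.
    """
    try:
        return CAEV_TO_INTERNAL[caev.strip().upper()]
    except KeyError:
        raise ValueError(f"unmapped CAEV code {caev!r}: route to manual review")
```

Routing unknown codes to review rather than guessing keeps counterparty variation from turning into the silent misclassification described earlier.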
