An ideal back office boasts a place for everything, and everything in its place. But regulatory scrutiny is akin to an unexpected visit from the in-laws, and institutions can quickly end up with laundry piles stuffed under the bed and dirty dishes hidden in the oven. Quick fixes lead to more hassle in the long run, and with cost pressures mounting too, banks are starting to look for a cost-effective way to tidy up, once and for all.
Bennett Egeth, president of investment management for reference data and risk solutions at Broadridge, points out that as financial institutions bring in multiple solutions intended to improve back-office processing, they can end up in an even bigger mess.
Banks that opt for multiple systems find themselves having to act as their own system integrators, he says, with multiple solutions spread across several different departments.
He says: “Large banks and asset managers often end up with duplicate systems, and no real sense of how much they’re spending.”
“Understanding the real cost of enterprise data management is very hard to do. If you measure the mechanics of the data cleansing and distribution process, at first glance it may not seem like a large dollar amount. In reality, once you add the people and technology costs of each disparate system, we’re looking at potentially tens of millions of dollars.”
Egeth adds, however, that even bigger costs can result from poor data. “With stock record breaks, trade fails and inaccurate regulatory reports, a significant portion of the operations expense within banks is a result of data issues. A small improvement in data quality can have a large impact across the enterprise.”
“But, because the operational issues are so far downstream, they’re not always attributed back to the source of the data issues.”
Addressing this from a reconciliations perspective, Peter Webb, senior product manager at SmartStream, agrees that improving data quality could solve many of the back office’s inefficiencies.
He says: “Throughout the trade lifecycle there is opportunity for trades to fail, and they’re failing because of problems with the reference data or the trade capture in the first place.”
“If banks can fix the issues with the data upfront, they’re less likely to have those issues in the reconciliations process, which means they can achieve better straight-through processing (STP)—which is what everyone is trying to achieve.”
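The matching logic Webb alludes to can be sketched in miniature. This is purely illustrative (the record layout, field names and matching key are assumptions, not a description of SmartStream’s product): trades from two sources are paired on a trade ID, field-level mismatches become “breaks” needing manual repair, and clean matches pass straight through — so a single bad reference-data field is enough to knock a trade off the STP path.

```python
# Toy two-way trade reconciliation (illustrative only).
# Records match on trade_id; any differing field produces a "break".

def reconcile(internal, counterparty, fields=("isin", "quantity", "price")):
    """Return (matched, breaks, unmatched) for two lists of trade dicts."""
    cpty_by_id = {t["trade_id"]: t for t in counterparty}
    matched, breaks, unmatched = [], [], []
    for trade in internal:
        other = cpty_by_id.get(trade["trade_id"])
        if other is None:
            unmatched.append(trade["trade_id"])
            continue
        diffs = {f: (trade[f], other[f]) for f in fields if trade[f] != other[f]}
        (breaks if diffs else matched).append((trade["trade_id"], diffs))
    return matched, breaks, unmatched

internal = [
    {"trade_id": "T1", "isin": "DE0001", "quantity": 100, "price": 99.5},
    {"trade_id": "T2", "isin": "FR0002", "quantity": 50, "price": 101.0},
]
counterparty = [
    {"trade_id": "T1", "isin": "DE0001", "quantity": 100, "price": 99.5},
    # A stale reference-data record on one side gives a different ISIN:
    {"trade_id": "T2", "isin": "FR0099", "quantity": 50, "price": 101.0},
]

matched, breaks, unmatched = reconcile(internal, counterparty)
print(matched)    # T1 passes straight through
print(breaks)     # T2 breaks on the mismatched ISIN
```

Fixing the reference data upstream — one correct ISIN at the source — removes the break before the reconciliation ever sees it, which is Webb’s point about better data leading to better STP.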
Webb points out that, while the total number of trades made is increasing, the number of failed trades is too. Equally, although business is starting to pick up again after the financial crisis, revenues have yet to catch up.
“Banks are looking for ways to reduce costs and to become leaner in the back office. That can only be done through increased automation, increased STP and by having better data flowing through the systems, all of which will lead to faster settlement.”
In the post-crisis regulatory environment, operations departments are facing increasing scrutiny, and utilities are starting to emerge as a method for relieving this pressure.
Back in 2014, the Depository Trust & Clearing Corporation (DTCC) came together with six banks to create Clarient Global, an entity data utility solution designed to reduce operational complexity and to address regulatory reporting requirements.
CEO of Clarient Global Matt Stauffer says: “Looking at the entire trade lifecycle, there was a clear need to create a utility model bringing multiple firms together, creating one solution that benefits a whole set of users.”
“The idea is to reduce costs and risks for banks, broker dealers, asset managers and institutional clients in those areas where there is a high degree of redundancy affecting their operational activities.”
He draws attention to the swathe of regulatory demands around legal entity data and know-your-client rules, and the need for better data transparency under the likes of the European Market Infrastructure Regulation and the Markets in Financial Instruments Directive II.
“Each of these events created more cost and more operations and compliance activity, not only within a bank but also among the banks’ clients,” he says.
Webb also notes an increased interest in the utility model. SmartStream’s Reference Data Utility for standardising data processing launched in October 2015, likewise developed in collaboration with three major banks.
Now, Webb says, the same principles can be applied to other operations, including reconciliations.
“Having a single source of reconciliations across an organisation comes back to economies of scale.
“Rather than having multiple products and teams distributed globally and conducting reconciliation processes, centralising can reduce costs significantly.”
Organisations are also starting to make their internal utilities accessible to third parties, which “effectively turns the reconciliations business into a revenue generator rather than a cost”, says Webb.
Stauffer also calls Clarient a resource for managing relationships on a single interface. Previously, all transaction relationships would run through the same processes and activities, and every time any section of regulation or in-house policy changed, that change was made separately for every relationship and client affected.
He says: “That method is inefficient because of the time it takes to collect that information. And the quality of the data is affected because it has to go through multiple touch points. The benefit of a utility is that you can perform each process once, to the benefit of multiple users.”
“The service isn’t only for our founding members, but for the broader industry including buy-side, sell-side and custodian clients. As usage and take-up continues to build, the community becomes very powerful,” adds Stauffer.
According to Egeth, however, the future is in managed services. This way, he says, banks can benefit from the improved efficiency in the back office, without making investments in-house.
“Regulatory changes have created a lot of work and required a large spend just to tread water and stay in business, and those rules aren’t going to end. The maintenance cost of systems is huge, and everyone is just patching up aging systems.”
While those running operations departments may have concerns about outsourcing that function, ultimately, Egeth says, the vast improvement in data quality and the amount of money banks could be saving typically win out over time.
Egeth also suggests that mandating service providers eliminates an element of the data risk, offloading some of the responsibility on to that provider, who commits to certain service level agreements (SLAs). While there are rarely internal penalties, a service provider can be held accountable.
“Banks don’t tend to have those SLAs internally. There is no fine or penalty, and no indemnification against losses, so in that way a managed service can improve service and decrease risk.”
“There is a lot of noise around utilities, but often the theory is that a utility will grow. That model only works if the utility owner plans to sell it. That’s not in alignment with clients.”
“In the long-term, Broadridge can increase the decision-making tools that firms have, and reduce the error rate and the operational risk. Those benefits ripple through all levels of an organisation,” Egeth adds.
Stauffer remains firmly in the utilities camp, saying: “A managed service can help a firm to outsource or displace some costs, but it does not provide the broader benefit of reducing the number of bilateral interactions and eliminating the redundant processing that is occurring.”
He argues that the industry is on the move from document-driven processing to digitised, data-driven processing, and that throughout all reconciliations and settlement processing, every decision and every action comes down to data.
“We are building the capability to extract the relevant actionable data, validate it to ensure accuracy and reliability, and pass it on to the clients. That way, data exchange is happening with more STP as opposed to each firm having to manually re-key content from documents they receive.”
Of course, the drive for STP is partially down to regulation, but the consensus appears to be that, while mandated, it’s also just good business sense.
Stauffer says: “There is a regulatory component, but it goes well beyond that. A lot of these developments would have occurred anyway. What the regulation has done is accelerate them.”
Webb adds that there are several other non-regulatory benefits, both in tangible financial savings such as reduced software and infrastructure costs, and in improved services for clients.
He says: “Banks are able to increase match rates through better data, so they’re not having to deal with breaks, which leads to better STP and faster settlement. That can improve customer satisfaction. If an organisation has fewer problems with its trades, it is more likely to retain its existing relationships, and to generate additional business.”
He adds: “Settling more quickly also means reducing the risk of the organisation, and that means clients are happy, auditors are happy, and regulators are happy.”
But banks can always do more to address reconciliations errors and trade breaks. Egeth suggests that while institutions are becoming more aware of the potential costs, they’re “still more inclined to patch up old systems to meet regulatory requirements”.
“Some banks are smarter about this than others,” he says. “I don’t know if they’re aware of the need to trace the root cause of errors back to data issues.”
The common goals are clear: a more holistic back office, better STP rates and fewer broken trades. But whether the best route is utility models or managed services perhaps remains to be seen. Either way, the objectives seem to be well within the industry’s grasp.
“Back office costs are massive,” Webb concludes. “We have helped banks and financial institutions achieve a lot already, but there are still huge reductions to make, and we have the capabilities to do that.”