22 March 2017

Dawn of the data

The increasing role that technology and data management have to play in asset servicing was a major talking point at The Summit for Asset Managers in London.

An early session at the one-day event explored the changing nature of what asset managers seek from their service providers. One speaker, representing such a provider, noted that managers are under pressure to achieve more with fewer resources, while facing greater demand from their clients and tighter regulatory constraints.

This environment is driving a move towards consolidating systems, allowing managers to be more cost-effective and to focus on their core business.

One speaker, Chris John of Broadridge, said the biggest trend in asset management is to get rid of “repetitive” functions.

He said: “If it’s not differentiated, get rid of it and give it to a proven technology and solutions provider with a proven record.”

An asset management representative on the panel agreed, saying managers need to focus on what differentiates the firm as an asset manager, while outsourcing tasks that need to be done but don’t necessarily add value.

Another noted that asset managers must put effort into “collaboration between different groups” within the business, as changes don’t add value if they’re implemented in silos.

John went on to reference Broadridge’s Revport solution, suggesting that, going back eight years or so, some 95 percent of its deals were for the technology, which clients installed for themselves.

Now, 95 percent of the time it is the other way around, with Broadridge hosting the solution. This reflects the transformation underway in the asset management industry, he said.

Broadridge, John said, is seeing a shift towards more reliance on service providers to help asset managers grow, and to add value through infrastructure, technology and business-specific expertise.

An asset manager panelist added that his business was born out of a series of boutiques, and is now a global organisation with a “huge amount of complexity, oversight and cost”.

Bundling services with one provider, globally, could significantly reduce costs and improve efficiency, and is a big initiative within the firm. He added, however, that IT teams are working on this alongside other initiatives, and need to prioritise their projects.

Another session also addressed the function of IT departments within asset managers, with speakers suggesting that this part of the business may see sweeping changes over the next 10 years.

One speaker, who represented a portfolio analysis company, said future IT departments will focus more on third-party technology management, with a shift towards external providers.

He suggested that asset managers would rather partner with third parties that focus purely on analytics and systems, rather than build their own. As a result, IT departments will shrink over the next 10 years.

“It’s a big shift in those skillsets, it doesn’t mean they won’t exist, but just in a very different capacity. It’s no longer their business to run technology,” he said.

The internal IT department’s role will shift from building and managing the systems to managing those partners, the speaker said.

He added that asset managers would rather focus on alpha, instead of worrying about upgrading “legacy” systems.

Regulatory reporting was also highlighted as a major challenge in the financial services sector. In a presentation, Gillian Boston, head of business consulting at AutoRek, a provider of automated financial controls and regulatory reporting solutions, said: “There is no doubt that within the financial services sector as a whole the speed of change and increase in complexity is high on the agenda.”

There will be major changes to regulatory reporting under the Markets in Financial Instruments Directive II and associated Regulation, the Common Reporting Standard and the Regulation on Packaged Retail and Insurance-based Investment Products, Boston said.

One common theme underpinning regulatory challenges is data, and the importance of organisations being able to demonstrate that they manage, control and understand their data.

“Organisations have to have transparency of data, good governance and comprehensive audit trails in place to prove they are in control.”

She also highlighted the siloed nature of many legacy organisations, with multiple different systems and different data feeds in different formats. Organisations may well be faced with manual processes and, as volumes grow, this can lead to “the perception that regulation is an increasing burden”.
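As a purely illustrative sketch, and not something presented at the event, the fragment below shows the kind of step such automation replaces: two hypothetical position feeds arriving in different formats are mapped onto a single schema and checked against each other programmatically, rather than being compared by hand. The feed contents, field names and functions are all invented for the example.

```python
import csv
import io
import json

# Hypothetical example: two upstream systems deliver the same positions
# in different formats (CSV and JSON). Normalising them into one schema
# is the kind of step that is often still done manually in siloed firms.
CSV_FEED = "isin,quantity\nGB00B03MLX29,1500\nUS0378331005,200\n"
JSON_FEED = '[{"security": "GB00B03MLX29", "qty": 1500},' \
            ' {"security": "US0378331005", "qty": 250}]'

def load_csv_feed(text):
    # Map the CSV feed's column names onto the common schema.
    return [{"isin": r["isin"], "quantity": int(r["quantity"])}
            for r in csv.DictReader(io.StringIO(text))]

def load_json_feed(text):
    # Map the JSON feed's field names onto the same schema.
    return [{"isin": r["security"], "quantity": int(r["qty"])}
            for r in json.loads(text)]

def reconcile(a, b):
    # Flag any position where the two systems disagree.
    qty_a = {row["isin"]: row["quantity"] for row in a}
    qty_b = {row["isin"]: row["quantity"] for row in b}
    return {isin: (qty_a.get(isin), qty_b.get(isin))
            for isin in qty_a.keys() | qty_b.keys()
            if qty_a.get(isin) != qty_b.get(isin)}

print(reconcile(load_csv_feed(CSV_FEED), load_json_feed(JSON_FEED)))
# {'US0378331005': (200, 250)}
```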

Boston said: “There is now not just a real need, but a real desire by firms to automate their reporting regimes, fuelled not just by the complexity and speed of change of regulations, but also by the increased interrogation from external auditors as well as regulators.”

She added: “Regulatory reporting submissions are only as good as the data they contain.”

Ongoing maintenance is required for companies to have full confidence in their regulatory submissions, linking processes, controls, roles and responsibilities and risks, and underpinning data with robust and automated regimes.

“This will give you confidence in your regulatory reporting and ongoing compliance,” Boston said.

In a panel in the data management stream of the conference, attendees heard that efficient data management in financial institutions is a cultural battle that is still being fought.

Data management providers need to ensure that “everyone in the organisation realises that it is something real”, one speaker said.

“They have to get beyond the idea that ‘this is IT’ and think of it more as information science,” he said. “This is the biggest cultural shift that we have ever seen.”

A big step toward achieving this would be “documenting their processes and putting them in a data framework”, and compliance should be a driving force behind this framework.

Such a framework should also consider differences in business lines and even how each department within an organisation operates. “There are some places that really show the benefit of standardisation, but it’s still a way off [from being the norm],” one speaker explained.

An afternoon discussion focused on the benefits of data warehousing for asset managers. Audience members were asked: “What is the biggest advantage you gain from a data warehouse?”

The largest share of respondents, 43 percent, answered that the main benefit is increased quality and access to a single version of the data.

Increased utilisation of data within the firm was named as the second-biggest benefit, chosen by 29 percent, while 18 percent noted that a warehouse can provide flexibility to meet new data challenges and 11 percent selected lineage and data governance.

A speaker in the session said that, with regard to data warehouses, a lot of words are often used to describe something that is “not that complicated”. He agreed with the audience poll, saying that the main goal is “one truth and one understanding of the data”.

If two areas of an organisation are working from the same piece of data, they don’t have to waste time checking that their versions match up, or reconciling mismatches when they don’t.

If there is a single source of data that both businesses can access, they can immediately start discussing what they are going to do with it. The speaker also noted challenges in the adoption of ‘data lakes’, or stores of raw data that are set aside until they are needed.
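As a rough illustration of that idea, not drawn from the session, the snippet below uses an in-memory SQLite table as a stand-in ‘golden source’: both hypothetical consumers read the same row through the same query, so there is nothing to reconcile before the discussion can move on to what to do with the figure.

```python
import sqlite3

# A throwaway SQLite database stands in for the warehouse's 'golden source':
# the risk team and the reporting team read the same row, so there is
# nothing to reconcile before they start working with the figure.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions (isin TEXT PRIMARY KEY, quantity INTEGER)")
conn.execute("INSERT INTO positions VALUES ('GB00B03MLX29', 1500)")
conn.commit()

def position(isin):
    # Every consumer goes through the same query against the same store.
    (qty,) = conn.execute(
        "SELECT quantity FROM positions WHERE isin = ?", (isin,)).fetchone()
    return qty

risk_view = position("GB00B03MLX29")       # used for exposure calculations
reporting_view = position("GB00B03MLX29")  # used for client reporting
assert risk_view == reporting_view == 1500  # one version of the data, by construction
```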

The main challenge is getting senior management on board, he said, and persuading them to spend money on data—something they perceive that they already have.

Another speaker noted yet another set of challenges that arise from trying to store everything in one place. Asset managers should draw a clear distinction between the middle- and back-office data and the front-office analytical data, which he called “less mission-critical”.

He added that data warehouses should be built to support the business in the future, with capabilities to encompass new products or programmes, and this requires some internal overheads.

While the data in the warehouse should be a ‘golden source’, the speaker noted that, in the case of a discrepancy, data must be fixed at its source, and not in the warehouse, otherwise the business will end up with inherent quality issues.
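A minimal, hypothetical sketch of why that matters: if the warehouse is rebuilt from its source system on each load, a correction applied only in the warehouse is silently undone at the next refresh, whereas a correction applied at the source survives it. The figures and dictionaries below are invented purely to illustrate the point.

```python
# The source system holds a wrong figure; it should be 1600.
source_system = {"GB00B03MLX29": 1500}

def load_warehouse(source):
    # The warehouse is simply a copy of the source at load time.
    return dict(source)

warehouse = load_warehouse(source_system)

# Patching the warehouse directly looks fixed...
warehouse["GB00B03MLX29"] = 1600
# ...until the next refresh silently reintroduces the bad value.
warehouse = load_warehouse(source_system)
assert warehouse["GB00B03MLX29"] == 1500

# Fixing the source keeps the warehouse correct on every subsequent load.
source_system["GB00B03MLX29"] = 1600
warehouse = load_warehouse(source_system)
assert warehouse["GB00B03MLX29"] == 1600
```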

He concluded that a data warehouse can provide operational efficiency and flexibility, offering the ability for asset managers to “report off of something that you would inherently want to”.
