
28 Oct 2020

From acorns into oaks

Data quality continues to mature, allowing the asset servicing industry to take full advantage of the opportunities it can offer

Data quality in the financial services industry is not yet fully mature; however, more and more people are waking up to its use cases.

Data’s ability to drive efficiencies and improve decision-making can help firms garner clearer insights into their business, and it comes as no surprise that data is integral to a financial institution’s success. It always has been.

Data began driving the asset servicing market around 30 years ago; the difference now is that technology can support it far more effectively.

Harmonate CEO Kevin Walkup suggests that the only real change is pace: the handling of data and information in financial services has accelerated massively with cloud-based software-as-a-service (SaaS), and now with machine learning.

In addition to data stream subscriptions and the insights that inform investment decisions, the information that moves around a firm is also valuable. However, experts say that if it is moved around manually, and is not put into context so it can be used faster, in a repeatable fashion, and repurposed, then the operation is moving too slowly. It is also important to note that there are two types of data: front-office data and back- and middle-office data. Walkup highlights that “the smartest firms are the ones that connect the two”.

Meanwhile, data warehouses and data lakes are becoming the norm. Alongside conversations about moving data to the cloud, the talk is increasingly turning to sharing data that already resides there.

“A lot of improvements in technology are making data models easier to train and they are producing results that we find can be used with a high degree of confidence,” says Frank Servidio, head of client reporting product, Citco.

Getting the model right

Creating a process that will correctly harness the data is crucial. Inefficient operating models can create a handful of problems such as duplicated data, repetitive processes and increased cost of operation.

“The more manual and convoluted, the higher the risk of operational risk and systemic failure,” affirms HSBC’s Duncan Cooper, director, head of data products strategy, markets and securities services digital and data.

“That’s not to say that everyone should use the same process, but that a firm should challenge itself and ask the question ‘What is the risk of doing this? What is the cost of working in this way?’ There is a cost for every operational and processing decision.”

According to Cooper, sometimes that impact may not be felt by the customer themselves, but by the asset servicer.

As a customer-focused industry, Cooper says “we are always aiming to please, but sometimes being a good partner to our customers is through not always saying ‘yes’, but more accurately asking ‘why?’. That honest discussion should lead us all to improve.”

Storms make the oaks grow deeper roots

The importance of data operations and transparent communications has been heightened further by the ongoing pandemic. Most of the industry continues to work from home, and the pressure has ramped up to ditch manual data processes such as Excel spreadsheets.

“Firms are trying to develop digital crutches in terms of virtual collaboration. And that’s valuable. But it treats a symptom, not the disease,” comments Walkup.

He explains: “You don’t want to just shore up how you have traditionally kept information in your company from straying too far off the mark. With cloud-based software tailored for private funds, and new machine learning approaches, you might as well just have more accurate data and automation.”

The rise of cloud and SaaS is becoming increasingly significant in the industry. The foundations of sound data quality, coupled with cloud infrastructure, put institutions in a good position to scale technologies such as artificial intelligence (AI) and machine learning.

Despite the demand for data spurred on by the storms of the pandemic, the challenges around it remain: if data is not trustworthy, then the decisions based on it will be inaccurate. Likewise, feeding ‘bad data’ into an AI model will render even the most sophisticated technology useless.
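
To make the ‘garbage in, garbage out’ point concrete, the check can be as simple as a gate that refuses to pass a file downstream until basic completeness, uniqueness and validity tests succeed. The following is a minimal sketch in Python; the column names, file name and thresholds are illustrative assumptions, not taken from any firm quoted in this article.

```python
# A minimal data-quality gate, run before a file is fed to any model.
# Columns (trade_id, trade_date, amount) and the 1% threshold are invented.
import pandas as pd

def quality_gate(df: pd.DataFrame, max_null_ratio: float = 0.01) -> list:
    failures = []

    # Completeness: reject a feed with too many missing values.
    for col in ("trade_id", "trade_date", "amount"):
        ratio = df[col].isna().mean()
        if ratio > max_null_ratio:
            failures.append(f"{col}: {ratio:.1%} missing")

    # Uniqueness: duplicated trade IDs usually mean a double-loaded file.
    dupes = int(df["trade_id"].duplicated().sum())
    if dupes:
        failures.append(f"{dupes} duplicated trade_id values")

    # Validity: dates must parse and amounts must be numeric.
    if pd.to_datetime(df["trade_date"].dropna(), errors="coerce").isna().any():
        failures.append("unparseable trade_date values")
    if pd.to_numeric(df["amount"].dropna(), errors="coerce").isna().any():
        failures.append("non-numeric amount values")

    return failures

df = pd.read_csv("daily_trades.csv")          # hypothetical incoming feed
problems = quality_gate(df)
if problems:
    # Stop here rather than let flawed data drive decisions downstream.
    raise ValueError("quality gate failed: " + "; ".join(problems))
```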

However, HSBC’s Cooper highlights that the industry is dealing with a pandemic that forces decisions to be made in the absence of data.

For example, a firm’s pre-crisis estimate of how many users would connect to a portal is probably quite different from the reality after the pandemic hit and everyone had to work remotely.

“The more ‘absent’ the data, then the more extrapolation is required and the greater the potential for error. There is always a tradeoff in data between the completeness and accuracy of data. The natural tension between those two viewpoints has moved because of the pandemic and the appetite for such a change in ratio has also adapted because of the pandemic,” comments Cooper.
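
Cooper’s point can be illustrated with a toy calculation. In the sketch below (Python, using invented portal-usage numbers), a simple linear trend is fitted on progressively shorter histories and extrapolated to a future week; the standard prediction error widens as the usable data shrinks.

```python
# Toy illustration: less history -> more extrapolation -> wider error band.
# All numbers are invented; this is not real portal-usage data.
import numpy as np

rng = np.random.default_rng(0)
weeks = np.arange(52, dtype=float)
usage = 200 + 3 * weeks + rng.normal(0, 15, size=52)   # simulated weekly users

def forecast(x, y, x0):
    """Linear extrapolation to x0 with the standard prediction error."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    s = np.sqrt(np.sum(resid**2) / (len(x) - 2))        # residual std dev
    se = s * np.sqrt(1 + 1 / len(x)
                     + (x0 - x.mean())**2 / np.sum((x - x.mean())**2))
    return slope * x0 + intercept, se

for history in (48, 24, 8):                             # weeks of usable data
    pred, se = forecast(weeks[:history], usage[:history], x0=60.0)
    print(f"{history:2d} weeks of history -> week-60 forecast "
          f"{pred:6.1f} users, +/- {1.96 * se:5.1f}")
```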

Innovation acceleration

In addition to underlining the importance of data operations and transparent communications, the pandemic is encouraging innovation. The financial services industry is slowly but surely lowering barriers to entry and increasing competition and innovation from within and outside the industry. However, it can be argued that there is still some way to go with innovation within asset servicing.

“It can be challenging due to option fatigue. There are also a lot of players in the industry moving into this space and it can become a bit overwhelming to constantly evaluate and re-evaluate solutions. We find that it is best to take actions sooner rather than later. We commit to the market as we go; as the market evolves, and as our customers’ needs evolve so do we,” says Servidio.

Another challenge for data innovation is measuring benefit when dealing with operational silos, which occur when departments or management groups do not share information, goals, tools, priorities and processes with other departments. This can negatively impact operations.

A lack of education and training around data is also a challenge. Experts have observed that, under pressure, people often find it easier to work harder and ask for more headcount than to face up to something they have less experience with.

Walkup notes: “There is an attitude, that’s not unjustified, among many in fund administration that folks in software have to convince them the software is worth it.”

While that may be fair, there are still quite a few fund administrators taking too long to get comfortable with data operations.

“The later you finally rip off the band-aid and learn how to integrate data operations, the more you have to play catch-up from behind. That distance becomes harder and harder to close,” he says.

Continuing to grow

The future of data will be about delivering faster across a more comprehensive universe. Cloud has been around for a while now, but the ability to store massive amounts of data at a lower price, enabling the in-depth analysis that supports AI and machine learning, will be pivotal.

Costs are a major challenge for the industry right now, but the cloud allows for the reduction of costs as well as more streamlined cost management. It also allows for virtually unlimited storage.

“We think there is a major opportunity in the barriers of usage when it comes to data science. There is a software opportunity in making data technology more accessible and easier to use,” comments Servidio.

But while data and technology continue to grow, the people and processes used to deliver the services will still be key.

“You can give great tools to people, however, if they don’t adapt the processes that they have in place for using those tools, you are unlikely to realise the full benefits,” Cooper says.

Walkup reminds us that going forward “ingestion of data is where the work needs to be done”.

Data has to be accurate in order to work and to be fed into AI models. Building a new process that ensures clean data comes in at the front end can be time-consuming and costly, but it is easier than going back into a flawed process and trying to clean it up after it is already deeply embedded in the organisation.

“Applying machine learning to the ingestion and contextualising of more and more kinds of data is where we’re seeing a lot of attention being devoted right now. And that’s not just true in fund services, that’s true across professional services industries that move around high volumes of statements and documents,” concludes Walkup.
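
As a closing illustration of the kind of ingestion work Walkup describes, the sketch below tags incoming document text by type before it enters a data pipeline. It uses a standard scikit-learn baseline (TF-IDF features plus logistic regression); the document types and training snippets are invented, and a production system would train on thousands of labelled examples rather than six.

```python
# Toy document-contextualisation step: tag an incoming statement by type.
# Labels and snippets are invented, shown only to make the idea concrete.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "capital account statement net asset value period ending",
    "capital account statement opening balance closing balance",
    "capital call notice payment due wire instructions",
    "capital call notice commitment drawdown amount due",
    "distribution notice proceeds allocated to limited partners",
    "distribution notice return of capital realised gains",
]
train_labels = [
    "capital_account", "capital_account",
    "capital_call", "capital_call",
    "distribution", "distribution",
]

# TF-IDF features plus a linear classifier: a common first baseline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(train_texts, train_labels)

# Every new document gets a type before it enters the downstream pipeline.
incoming = "notice of capital call amount due by wire transfer"
print(model.predict([incoming])[0])   # likely: capital_call
```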
