29 April 2020

Ducks in a row

In a world that relies heavily on technology, data and its capabilities are of paramount importance. In the financial services world, organised and correct data can empower decision-making, spur revenue growth, and make organisations faster and more agile.

Beyond the shiny bells and whistles, data is also needed for regulatory reporting, so many firms have turned to data service providers for help amid stringent and complex regulatory requirements.

The COVID-19 pandemic has underscored the importance of data operations and transparent communications in the financial services industry. This was recently highlighted by Kevin Walkup, president and COO of Harmonate, a data services firm, who said that because of the pandemic data is in “hot demand”, but that if it is poorly organised and shared inconsistently, “fear of the unknown is tipped too close to panic – the same thing can happen to asset managers”.

Shipshape

Data is so important because it supports day-to-day operations across the financial services landscape. Jamie Stevenson, global head of product management, data and analytics for RBC Investor & Treasury Services (RBC I&TS), explains that when all services and initiatives are data-driven, poor data management results in significant risk to the business and a competitive disadvantage.

Stevenson says: “The COVID-19 pandemic has created market volatility, placing the industry under stress resulting from an increase in transaction volumes and the subsequent rise in inquiries around trade status and book of records updates, all while working remotely. If the data isn’t correct and wisely managed, it makes operations even more challenging.”

Harmonate’s Walkup maintains that good data management increases speed and reduces errors, while better data management also delivers higher quality with quicker, easier deployment.

He highlights: “The best data management makes the process easy for more of your team to use, and not resort to Excel sheets to check their work. It doesn’t get bottled up with the only people in the firm you trust to make sure it’s absolutely right.”

Simon Moss, CEO of Symphony AyasdiAI, a company that deploys machine learning to help clients see new and valuable insights in data, adds: “A beautiful thing about good data management is that, if it’s used right, it shows you not just what you should do, but why, in an auditable, verifiable way, so everyone can get behind it and row together to safety and success. The data management function in financial institutions provides that and scales it. Paper, phone calls and long work hours won’t get you there.”

Meanwhile, from a front-office perspective, Tim Lind, managing director of DTCC data services, says data is of increasing importance to market participants as it has the power to improve trading, asset allocation, price discovery, client service, collateral management and risk management.

Choppy waters

While the importance of data cannot be disputed, the task of keeping it well organised and making it easier for people in an organisation to get data into their system can be difficult.

It may sound simple, but Walkup points out that it means using powerful machine learning and an incredibly easy user interface, so you can quickly show a machine what information it is ingesting and have the system put that data into a standardised form that can be rapidly accessed and analysed by anyone who needs it.
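As a rough illustration of the kind of standardisation Walkup describes, records arriving from differently labelled sources can be mapped onto one common schema before anyone queries them. This is only a sketch; all field names and mappings below are hypothetical, not Harmonate’s actual implementation.

```python
# Illustrative sketch: normalise records from differently labelled sources
# into one standard schema so anyone can query them consistently.
# All field names and mappings below are hypothetical.

FIELD_MAP = {
    "trade_dt": "trade_date",
    "TradeDate": "trade_date",
    "ctr_party": "counterparty",
    "Counterparty": "counterparty",
    "notional_amt": "notional",
    "Notional": "notional",
}

def standardise(record: dict) -> dict:
    """Rename known fields to the standard schema; keep unknown fields as-is."""
    return {FIELD_MAP.get(key, key): value for key, value in record.items()}

# The same trade data, labelled two different ways by two sources.
rows = [
    {"trade_dt": "2020-04-29", "ctr_party": "Bank A", "notional_amt": 1_000_000},
    {"TradeDate": "2020-04-29", "Counterparty": "Bank B", "Notional": 2_500_000},
]
standardised = [standardise(row) for row in rows]
# Every record now exposes the same keys: trade_date, counterparty, notional.
```

Once every record shares the same keys, downstream users no longer need to know which source a row came from, which is the point Walkup makes about not bottling data up with a few trusted specialists.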

The biggest challenge is making it easier than Excel spreadsheets, which have long been used for data collection. Walkup notes that if you have been using Excel for a large part of your career, it can be hard to accept that an alternative solution would be easier and produce better results. “That being said, there are a lot of people in organisations who aren’t the office Excel guru who want to have access to data without constantly having to go to the office Excel guru,” Walkup adds.

Also discussing the challenges, Moss reflects that moving past automation to intelligence is not easy, but once it’s achieved, it makes you faster and more agile.

“However, being able not only to react quickly but also to see what’s happening in front of you more clearly – that’s a level of excellence that only a small set of institutions have come to realise they can achieve,” Moss warns.

Further challenges around data stem from the global financial crisis and the regulatory reforms that have followed it; the demand for greater transparency in financial markets has focused on disclosure and trade reporting regimes.

According to DTCC’s Lind, this trend has led to the capture of larger volumes of data, but while there is no shortage of data, what the industry really needs is insights.

Lind comments: “Therefore, the challenge for institutions collectively is to harness the millions of transactions that flow through their infrastructures and create actionable information that will enhance decision-making at all levels.”

Another major challenge, Lind observes, is a lack of global standards, which has complicated reporting and supervision for counterparties, trade repositories and regulators.

“More than 100 data elements are typically reported to trade repositories for each over-the-counter transaction, but the same set of elements are not necessarily required from one jurisdiction to another, nor are the formats for dates, currencies and other variables consistent between them,” Lind says.

Applying a standardised approach to data terms and formats may make it possible for reported data to be aggregated across trade repositories and jurisdictions, Lind highlights, which in turn will allow regulators to construct a more comprehensive picture of systemic risk.
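The format inconsistencies Lind describes can be sketched in a few lines. The jurisdiction-specific date formats and currency labels below are hypothetical, but the idea is to map everything to one convention (such as ISO 8601 dates and ISO 4217 currency codes) before reported data is aggregated:

```python
from datetime import datetime

# Hypothetical date formats used in different jurisdictions' reports.
DATE_FORMATS = ["%Y-%m-%d", "%d/%m/%Y", "%m-%d-%Y"]

# Hypothetical local currency labels mapped to ISO 4217 codes.
CURRENCY_CODES = {"US$": "USD", "€": "EUR", "YEN": "JPY"}

def normalise_date(raw: str) -> str:
    """Try each known format and emit an ISO 8601 date string."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"unrecognised date format: {raw!r}")

def normalise_currency(raw: str) -> str:
    """Map a local currency label to its ISO 4217 code where known."""
    return CURRENCY_CODES.get(raw.strip(), raw.strip().upper())

# Reports from two jurisdictions become comparable after normalisation.
print(normalise_date("29/04/2020"))   # 2020-04-29
print(normalise_currency("US$"))      # USD
```

Only once every repository emits the same date and currency notation can regulators aggregate positions across borders, which is the systemic-risk picture Lind refers to.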

Something else that is challenging when it comes to data is the threat of a cyber-attack. Gone are the days when criminals needed guns and balaclavas to break into a bank; now they can attack from the comfort of their sofa with the click of a button.

Moss outlines that cyber-attack risks are a big problem, and frankly a driver of more intelligent IT systems. He says: “There is an arms race between fraudsters and malevolent organisations and the good guys. You can’t build a defensive system strong enough to keep out fraud or attacks. But you can get fast and smart enough to see how events are shifting and react accordingly.”

“It’s frankly the same story as data management generally. Your ability to move beyond predicting future events based on past events, and to achieve an understanding of how and where new events can spring up based on how things connect, determines your ability to fight the bad guys by winning a war of manoeuvre,” Moss affirms.

Staying afloat

When it comes to staying afloat amid these challenges, Thierry Zemb, product director at NeoXam, points to cloud-based technology.

Indeed, cloud-based technology is particularly useful as the volume of data continues to rise. It needs to be managed and kept up to speed for both data integration and distribution.

Zemb outlines: “There is no question that the adoption of more cloud-based technologies can enable firms to overcome these challenges as a hosted environment provides the elasticity to scale up and down quickly.”

Weighing in on this, RBC I&TS’ Stevenson says: “In order to keep up with tech innovation, there needs to be more data talent working with teams in the financial sector. It’s been a long time coming, but we are now seeing growth in terms of data talent within our clients’ organisations, creating a welcome influx in skills and competencies around manipulating and exploiting data.”

Stevenson also observes an increasing interest from asset servicers in data standards. He notes: “Capital markets have recognised that there is no competitive advantage in scrubbing and cleaning the data we exchange, and the growing level of interest around APIs will accelerate standardisation and transparency.”

To the cloud and beyond

Working to reach a state of nirvana, where companies have organised, accessible data and integrated structured reference and market data as well as unstructured data, will not be smooth sailing. NeoXam’s Zemb reminds us that one key reason data management matters to financial institutions right now is market data costs. “With so much pressure on businesses to make efficiency savings, it has never been more important to buy data that is actually going to be used,” Zemb stipulates.

Walkup affirms that data services are not evolving in a smooth, linear way. According to Walkup, the bull market and cheap credit allowed lots of different companies to seem healthy.

“That same market meant weak technologies limped along, and legacy ways of managing data and reporting were good enough. Crisis has a way of pressure testing organizations to see what works and what doesn’t work. Right now we’re seeing rapid, almost wartime-like evolution, or more to the point, what doesn’t work is becoming very apparent, fast,” Walkup adds.

Stevenson predicts a shift in data management from an operational focus to data engineering and data science, as evidenced by how businesses are using AI models to analyse data and to make decisions based on data insights.

“The industry is also likely to see a shift away from a contrived view of how data looks within a data warehouse to more extensible data models being used to manage data. A good example is how we organise our music. Most people no longer download CDs onto an iPod. Now, we use streaming services like Spotify, which provide more access than any individual needs, but when a Spotify user wants to listen to a particular song, it can be located, streamed, consumed, and then the user moves on. This type of microservice is representative of the evolution of data management,” Stevenson says.

Meanwhile, Moss concludes: “It all comes down to harnessing the efficiencies and speed made possible by automation, by adding a layer of intelligence that is not reliant on past events to see what’s in front of the organisation.”

“You’re building the muscles and sinews, and then adding the faster brain to make it all work.”

Black Knight Media