
28 May 2014

The risks in risk management

Risk management is now a part of the daily lexicon of the network manager, no matter how senior or junior. The move towards more comprehensive, integrated frameworks for risk management has to be welcomed, provided, of course, that the benefits outweigh the costs and that, once implemented, risk management frameworks are genuinely fit for purpose. The jury is probably out on both counts.

Anyone’s risk management set-up is only as good as the weakest link in the chain. The amount of investment required by a bank to ensure that network management is not the weakest link differs widely from bank to bank. Many that have not felt the need to invest in this area have been confident in their reliance on existing systems, generally developed in-house. ‘System’, in this context, is a safer word than ‘solution’.

The rise of risk management as a discipline is a game of building ‘a better system’, not the ‘best’ one; on occasions, it is a game of doing nothing at all (sometimes also described as ‘we’ve got it covered’). For many years some banks have prided themselves on having the best systems precisely because they were developed in-house. Many have regarded developing such systems in-house as a key competitive advantage, with in-house development, rather than the quality of the solution, being the key criterion.

Of course, many people might equally suggest that this approach ultimately builds in obsolescence precisely because there is no ‘best of breed’ collaboration or sharing of ideas. This is not the best approach where risk management is concerned, although diversity does have its benefits when talking about systemic risk.

In a similar article last year, I made the following statement: “A fundamental truth … is that information is the key determining factor in risk management, not the process itself. Poor information will always compromise the risk management function. As clean and comprehensive a stable of core data, organised in a logical, accessible fashion is the only starting point for a proper risk management exercise.” In any risk management decision, those responsible are ultimately relying on the quality and accuracy of underlying data. Gaps or errors in that data will always compromise the final decision.

A good example here is the monitoring of concentration risk. For some time now there has been a ‘push me, pull you’ effect in this area: the number of suitable, available sub-custodians might have fallen in some markets, heightening concentration risk, at precisely the time when having a shadow network in the same markets, to mitigate that concentration risk, has grown. On the one hand, the type of risk being monitored has changed—fewer entities to monitor, but in greater depth—and on the other, more entities to monitor in (arguably) just as much detail.

Overall, the ‘processing’ burden, the assimilation and assessment of all this information, has grown, and sometimes substantially. However, the tick box approach has been to layer in another process, without necessarily looking at the underlying quality of data, its organisation and its immediate availability.

Many teams have been asked to ‘bend’ existing technology from other departments or third-party suppliers to their needs. This is the focus on process: someone else has something you can use, and considerations of cost have, at times, overridden the risk management objective. Re-using an existing system or process that might well be ill-suited to this specific purpose somehow misses the point. Even if the process is sufficiently fit for purpose, it still relies on the quality and accuracy of the underlying information. For this you need a proper system, even if it is only a ‘feeder’ system into the wider risk management effort.

You only have to look at the eye-watering fines levelled at banks in recent years to understand just how misguided, and how truly expensive, the ‘cost first, consequence later’ approach really is.

Indeed, these legacy systems are often inherited from (and paid for by) other teams, because the economic imperative outweighs the sensible approach of implementing a purpose-built solution. A number of banks have embarked on vastly expensive and inefficient internal overhauls of legacy systems because of dominant (and defensive) IT departments. There is considerable risk in this approach, even ignoring the considerations of cost, timescales and shareholder value.

Some institutions are almost positioning themselves for increased risk at the very time they should be nimble about reducing exposure. However, some have embraced innovation and implemented off-the-shelf, cost-effective solutions, which have helped move them ahead of the crowd.

From a risk management point of view, understanding ‘context’ is the most sensible starting point for any platform sitting in the network management arena. If you do not have a truly comprehensive view of all your nostros, you will be hamstrung from the outset. If you do not understand who you are working with and to what level, this will always be a risk. Having different teams working with similar exposures will undermine any risk management effort by the wider group—there is always a chance that what is missed from an umbrella view is actually what ends up costing money.

If information is the key determining factor in the quality of any risk management effort, then information manifests itself in Management Information Systems. MIS is built on a bedrock of sound static data, which has been scrubbed at set-up and around which high standards of data cleanliness can be maintained. There are two connected processes here, each reliant upon the other. The first is creating a transparent, easily maintained database, the second is pitching quality, dynamic information into that database, often on a regular basis.

If good selection and good monitoring procedures can leverage this static context, then all that good work still needs a good home to go to, not just to tick the box this time around, but also to position that dynamic data for future access and re-use. Indeed, re-using old data as part of the risk management process is an important part of making judgements about changes in circumstances. Therefore, calling upon relevant archived data, preferably in the same system, makes a huge amount of sense for the risk manager.

I am not aware of Excel spreadsheets having an ‘alert’ capability. When that date in cell B23 comes to pass and I remain blissfully unaware of its importance and the expiry of a particular document or the passing of a review date, then spreadsheets can be deemed to have fallen at the first hurdle. Unless someone actually scrutinises the spreadsheet on a daily and perhaps even hourly basis, then any notion of automation can be dispensed with, as can the degree of infallibility attributed to this particular system. Herein lies the risk in risk management.
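The alert capability that spreadsheets lack is, in essence, a trivial piece of logic: compare each key date against today plus some warning horizon, and flag anything that has passed or is approaching. A minimal sketch of that check, in Python, with entirely hypothetical document names and dates standing in for the contents of those spreadsheet cells:

```python
from datetime import date, timedelta

# Hypothetical static data: document -> expiry or review date.
# In a spreadsheet these would sit silently in cells like B23,
# with nothing to flag them when they pass.
documents = {
    "Sub-custodian agreement": date(2014, 6, 15),
    "Annual due-diligence review": date(2014, 5, 20),
    "SLA renewal": date(2014, 9, 1),
}

def expiring(docs, today, horizon_days=30):
    """Return (name, due_date) pairs already past or due within the horizon."""
    horizon = today + timedelta(days=horizon_days)
    return sorted((name, due) for name, due in docs.items() if due <= horizon)

today = date(2014, 5, 28)
for name, due in expiring(documents, today):
    status = "EXPIRED" if due < today else "due soon"
    print(f"{status}: {name} ({due.isoformat()})")
```

The point is not that this logic is hard to write, but that it has to run automatically, every day, against clean and complete static data; embedded in a proper system, rather than a spreadsheet someone may or may not open, it is what turns a date in a cell into an actual alert.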

Black Knight Media