Risk managers are coming to grips with the challenges associated with aggregating risk data in response to regulatory and economic forces.
On the regulatory front, global systemically important banks, or G-SIBs, face a January 2016 deadline for complying with principles laid down by the Basel Committee on Banking Supervision on risk data aggregation and risk reporting.
“The biggest issue with risk data aggregation right now is a complete change in the environment with respect to the tractability of data,” said Sanjay Sharma, chief risk officer, global arbitrage & trading at RBC Capital Markets, during a webinar on Thursday. “Before the crisis, there was a considerable emphasis on VaR and other statistical measures that we use to formulate risk. Post-crisis, both from a regulatory perspective and internal risk management perspective, the emphasis is now more on stress testing, which is more deterministic.”
Under the new regime, banks will no longer be able to extrapolate from small data samples; instead, they will need to address Big Data issues of data quality.
“If you were running probabilistic analysis before, then if you had data quality issues, that could somehow be compensated for by more rigorous probabilistic analysis,” Sharma said. “But when you're doing deterministic analysis using one set of assumptions, like stress testing, then the quality of the data that you are using is paramount, because you are measuring against a threshold.”
When measuring against a threshold, the outcome is binary: either the institution is above the threshold or below it, and if it falls below, it needs to review the analysis to determine why the results are what they are.
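As a rough illustration of the distinction Sharma draws, the Python sketch below contrasts a VaR figure estimated from a sample of daily P&L with a single deterministic stress scenario checked against a capital threshold. All figures, the scenario loss, and the threshold are invented for illustration; this is not any bank's actual methodology.

```python
import numpy as np

# Hypothetical daily P&L sample for one portfolio (figures are illustrative only).
rng = np.random.default_rng(seed=42)
daily_pnl = rng.normal(loc=0.0, scale=1_000_000, size=250)  # 250 trading days

# Probabilistic view: a 99% one-day VaR estimated from the sample.
# Noise in individual data points is absorbed by the statistical estimate.
var_99 = -np.percentile(daily_pnl, 1)
print(f"99% 1-day VaR estimate: {var_99:,.0f}")

# Deterministic view: one prescribed stress scenario compared against a hard
# capital threshold. The outcome is binary, so any error in the input data
# flows straight into the pass/fail result.
stressed_loss = 45_000_000      # assumed loss under the scenario
capital_buffer = 50_000_000     # assumed available capital

if stressed_loss <= capital_buffer:
    print("Above threshold: scenario loss absorbed by the capital buffer")
else:
    print("Below threshold: review inputs and assumptions to explain the breach")
```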
Another challenge is that risk data is scattered across the institution. When doing an institution-wide stress test, all of this data has to be combined. “It is a very challenging process to combine all of the data into one receptacle or container where you can maintain the tractability of it,” said Sharma. “So this is what I would call a three-dimensional challenge, which is the aggregation, the tractability, and applicability. These are the challenges that, I believe, the industry is facing right now.”
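One minimal way to picture that “one receptacle” while keeping every figure traceable to its origin is to tag each record with its source system as the feeds are combined. The sketch below is purely illustrative: the system names, record layout, and aggregate function are assumptions, not any institution's actual architecture.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class RiskRecord:
    source_system: str   # lineage tag: which system the record came from
    trade_id: str
    exposure: float

def aggregate(feeds: Dict[str, List[Tuple[str, float]]]) -> List[RiskRecord]:
    """Combine per-system risk feeds into one container, keeping a
    source_system tag on every record so each figure can be traced back."""
    combined = []
    for system_name, rows in feeds.items():
        for trade_id, exposure in rows:
            combined.append(RiskRecord(system_name, trade_id, exposure))
    return combined

# Hypothetical feeds from three siloed systems.
feeds = {
    "credit_risk_engine": [("T1001", 2.5e6), ("T1002", 0.8e6)],
    "market_risk_engine": [("T1001", 1.1e6)],
    "treasury_ledger":    [("T2001", 4.0e6)],
}

for record in aggregate(feeds):
    print(record)
```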
To facilitate consistent and effective implementation of the principles among G-SIBs, the Basel Committee decided to use a coordinated approach for national supervisors to monitor and assess banks' progress. The first step of this coordinated approach was to issue a "stocktaking" self-assessment survey completed by G-SIBs, other large banks and supervisors during 2013.
“The Basel Committee on Banking Supervision has tasked us to come back to them on what our strategy is to meet the principles that they decide around risk aggregation by January 2016,” said Simon Feddo, head of change for legal entity, client and account at UBS, during the webinar. “We're doing a lot of work internally auditing our current risk data aggregation capabilities, and also taking a very close look at our risk reporting practices.”
Feddo added, “Today things aren't as consolidated across our control functions as we would like, both from an operational and a technology perspective. When we look at our architecture and the plumbing that we have put in place internally, we see a number of breakpoints where we need to be able to get this holistic view across our risk data.”
Issues associated with risk data aggregation center on accuracy, comprehensiveness, completeness, and integrity, said Feddo.
UBS has “risk aggregation hubs” where trade execution data is matched with data that it holds for its inter-trading agreements.
“What we've found is, the hubs and all the different touch points where this data flows through end up with us making pessimistic matching decisions, due to the lack of ability to match data across the control functions, across risk, finance, and treasury,” said Feddo. “When we look at that supply chain and how risk aggregation data flows between the functions, this is where we start to see areas for improvement.”
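A hypothetical sketch of what a “pessimistic” matching rule can look like in code: match a trade to agreement data only when a shared key identifies exactly one counterpart, and route everything else to manual review. The field names and matching key below are invented for illustration and are not UBS's actual matching logic.

```python
def match_conservatively(trade_records, agreement_records):
    """Match trade executions to agreement data only on an exact shared key.

    Anything that cannot be matched unambiguously is set aside for review
    rather than force-matched -- a conservative choice that keeps the
    aggregate safe but leaves more manual work downstream.
    """
    agreements_by_key = {}
    for rec in agreement_records:
        agreements_by_key.setdefault(rec["counterparty_id"], []).append(rec)

    matched, unmatched = [], []
    for trade in trade_records:
        candidates = agreements_by_key.get(trade["counterparty_id"], [])
        if len(candidates) == 1:
            matched.append((trade, candidates[0]))
        else:
            # Zero or multiple candidates: do not guess.
            unmatched.append(trade)
    return matched, unmatched

trades = [
    {"trade_id": "T1", "counterparty_id": "CP-001"},
    {"trade_id": "T2", "counterparty_id": "CP-XYZ"},   # no agreement on file
]
agreements = [{"counterparty_id": "CP-001", "netting_set": "NS-9"}]

matched, unmatched = match_conservatively(trades, agreements)
print(len(matched), "matched;", len(unmatched), "sent for manual review")
```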