
Filtering Out The Noise

Written by Terry Flanagan | May 18, 2012 7:50:50 AM

A cacophony of information impedes regulatory risk management.

Amid a heightened level of regulatory scrutiny, capital markets firms are trying to filter out the noise to get a more precise view of vital information needed for assessing performance, risk and compliance.

“Regulatory risk is pressuring firms to refocus on their data strategies and their ability to rapidly analyze data, collect information and accurately respond to demands for reporting and transparency,” said Jeremy Skaling, head of product management at Eagle Investment Systems, a subsidiary of BNY Mellon that provides data management, investment accounting and performance measurement systems.

A common thread running throughout all discussions of regulatory risk is consolidation.

“As a risk issue, we constantly hear about single source for the derivation of risk data and a firm-wide view of collateral,” said J. Michael Hopkins, president of securities processing (U.S.) fixed income and risk, at Broadridge Financial Solutions, a provider of investor communications.

“Organizational data repositories are a hot topic of discussion in terms of responding to regulatory/compliance requests for data,” said Hopkins.

Regulatory risk requires firms to collect and manage far more data than in the past, and to keep that data readily accessible to meet reporting demands.

“Firms must also be able to meet client demands to understand what risks were taken to generate the performance,” said Skaling. “This is also leading to increases in the types of calculations and enrichment required, as well as in the volume of reporting.”

For example, the adoption of Form PF, which requires private fund advisors to file disclosure data on holdings and risk exposures within their funds, has led multiple new data aggregation providers to offer services that help firms manage and report the data.

“Similar kinds of challenges will be faced for various other regulatory initiatives, such as reporting on Form CPO-PQR for commodity pool operators and foreign account withholding under the Foreign Account Tax Compliance Act [Fatca],” said Bruce Treff, managing director and U.S. product head of mutual fund servicing at bulge bracket bank Citi.

“These data management challenges and solutions will ultimately come at a price to industry participants and investors,” said Treff.

As a result, firms are looking more closely than ever at enterprise data management tools to manage data lineage: where data originates, where it is stored and who consumes it.

Due to these demands, firms require the ability to rapidly integrate disparate systems, enrich data with additional sources, create central reporting repositories and ensure that data is accurate and readily available.
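As a rough illustration of what such lineage tracking involves, the sketch below (a hypothetical schema with invented field names, not any particular vendor's product) records where a data item originated, where its golden copy is stored and which downstream reports consume it, so that a regulatory request becomes a lookup rather than a search.

```python
# Minimal sketch (hypothetical schema) of the kind of lineage record an
# enterprise data management tool might keep: where a data item originated,
# where it is stored, and who consumes it downstream.
from dataclasses import dataclass, field
from typing import List


@dataclass
class LineageRecord:
    item: str                      # e.g. a security's closing price
    source: str                    # originating system or vendor feed
    stored_in: str                 # repository holding the golden copy
    consumers: List[str] = field(default_factory=list)  # downstream reports


records = [
    LineageRecord(
        item="XYZ closing price 2012-05-17",
        source="VendorFeedA",
        stored_in="central_reference_db",
        consumers=["risk_report", "Form PF filing", "client_performance_pack"],
    ),
]

# Answering "what feeds the Form PF filing?" becomes a simple query.
for r in records:
    if "Form PF filing" in r.consumers:
        print(r.item, "sourced from", r.source, "stored in", r.stored_in)
```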

“Firms are actively changing their data management processes to ensure two primary future needs are met with respect to regulatory compliance,” said Chris Grandi, managing director of Abacus Group, which provides IT services to hedge funds and private equity firms.

“First, firms are moving their data to cloud platforms to ensure multiple layers of redundancy,” Grandi said. “Second, firms are deploying processes to retain all data and all changes to files such that any regulatory organization can quickly view this data if required.”

Information Overload
While information overload poses a challenge for all industries, it’s particularly acute for the financial sector because of regulatory demands.

“Information is a pillar of the financial services business; it is now globalized, electronic, high-frequency and inter-related across instruments, markets and institutions,” said Phil Lynch, chief executive of Asset Control, a provider of financial data management solutions and services. “As such, it is no longer able to be effectively managed by humans using terminals, limited sources of information and spreadsheets.”

Reference data management, for example, spans multiple subsets, each with its own requirements: product data, client data, pricing data and taxonomy data.

“Taxonomy data is critical to identifying counterparty exposure,” said Alberto Corvo, managing principal, financial services at outsourcing firm eClerx. “Firms need to manage exposures with entities and sub-entities and, if you don’t have the taxonomy right, it creates a number of problems.”
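To make Corvo's point concrete, the sketch below (with hypothetical entity names and amounts) shows how a simple entity/sub-entity taxonomy lets a firm roll trade-level exposures up to an ultimate parent; if the taxonomy is wrong or missing, the same trades get counted against what look like unrelated counterparties.

```python
# Minimal sketch (hypothetical names and figures) of rolling up counterparty
# exposure through an entity/sub-entity taxonomy.
from collections import defaultdict

# Hypothetical taxonomy: each sub-entity maps to its parent legal entity.
PARENT = {
    "BigBank Securities LLC": "BigBank Holdings",
    "BigBank Asia Ltd": "BigBank Holdings",
    "OtherCo Funding": "OtherCo Group",
}

# Hypothetical trade-level exposures keyed by the sub-entity actually faced.
exposures = [
    ("BigBank Securities LLC", 25_000_000),
    ("BigBank Asia Ltd", 10_000_000),
    ("OtherCo Funding", 5_000_000),
]


def ultimate_parent(entity: str) -> str:
    """Walk up the taxonomy until an entity has no recorded parent."""
    while entity in PARENT:
        entity = PARENT[entity]
    return entity


def rollup(trades):
    """Aggregate exposure at the ultimate-parent level."""
    totals = defaultdict(float)
    for entity, amount in trades:
        totals[ultimate_parent(entity)] += amount
    return dict(totals)


print(rollup(exposures))
# {'BigBank Holdings': 35000000.0, 'OtherCo Group': 5000000.0}
```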

Critical functions like reference data management require outside expertise.

Outsourced service providers have emerged to provide buy- and sell-side firms with high-quality data management services, freeing them to do the planning and oversight required to build robust data management functions.

“Reference data is one of the most active remediation programs in capital markets today,” said Fred Cohen, group vice-president and global head of the capital markets and investment banking practice at technology and outsourcing firm iGate.

Quality Control
Poor quality of reference data continues to create major problems for financial institutions, according to a survey of reference data professionals conducted by iGate.

Among key findings, the 2012-2013 Reference Data Management Industry Survey revealed that multiple data silos still exist in 75 per cent of firms surveyed.

“We have clients that operate in a siloed reference data environment,” said Cohen. “The risk is that you are pricing and integrating reference data differently across silos and, when you try to aggregate holdings, the pricings and components of these securities are different.”
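A stylized example (hypothetical figures) shows the effect Cohen describes: when two silos hold the same bond but price it from different reference sources, the firm-wide aggregate depends on which silo's data "wins" and never reconciles to a single number.

```python
# Minimal sketch (hypothetical figures) of the silo problem: two desks hold
# the same bond but price it from different reference data sources.
silo_a = {"security": "XYZ 5% 2020", "quantity": 1_000_000, "price": 101.25}
silo_b = {"security": "XYZ 5% 2020", "quantity": 500_000, "price": 100.90}


def market_value(position):
    # Bond prices are quoted per 100 of face value.
    return position["quantity"] * position["price"] / 100


# Summing the silos' own valuations gives one number...
total = market_value(silo_a) + market_value(silo_b)
print(total)                                 # 1517000.0

# ...but revaluing the combined holding at either silo's price gives another.
combined_qty = silo_a["quantity"] + silo_b["quantity"]
print(combined_qty * silo_a["price"] / 100)  # 1518750.0
print(combined_qty * silo_b["price"] / 100)  # 1513500.0
```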

Despite recommended practices of centralizing reference data operations, 31 per cent of the firms surveyed still manage data locally. Home-grown reference data solutions predominate, putting institutions at risk of failing to meet regulatory requirements.

The survey drew responses from 107 reference data professionals across the U.S., Europe, the Middle East and Africa, and Asia-Pacific, representing Tier 1, Tier 2 and Tier 3 organizations (by revenue) and spanning buy-side firms, sell-side firms and universal (retail and investment) banks, covering every major segment of the business.

“The industry survey shows that financial services firms are still facing major issues maintaining quality and consistency in their reference data management processes,” said Cohen. “Firms still seem to be at a lower maturity level of data GRC [governance, risk and compliance] requirements and most are pre-occupied at the lower levels of trying to improve data quality and manage silos.”

Risk management has risen in importance since iGate's last reference data management survey in 2010, reflecting the impact that regulatory directives may have on customer operations. Respondents cited cost savings, system automation and speed of new product introduction as lower priorities.