
Scaling the Regulatory Heights

New regulations require firms to sift enormous quantities of data.

As new regulations such as Dodd-Frank, Basel III and Solvency II are implemented, demands are being placed on financial firms to track the source of data, how it has changed over time and who has changed it.

The ability to pull data in near real-time from a wide variety of sources to calculate exposure reports and feed risk algorithms is especially important as the move toward centrally cleared OTC markets continues.
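
As a rough illustration of what such an aggregation step involves, the sketch below nets signed notionals per counterparty across several stubbed-in feeds. The source names, field layout and netting logic are illustrative assumptions, not any firm's actual implementation.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Position:
    counterparty: str   # hypothetical fields chosen for illustration
    notional: float     # signed notional; positive = long exposure
    source: str         # originating system, e.g. an OTC desk or cleared feed

def net_exposure(positions: list[Position]) -> dict[str, float]:
    """Net signed notionals per counterparty across all source systems."""
    exposure: dict[str, float] = defaultdict(float)
    for p in positions:
        exposure[p.counterparty] += p.notional
    return dict(exposure)

# In production these would arrive as near real-time feeds; here they
# are stubbed in memory to keep the sketch self-contained.
feeds = [
    Position("Dealer A", 5_000_000.0, "otc_desk"),
    Position("Dealer A", -2_000_000.0, "cleared_feed"),
    Position("Dealer B", 1_500_000.0, "otc_desk"),
]
print(net_exposure(feeds))  # {'Dealer A': 3000000.0, 'Dealer B': 1500000.0}
```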

Regulatory changes are forcing firms to source and report increasingly large amounts of trade data, as well as to adopt higher quality risk and pricing models.

“The focus of regulatory agencies on collecting and analyzing systemic risk, as well as SEC [Securities and Exchange Commission] examination priorities on alternative product managers, especially as it relates to valuation and pricing, are challenging investment firms in new and multiple directions regarding how they collect, store and manage data,” said Bruce Treff, managing director and U.S. product head of mutual fund servicing at bulge bracket bank Citi.

The Dodd-Frank Act contains major reforms to the derivatives market, including requiring that standardized, or vanilla, OTC swaps be executed on a swap execution facility or exchange, and be cleared through a central counterparty clearing house.

Other jurisdictions, such as Europe with the updated Markets in Financial Instruments Directive (MiFID II), Markets in Financial Instruments Regulation (MiFIR) and European Market Infrastructure Regulation (Emir), have or plan to establish similar requirements.

“The reality is that while most firms are struggling to manage their information management and record retention in a way that satisfies existing privacy and regulatory requirements, many do have systems and infrastructure in place to achieve that aim,” said Richard Baker, chief executive of commodities bourse Cleartrade Exchange.

“Accordingly, any regulatory mandated changes will only serve to add to their existing burdens.”

How a firm copes with additional regulatory demands will depend on the nature of its business.

“For example, OTC derivative dealers’ method of compliance may be to enforce daily tracking and monitoring requirements plus data collection and timely reporting of all transactions, including those exempted from mandatory clearing requirements,” Baker said.

A growing number of service providers offer trade repository systems to help firms meet near real-time post-trade reporting requirements.
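
At its simplest, a post-trade report is a structured record submitted shortly after execution. The sketch below assembles one such record as JSON; the field names are simplified stand-ins, since real submissions follow jurisdiction-specific schemas with identifiers such as UTIs and LEIs.

```python
import json
from datetime import datetime, timezone

def build_trade_report(trade_id: str, asset_class: str,
                       notional: float, currency: str, cleared: bool) -> str:
    """Assemble a minimal post-trade report payload as JSON.

    Field names here are illustrative, not an actual repository schema.
    """
    report = {
        "trade_id": trade_id,
        "asset_class": asset_class,
        "notional": notional,
        "currency": currency,
        "cleared": cleared,
        # Stamp the report with the UTC submission time.
        "reported_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(report)

print(build_trade_report("T-0001", "interest_rate_swap", 10_000_000.0, "USD", True))
```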

Filling in the Gaps
Regardless of the type of business a firm is in, a gap analysis is needed to identify the processes that must be put in place to stay compliant with any additional regulation.

“As reporting requirements increase, there has been a definite demand on data that would not have been aggregated previously, so there is a lot of work going on to make sure that they have the right data and that it is accurate,” said John Schneider, lead partner for accounting firm KPMG’s Investment Management regulatory practice.

It is also essential that data be accessible and complete.

“Firms are also realizing that this data is valuable for other purposes as well, including providing better reporting to clients and regulatory reporting,” said Schneider.

Asset managers are grappling with how to automate data management processes to ensure that data has been properly vetted before it’s disseminated to the public.
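
One building block of such automation is a rule-based vetting pass that holds a record back when any check fails. The checks below are invented for illustration and are far from a complete rule set.

```python
def vet_record(record: dict) -> list[str]:
    """Return a list of data-quality failures; an empty list means publishable.

    These checks are illustrative pre-publication rules, not an
    authoritative or complete set.
    """
    failures = []
    if not record.get("fund_name"):
        failures.append("missing fund_name")
    nav = record.get("nav")
    if nav is None or nav <= 0:
        failures.append("NAV missing or non-positive")
    if not record.get("as_of_date"):
        failures.append("missing as_of_date")
    return failures

record = {"fund_name": "Example Fund", "nav": 101.37, "as_of_date": "2013-06-28"}
problems = vet_record(record)
if problems:
    print("Hold for review:", problems)   # route to a human before release
else:
    print("Cleared for publication")
```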

“Since the financial crisis, firms have re-evaluated their current operational and technical capabilities, and the forthcoming regulations have pushed that evaluation even further,” said Mark Coriaty, managing director at software provider Ledgex Systems.

Data workflow and data management, meanwhile, are at the heart of regulatory risk governance.

“We are seeing many firms rely on specialized third-party integrators to ensure their data management strategy is in line with the impending regulations, and with the advent of private cloud services, firms note that these private services not only meet their needs but they are also cost-effective,” said Coriaty.

Preparing product data for distribution to the public domain has traditionally involved lots of manual processes.

Data comes from a variety of sources both inside and outside the organization, and it needs to be aggregated, normalized and validated so that it can be placed in the public domain via fact sheets, websites, client reports, presentations and RFPs.
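
Normalization typically means mapping each source's field names and units onto a single canonical layout, as in the sketch below. Both input layouts here are hypothetical, and each real feed would need its own mapping, maintained as sources change.

```python
def normalize(source: str, raw: dict) -> dict:
    """Map a source-specific record onto one canonical layout."""
    if source == "vendor_feed":        # hypothetical feed: return already in percent
        return {"fund": raw["FundName"], "return_pct": raw["ret_pct"]}
    if source == "internal_db":        # hypothetical feed: return as a decimal
        return {"fund": raw["fund"], "return_pct": raw["period_return"] * 100}
    raise ValueError(f"unknown source: {source}")

records = [
    normalize("vendor_feed", {"FundName": "Example Fund", "ret_pct": 2.1}),
    normalize("internal_db", {"fund": "Example Fund", "period_return": 0.021}),
]
# Once normalized, the records can be validated against each other before
# flowing into fact sheets, websites, client reports, presentations or RFPs.
print(records)
```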

Ensuring Quality Data
Too often, a formal data quality management process and a supportive technology framework are not employed to prepare product data for publication, and as a result inconsistent data can appear in different places.

With increasing regulatory pressures on the radar of board members, forward-looking companies are using this opportunity to exert top-down pressure for change.

“Home-grown systems are being abandoned for custom software developed by companies specializing in data aggregation and workflow management,” said John Bosley, chief financial officer at software firm Bonaire Software Solutions. “Bonaire continues to challenge leaders by providing the software tools necessary to reform antiquated data management and business practices.”

Management of data in large institutions is a constant struggle. What follows is a short list of the challenges facing chief technology officers today, according to Bosley.

1) Some legacy systems are grandfathered into practice, while others must be upgraded or sunset, whether because of license terms, compatibility requirements or external pressure.

2) Operations are spread across multiple geographies, often with varied regulatory and compliance standards.

3) Homogenization between front, middle and back offices requires massive coordination and effort.

4) Data is stored in vastly different formats. Cross-platform incompatibilities are a persistent barrier against the elusive ‘golden source’ (one common approach is sketched below).

“These challenges result in a level of data distress that reduces efficiency, complicates audits and cripples business intelligence at the organizational level,” said Bosley.
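
One common pattern for working toward a golden source is an attribute-level precedence rule that elects a single value when systems disagree. The ranking below is an assumption for illustration; real precedence rules vary by attribute and by firm.

```python
# Candidate values for one attribute, keyed by the system that supplied them.
candidates = {
    "back_office": "ACME CORP",
    "front_office": "Acme Corporation",
    "vendor": "ACME Corp.",
}

# Hypothetical precedence: trust the back office first, then vendor data,
# then the front office.
PRECEDENCE = ["back_office", "vendor", "front_office"]

def golden_value(values: dict[str, str]) -> str:
    """Pick the value from the highest-precedence system that supplied one."""
    for system in PRECEDENCE:
        if values.get(system):
            return values[system]
    raise LookupError("no system supplied a value")

print(golden_value(candidates))  # ACME CORP
```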
