
The Value of Time: Solving the Cost of Compliance

Today, the cost of compliance is perhaps the number one issue in the boardrooms of banks across the world. Banks and brokers have invested time, energy and money trying to solve the issue with varying degrees of success. Mark Sykes, Group Market Strategist at Kx Systems, examines how firms are rethinking data, technology and the underlying concept of time to overcome the hurdles.

The challenge is that regulations and the need for more insight and control all require far greater integration of data. Many have deployed solutions to help do this cost-effectively – and terms such as ‘enterprise-wide view’ have understandably become popular. However, trying to solve the data riddle purely through the lens of architecture is missing a vital point: the unifying factor across all data is a dependency on time. The ability to capture and factor in time when ordering events could truly be the key to unlocking the real cost efficiencies.

The three types of time
All data within a financial institution falls into one of three categories: historical, recent or real-time – and firms will need to access each for different reasons.

The incoming BCBS 239 regulation is a good example of where firms – in this case, global systemically important banks (G-SIBs) – need to address governance of historical data. It includes principles around clarity, completeness, quality and others – all of which need to be in place by January 2016.

Customer reporting requires firms to get a handle on recent data from earlier in the day, while various surveillance activities demand a combination of recent and real-time insight. This allows firms to draw conclusions by comparing events as they happen against past scenarios. Basel III, implemented in Europe through CRD IV, also calls for recent information while, increasingly, risk metrics such as VaR rely on rapid interpretation of real-time data.
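To make that last point concrete, the sketch below estimates a rolling one-day historical VaR from a stream of P&L observations. It is written in Python with pandas purely for readability; the simulated data, window length and confidence level are arbitrary assumptions for illustration, not figures prescribed by any of the regulations discussed here.

```python
# Illustrative only: a rolling historical VaR estimate over a stream of
# daily P&L observations. All inputs are simulated example data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)
pnl = pd.Series(
    rng.normal(0, 1_000, size=500),                      # simulated daily P&L
    index=pd.bdate_range("2015-01-01", periods=500),
)

confidence = 0.99
window = 250  # roughly one trading year of observations

# 99% one-day historical VaR: the loss exceeded on only 1% of days in the window
var_99 = -pnl.rolling(window).quantile(1 - confidence)

print(var_99.dropna().tail())
```

The same calculation becomes far more demanding when the window must update continuously as new ticks arrive, which is where the real-time requirement bites.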

Creating a temporal view
Whether it’s market data, corporate actions, specific trades, client orders, chat logs, emails, SMS or the P&L, each piece of data exists and changes in real-time, earlier today or further in the past. However, unless it is all linked together in a way that firms can analyse, there is no way of providing a meaningful overview of the business at any point in time.

Mark Sykes, Kx Systems

One option is a flexible temporal approach, which allows firms to knit together information at precise points on a metaphorical ball of string. That means they can rewind events and see what the state of the business was at a particular moment in history, including the impact of any actions leading up to it.
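As a minimal sketch of the ‘rewind’ idea, assume a hypothetical append-only event log with a timestamp on every change: the state of the business at any instant is simply the last event per key at or before that instant. The table and column names below are invented for illustration and do not reflect any particular vendor’s schema.

```python
# Reconstructing historical state from a timestamped event log (illustrative).
import pandas as pd

events = pd.DataFrame(
    {
        "time": pd.to_datetime(
            ["2015-06-01 09:00", "2015-06-01 10:30",
             "2015-06-01 11:00", "2015-06-01 14:45"]
        ),
        "account": ["A1", "A1", "B7", "A1"],
        "position": [100, 150, -40, 90],
    }
)

def state_as_of(log: pd.DataFrame, instant: str) -> pd.DataFrame:
    """Latest position per account at a given moment: the 'rewind'."""
    past = log[log["time"] <= pd.Timestamp(instant)]
    return past.sort_values("time").groupby("account").last()

# What did the book look like just before midday?
print(state_as_of(events, "2015-06-01 12:00"))
```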

Some firms have struggled to introduce this concept due to the limitations of their technology. There are architectures that simply don’t allow for time-based analysis – and it’s a common issue that can result from opting for a seemingly safe choice.

Even where firms have succeeded in layering a temporal view on their data, it is only part of a longer journey. Simply storing the time at which each business event occurred is not enough; firms need to fundamentally understand time. That means interconnecting all the information within the time dimension in a meaningful way. Looking at the world like this also calls for speed, so users can cut through the complexity without sacrificing performance.
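One common way of interconnecting data in the time dimension is an ‘as-of’ join, which attaches to each event the most recent related observation known at that moment, for example pairing each trade with the prevailing quote. The sketch below uses pandas.merge_asof purely as an illustration; the symbols, prices and timestamps are made up.

```python
# Illustrative 'as-of' join: enrich each trade with the quote in force
# at the moment the trade happened.
import pandas as pd

quotes = pd.DataFrame(
    {
        "time": pd.to_datetime(["2015-06-01 09:00:00.100",
                                "2015-06-01 09:00:00.300",
                                "2015-06-01 09:00:00.900"]),
        "sym": ["VOD.L"] * 3,
        "bid": [221.10, 221.15, 221.05],
        "ask": [221.20, 221.25, 221.15],
    }
)
trades = pd.DataFrame(
    {
        "time": pd.to_datetime(["2015-06-01 09:00:00.350",
                                "2015-06-01 09:00:01.000"]),
        "sym": ["VOD.L"] * 2,
        "price": [221.22, 221.08],
        "size": [5_000, 12_000],
    }
)

# For each trade, pick the most recent quote at or before the trade's timestamp
enriched = pd.merge_asof(trades.sort_values("time"),
                         quotes.sort_values("time"),
                         on="time", by="sym")
print(enriched)
```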

Square pegs and round holes
Legacy technology, again, is a hurdle. In many cases, the three different collections of data will live on three different platforms – such as a traditional database for historical data, an in-memory store for recent data and event-processing software for real-time.

This type of technology stack can offer a wide variety of options. However, it is rare for constituent applications to span more than one of the three domains effectively – and even rarer for them to do so with sufficient performance.

As a result, users can make connections within their historical or in-memory databases, but not across the whole system. The unfortunate reality is that programmers might have to write very similar code two or three times in different systems and then aggregate the results back together before they can be used. This is time-consuming, error-prone and makes the system very difficult to maintain in the long term.

Ultimately, if the setup is not designed for time-series analysis at its heart, it becomes an expensive, complex and slow solution – and users will find themselves trying to fit square pegs into round holes.

A unified approach
The question then becomes how to write the code once so users can deploy a single query and see results across all of their data sets. The answer is one unifying piece of architecture that spans all time boundaries – and this really is the key to putting theory into practice.
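A rough sketch of what ‘write the code once’ might look like in practice: a single analytic, here a volume-weighted average price, applied unchanged to the historical store, the intraday table, or both stitched together. This is a conceptual pandas example under assumed table layouts, not a depiction of any specific product’s architecture.

```python
# Illustrative only: one analytic function, one code path, spanning time domains.
import pandas as pd

def vwap(trades: pd.DataFrame) -> pd.Series:
    """Volume-weighted average price per symbol - written once, reused everywhere."""
    notional = (trades["price"] * trades["size"]).groupby(trades["sym"]).sum()
    volume = trades["size"].groupby(trades["sym"]).sum()
    return notional / volume

# Hypothetical stand-ins for the stores; in practice these would come from a
# historical database, an intraday cache and a live feed.
historical = pd.DataFrame({"sym": ["VOD.L", "VOD.L"],
                           "price": [220.00, 221.00],
                           "size": [1_000, 2_000]})
intraday = pd.DataFrame({"sym": ["VOD.L"],
                         "price": [221.50],
                         "size": [500]})

# The same query runs over either store, or over both combined
print(vwap(pd.concat([historical, intraday], ignore_index=True)))
```

The point of the sketch is that the analytic never has to know which time domain its input came from; that knowledge sits in the unifying layer, not in every query.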

Many firms will recognise that all their data is in some way time-dependent. By rethinking and unifying the underlying technology, they can move beyond simply storing data or even analysing part of it, to truly understanding all of it and making correlations across the business.

The cost benefit is huge – and that’s crucial. In order for regulations to deliver on transparency and stability, firms have to be able to comply in the first place. Often, this ends up as a balance sheet issue. Time really does mean money – and firms that embed it into their technology in the right way stand to make the greatest savings.
