
Exchanges Need to Justify High Data Fees

Written by Terry Flanagan | Nov 6, 2019 1:22:03 PM

Exchanges can justify high fees – but only if they are adding value!

By Neill Vanlint, Head of Global Sales & Client Operations, GoldenSource


Across global financial markets, it is hard to think of a more contentious issue right now than the prices exchanges are charging for their trading data. Some high-profile potential acquisitions, including the London Stock Exchange’s $27bn bid to buy Refinitiv, mean that there could soon be more market data monopolies than ever before. Let’s face it, regardless of the provider, market data has never exactly been cheap, but prices are getting higher for traders. And with just a few firms currently dominating, there is no question that banks need to ensure they are getting more for their money.

Prior to the financial crisis, lower capital requirements meant trade volumes were booming and the markets were awash with cheap data. This is a far cry from today, with more stringent capital rules putting a real squeeze on risk taking. As a consequence, trade volumes are down, and a reduction in income from trading activity has forced exchanges to develop additional revenue streams.

At the same time, regulation, among other pressures, is requiring firms to consume broader data sets more frequently.

From co-location servers for high-speed traders to surveillance software designed to stamp out market abuse, exchanges now go far beyond being a place for companies to raise money by listing securities. However, these new services do not change the fact that a continued tightening of belts means financial institutions will not embrace paying excessive data fees unless the exchanges can demonstrate they are adding value.

Despite all the developments in the sophistication of these data sets and of the technology used to publish them, the information distributed by exchanges is never immediately fit for use. It still needs to be checked, repaired, validated and organised before it can contribute to analysis or reporting by the various teams that rely on it, such as traders and risk managers. And herein lies the problem: all these operational data niggles feed into the overall cost. Even if a cap on exchange market data prices were enforced, or if banks ‘pooled’ their data collectively and made it available for less, that still would not make a drastic difference.
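To make the point concrete, the sketch below illustrates the kind of check/repair/validate/organise pass described above. It is a minimal, illustrative example only: the field names, the venue-to-currency reference table and the validation rules are hypothetical, not any exchange's or vendor's actual pipeline.

```python
# Minimal sketch (hypothetical fields and rules) of cleaning raw exchange
# price records before they are usable by trading or risk teams.
from dataclasses import dataclass

@dataclass
class PriceRecord:
    instrument_id: str        # e.g. an ISIN or exchange ticker
    price: float | None
    currency: str | None
    venue: str

# Assumed reference data: default quote currency per venue (illustrative).
VENUE_CURRENCY = {"XLON": "GBP", "XNYS": "USD"}

def clean(records: list[PriceRecord]) -> dict[str, list[PriceRecord]]:
    """Check, repair, validate and organise records by instrument."""
    by_instrument: dict[str, list[PriceRecord]] = {}
    for rec in records:
        # Check: discard records that cannot be used or repaired.
        if rec.price is None or rec.price <= 0:
            continue
        # Repair: fill a missing currency from venue reference data.
        if rec.currency is None:
            rec.currency = VENUE_CURRENCY.get(rec.venue)
            if rec.currency is None:
                continue  # still unusable
        # Organise: group validated records for downstream analysis.
        by_instrument.setdefault(rec.instrument_id, []).append(rec)
    return by_instrument
```

Even a toy pass like this has to be built, run and maintained by the consuming firm, which is exactly why these operational niggles feed into the true cost of exchange data.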

The only way that investment firms can plan for and manage data costs from exchanges is to underpin their market data strategy with an infrastructure that makes data operations as efficient as possible, and to oversee it with data governance policies that ensure the data is used judiciously. From an exchange’s perspective, members continue to demand access not just to pricing, but also to reference and corporate actions data on listed firms, regardless of the cost. The trouble is that in the modern world of equity trading, members are increasingly dependent on small basis-point movements and instantaneous reaction. As such, there has never been a greater need for exchanges to provide accurate, all-encompassing market data. After all, if the price of the data goes up, so too should the quality.

Moving forward, in an attempt to make better-informed decisions, analysts and traders are constantly searching for alternative data sets. And there is no question that better quality data is at the heart of the LSE’s interest in Refinitiv. Regardless of the other drivers behind this potential acquisition and those before it, the reality is that a smaller group of very dominant exchanges now has greater data firepower at its disposal. With this in mind, financial institutions need to remember one thing: data is a premium staple rather than a commodity. It comes at a price, and the more value that technology is seen to derive from it, the greater the premium that exchanges will be ‘justified’ in charging for it.