
Real-Time Risk Reduction Strategies

Written by Terry Flanagan | Mar 13, 2012 7:17:00 PM

At its core, Big Data is about extracting value from the mass of information swirling about capital markets firms.

“Big Data refers to the complexity of simultaneously aggregating, integrating, consolidating and analyzing data from multiple sources in near real time, helping generate actionable information,” said Marc Alvarez, senior director, reference data infrastructure, at financial information supplier Interactive Data Corp.

While the issues of Big Data, and the tools for addressing it such as business intelligence and data mining, have been around for a long time, the term has come to encapsulate the exploding volumes of data brought on by changes in market structure, technology and regulations, a flood that demands a more creative approach.

“Big Data has three pillars,” said Marcus Kwan, vice-president of product strategy and design at CQG, a provider of real-time and historical data, graphics and technical analysis tools. “The first is determining business need, the second is collection and distribution, and the third is analytics and execution.”

“The speed at which data needs to be collected, processed and delivered has grown exponentially,” said Kwan. “Even more complex, by orders of magnitude, is applying analytics to that data.”

Capturing and making sense of real-time market conditions and event data are driving the adoption of technology for performing real-time analytics.

“I get the most out of market data when the tools we use can aggregate data across multiple sources; we can get financial data, price data and shareholder patterns all in a single interface,” said Pratik Sharma, managing director at Atyant Capital, a hedge fund specializing in Indian equities and precious metals. “I also need to run analytics, for example, looking at the correlation between Berkshire Hathaway’s share price and the gross money supply.”
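That kind of analysis can be expressed in a few lines. The following is a minimal sketch, not Atyant Capital’s actual tooling, assuming a pandas-style workflow and using synthetic placeholder series in place of real share-price or money-supply data, of how an analyst might compute a full-sample and a rolling correlation:

```python
# Minimal sketch: correlating a share-price series with a money-supply series.
# The data below is synthetic placeholder data for illustration only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=0)
dates = pd.date_range("2010-01-31", periods=24, freq="M")

# Placeholder series standing in for, e.g., a share price and a broad
# money-supply measure such as M2.
share_price = pd.Series(100 + np.cumsum(rng.normal(0.5, 2.0, len(dates))), index=dates)
money_supply = pd.Series(8500 + np.cumsum(rng.normal(20.0, 15.0, len(dates))), index=dates)

# Full-sample correlation of month-over-month changes.
corr = share_price.pct_change().corr(money_supply.pct_change())
print(f"Correlation of monthly changes: {corr:.2f}")

# A rolling 12-month correlation gives the evolving relationship an analyst
# might chart alongside both series in a single interface.
rolling = share_price.pct_change().rolling(12).corr(money_supply.pct_change())
print(rolling.dropna().tail())
```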

Financial institutions are finding that the flexibility to price more complex or illiquid instruments is a key concern.

“Hedge funds are making markets in illiquid areas and need a pricing and risk solution they can trust and that they can get up and running quickly,” said Bob Park, chief executive of FINCAD, a provider of over-the-counter pricing software. “A lot of the legacy systems might not be taking a modern approach to price derivatives and build curves.”

This has led to a lot of interest in FINCAD’s F3 product platform.

“There is still a significant and growing need to use derivatives as part of the financial toolset,” Park said. “We have seen a growing number of diverse FINCAD users and prospects that are looking for a variety of analytics and risk tools for both hedging and speculative purposes.”

StreamBase, a provider of complex-event processing (CEP) technology, has launched StreamBase LiveView, which enables firms to address the challenges of managing high-speed order flow and reacting in real time.

“Front desk trading activities have long been measured by milliseconds, but middle and back offices still struggle to maintain a real-time view of positions, markets and operations,” said Richard Tibbetts, chief technology officer at StreamBase. “We will see an increased pressure on keeping risk management and compliance applications as close to the speed of trade executions as possible.”

StreamBase LiveView consumes data from streaming real-time data sources, creates an in-memory data warehouse and provides push-based query results and alerts to end-users.
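StreamBase’s own APIs are not shown here; the following is a generic Python sketch of that pattern, with hypothetical names, in which streaming events update an in-memory aggregate and fresh results or alerts are pushed to subscribers as the data changes:

```python
# Generic sketch of a push-based live view -- not StreamBase's API.
# Events stream in, an in-memory table of aggregates is updated, and
# subscribers receive pushed results (or alerts) on every change.
from collections import defaultdict
from typing import Callable

class LiveViewSketch:
    def __init__(self) -> None:
        self.positions: dict[str, float] = defaultdict(float)    # in-memory aggregate by symbol
        self.subscribers: list[Callable[[str, float], None]] = []

    def subscribe(self, callback: Callable[[str, float], None]) -> None:
        """Register a push-based consumer of query results and alerts."""
        self.subscribers.append(callback)

    def on_trade(self, symbol: str, signed_qty: float) -> None:
        """Consume one streaming trade event and push the updated view."""
        self.positions[symbol] += signed_qty
        for push in self.subscribers:
            push(symbol, self.positions[symbol])

# Example: alert when any net position moves beyond a hypothetical limit.
view = LiveViewSketch()
view.subscribe(lambda sym, pos: print(f"ALERT {sym}: {pos}") if abs(pos) > 1_000 else None)
for event in [("XYZ", 600.0), ("XYZ", 700.0), ("ABC", -200.0)]:
    view.on_trade(*event)
```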

At the bank for which it was developed, StreamBase LiveView handles around 10 million trading events a day, streaming from 12 event-based systems within the firm.

“Trading and risk systems need to connect to multiple trading platforms, aggregating and distributing prices to clients and analytic systems,” said Tibbetts. “These all present not only new challenges but also opportunities for market participants to differentiate themselves with a broader, deeper and real-time awareness of market conditions.”

As the frequency of risk reporting has increased, from end of day to intraday and on to real time or near real time, so has the need to gather and analyze data.

“The movement towards real-time risk monitoring is one of the most common use cases we encounter for complex-event processing,” said Dominic Iannacone, director of business development at enterprise software maker Sybase.

The focus in the last couple of years has shifted from pure trading speed to gaining intraday views of a firm’s risk position, which can fluctuate rapidly during the course of a trading day.

During the early to mid-2000s, firms were intent on increasing the speed of trading, which is now measured in microseconds or nanoseconds. By 2008, however, the emphasis had begun to shift from blinding speed to gaining insight into a firm’s exposures in real time or near real time.

Paradoxically, a brute force approach of throwing more technology at the problem is not the solution.

“There’s a diminishing return on deploying faster hardware to get to the same answer as your competitors more quickly,” said Iannacone. “The solution lies not just in speed, but in the ability to spot something new using analytics.”

In fact, speed can diminish the quality of analytics by magnifying the impact of wrong decisions, just as a ship or airplane traveling at high speed will stray ever farther from its destination, and take far longer to reach it, if its compass is off by even a single degree.

“As a market data provider, our platform can do anything except give a specific answer,” said CQG’s Kwan. “It’s not an issue of technology, but of asking the right questions. Otherwise, you’ll just get the wrong answer faster than anybody else.”