The role of high-frequency trading has once again bubbled up to the surface. HFT and its impact on markets are receiving greater scrutiny in the wake of the trading incident involving electronic market maker Knight Capital, as well as other well-publicized market failures.
While HFT is producing profound changes in markets, the discussion of whether these have been a net positive or negative will inevitably be colored by the perspective of whoever is advocating a given viewpoint.
What's needed more than anything else, according to experts, is an unbiased critique of HFT and its implications for market making, liquidity and volatility.
“All of the technical breakdowns, such as Bats, Nasdaq/Facebook and Knight have contributed to the perception that the markets are no longer a safe place,” said John Taft, chief executive of asset manager RBC Wealth Management in the U.S.
“There is a perception on the part of individual investors that the market is unsafe because of the preponderance of institutional and computer-driven trading.”
This perception, however, is likely mistaken.
Market quality has improved concurrently with changes in equity market regulation and technological advances, including the use of computer-based trading, financial market trade association Sifma noted in a 2011 report.
Computer-based trading, including some strategies described as HFT, “help to provide liquidity, ensure fair pricing and contribute to orderly markets”, the report said.
A More Nuanced View
Taft, a former chair of Sifma, takes issue with those who adopt a knee-jerk response to weaknesses in market structure by portraying high-frequency traders as bogeymen.
“The perception that HFTs are stepping in front of retail order flow and exacerbating market volatility is not true,” he said. “In fact, most academic research supports the opposite view, namely that HFT has improved liquidity and reduced execution costs, which benefits individual investors.”
The Knight case and those that preceded it illustrate the need for better monitoring, not less technology.
The recent case of Knight Capital, the big U.S. market maker that lost $440 million during a 45-minute software glitch, has inevitably rekindled the debate about whether the capital markets have become too reliant on technology, according to Rik Turner, senior analyst for financial services technology at consultancy Ovum Research.
Ovum argues that rather than advocating a less technological approach, the case proves the need for more, and better, monitoring systems.
“High-frequency trading is currently demonized in some circles for its perceived distortion of the market at moments such as the ‘flash crash’ [in May 2010 when the Dow Jones Industrial Average index plunged 1,000 points, almost 9%, only to recover within minutes], not to mention what some observers consider to be the unfair advantage it gives compared to retail investors, on account of the technology invested,” said Turner in a recent online posting.
“Ovum begs to differ, regarding HFT not as an aberration but as a natural consequence of the ever-increasing automation of trading generally,” he added.
The reaction to Knight Capital's technology glitch has fed a cynical pessimism about the role of technology in markets.
“Unfortunately, [Knight’s] internal systems failures have tipped off yet another crisis of confidence,” said Louis Lovas, director of solutions at OneMarketData, a New York-based financial analytics provider. “The code bug has ignited the ‘techno-luddites’ with warnings of market mayhem and the portents of another flash crash.”
Controls, Not Censure
There can be no denying that algorithmic trading is advancing rapidly due to sophisticated algo-development frameworks.
“While test infrastructure can validate new algorithms, it’s pre-trade risk that is the sentinel safeguarding the castle and ultimately the entire kingdom,” said Lovas. “But Knight’s mis-step is not an indictment of the whole algo-driven industry.”
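Pre-trade risk checks of the kind Lovas describes typically validate every outbound order against hard limits before it can reach the market. A minimal sketch of the idea, in Python (all class names and limit values here are hypothetical illustrations, not any firm's actual controls):

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str       # "buy" or "sell"
    quantity: int
    price: float

class PreTradeRiskGate:
    """Rejects orders that breach simple hard limits before they reach the exchange."""

    def __init__(self, max_order_qty=10_000, max_notional=1_000_000.0,
                 max_gross_position=50_000):
        self.max_order_qty = max_order_qty          # fat-finger size limit
        self.max_notional = max_notional            # per-order dollar limit
        self.max_gross_position = max_gross_position
        self.positions = {}                         # symbol -> signed share count

    def check(self, order: Order):
        """Return (accepted, reason); update positions only on acceptance."""
        if order.quantity > self.max_order_qty:
            return False, "order size limit breached"
        if order.quantity * order.price > self.max_notional:
            return False, "notional limit breached"
        signed = order.quantity if order.side == "buy" else -order.quantity
        projected = self.positions.get(order.symbol, 0) + signed
        if abs(projected) > self.max_gross_position:
            return False, "position limit breached"
        self.positions[order.symbol] = projected
        return True, "accepted"
```

The key design point is that the gate sits in the order path itself: a runaway algorithm can generate as many orders as it likes, but none that breach the limits ever leave the firm.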
Knight’s software glitch puts the spotlight firmly back on to the use of technology to underpin activities in the capital markets, and “calls into question the wisdom of automating trades by instantiating strategies in software algorithms”, said Turner of Ovum.
However, a more nuanced view is that technology is a response to a more complex, fragmented market.
“Reliance on technology in the capital markets generally, and on automated strategies in particular, is no more than a logical response to evolving market conditions,” said Turner. “Indeed, there is no real choice for market participants but to equip themselves with the latest technology if they are to be in the game with anything other than a long-only strategy.”
The need for thoroughly vetting algorithmic trading strategies and for rigorous software development discipline is amply illustrated by the Knight incident and similar episodes.
“Market participants need to debug their code thoroughly, as well as test the system on a demo account at great length before releasing it in the wild,” said Matthew Hors, managing director at Varick Capital, an equity raising and capital introduction firm. “I also feel regulators need to tighten things up with regard to black box systems. More specifically, there need to be ways to control the volatility resulting from HFT.”
Indeed, a joint report by the U.S. Securities and Exchange Commission and the Commodity Futures Trading Commission on the May 2010 flash crash concluded that HFT had made the market “so fragmented and fragile that a single large trade could send stocks into a sudden spiral”.
Sifma has come out in support of regulatory proposals to address the possible adverse consequences of computer-based trading, such as limiting excessive market data traffic by providing incentives to reduce quote-to-trade ratios; reviewing maker-taker pricing, rebates and access fees; setting market maker incentives and obligations; and commissioning additional empirical studies on factors causing increased volatility, including HFT.
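The quote-to-trade ratio mentioned above is a simple message-traffic metric: how many quote messages a firm sends per trade it actually executes. A high ratio suggests a strategy that floods the market with quotes it rarely intends to fill. A minimal sketch of how such a ratio could be computed from a message log (the log format here is hypothetical):

```python
from collections import Counter

def quote_to_trade_ratios(messages):
    """Compute per-firm quote-to-trade ratios.

    messages: iterable of (firm, msg_type) pairs, where msg_type is
    "quote" for a quote/cancel message or "trade" for an execution.
    Firms with quotes but no trades get a ratio of infinity.
    """
    quotes, trades = Counter(), Counter()
    for firm, kind in messages:
        if kind == "trade":
            trades[firm] += 1
        else:
            quotes[firm] += 1
    return {
        firm: (quotes[firm] / trades[firm] if trades[firm] else float("inf"))
        for firm in set(quotes) | set(trades)
    }
```

A regulator or exchange applying an incentive scheme of the kind Sifma describes would charge fees or withhold rebates for firms whose ratio exceeds some agreed threshold.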
However, Sifma strongly opposes banning HFT or other forms of computer-based trading, or imposing artificial limits on technological advances, such as minimum quote durations and limiting co-location.
“What companies need to do is implement monitoring systems that enable them to nip such events in the bud,” said Turner of Ovum. “Rather than letting the rogue algorithm run amok for 45 minutes, Knight should have had a system in place to detect and flag the software’s aberrant behavior and minimize its effect.”
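The kind of detect-and-flag monitoring described above can be as simple as a kill switch that halts a strategy when its order rate bursts past a hard cap. A minimal sketch, with hypothetical threshold values, of a sliding-window rate monitor in Python:

```python
import time
from collections import deque

class AlgoKillSwitch:
    """Halts a strategy when it sends too many orders within a sliding time window."""

    def __init__(self, window_seconds=1.0, max_orders_per_window=50):
        self.window = window_seconds
        self.max_orders = max_orders_per_window
        self.timestamps = deque()   # send times of recent orders
        self.halted = False

    def record_order(self, now=None):
        """Register one outbound order; return False if trading is (now) halted."""
        if self.halted:
            return False
        now = time.monotonic() if now is None else now
        self.timestamps.append(now)
        # Drop orders that have aged out of the window.
        while self.timestamps and now - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        if len(self.timestamps) > self.max_orders:
            self.halted = True   # aberrant burst detected: stop the strategy
            return False
        return True
```

Real deployments layer further checks on top (position drift, fill ratios, P&L swings), but the principle is the same: the monitor runs independently of the strategy logic and can cut it off within seconds rather than minutes.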