
July 2010

Tuesday, July 27, 2010

Smart - but is it Smart Enough?

Posted by John Bates

Nasdaq liked the idea of real-time market surveillance so much that it bought one of the companies that specialize in it.

 

Nasdaq OMX announced this week that it will buy Smarts Group, an Australia-based market surveillance business that helps exchanges monitor compliance with trading rules.  You can read the full story here: http://online.wsj.com/article/BT-CO-20100727-712964.html.

 

The market moves much faster than it used to, thanks to algorithmic trading. What has not kept pace is the monitoring of high-speed trading. Smarts is one commercial approach that aims to enable such monitoring. However, there is a big problem with Smarts: the time it takes to develop a new surveillance scenario. I have spoken to a number of venues around the world, including the Australian Stock Exchange, and they have told me they are totally dependent on Smarts to add new rules when they need them – and that it takes six months to a year, if they are lucky. In fast-moving markets we need to evolve in hours, not years!

 

Despite Smarts' shortcomings, the Nasdaq acquisition is an indicator of the importance of real-time surveillance in a post-flash-crash world. Maybe the flash crash of May 6th has a silver lining, if the lessons learned lead exchanges to make better use of surveillance and monitoring. In the aftermath of the crash, exchanges scrambled to recover trading data and do some forensic investigation into the causes. This proved extremely difficult, probably because the analysis capabilities needed to pinpoint what had happened were inadequate.

 

Exchanges, ECNs, brokers, traders and regulators all must take an intelligent approach to monitoring and surveillance in order to prevent rogue trades and fat fingers. Transparency is the key. Regulators in the US and Europe are concerned about the lack of transparency in markets where high frequency algorithmic trading takes place, as well as in dark pools.

 

We ran a survey at SIFMA this year where we asked 125 attendees about high frequency trading and market surveillance. A staggering 83 percent said that increased transparency is needed to effectively deal with market abuse and irregular market activity, such as the flash crash. However, only 53 percent of firms surveyed currently have real-time monitoring systems in place.

 

Nasdaq says that Smarts will be used to expand broker surveillance solutions, which I take to mean monitoring scenarios such as sponsored access. This would be a smart move (forgive the pun). With naked access, high frequency traders can plug straight into an exchange through their broker – and it's critical that pre-trade risk checks and surveillance are in place to prevent a crisis in which wild algos could cause havoc.

 

The detection of abusive patterns or fat-fingered mistakes must happen in real time, ideally before they have a chance to move the market. This approach should be taken on board not just by the regulators, but by the industry as a whole. Only then can the industry stay one step ahead of the market abuse and trading errors that cause a meltdown (or melt-up).

 

As many market participants have pointed out, technology can't solve all of the problems, but it can help to give much more market transparency. To restore confidence in capital markets, organizations involved in trading need to have a much more accurate, real-time view on what's going on. In this way, issues can be prevented or at least identified much more quickly.

 

While I applaud Nasdaq's initiative and dedication to improving market surveillance by buying Smarts, I must point out that you don't have to go quite that far to get the same results. Progress provides market-leading real-time monitoring, surveillance and pre-trade risk management – powered by Complex Event Processing – enabling complex real-time monitoring of the fastest-moving markets. Unlike Smarts, Progress enables business users to customize and create new scenarios rapidly (in hours rather than Smarts' months). And you don't have to buy and integrate our company to get access to it!

Thursday, July 22, 2010

Beware the weight-challenged digits

Posted by John Bates

Fat fingers (or weight-challenged digits to my politically correct friends) have had a good run lately. First we heard that Deutsche Bank had to close its quantitative trading desk in Japan after an automated trading system misread equities market data. The system generated a massive sell order that caused the Nikkei 225 Stock Average to dive (full story here: http://tinyurl.com/23rnn5v). Then an unknown trader spiked the Swedish krona and a computer glitch at Rabobank smacked sterling by about 1%, according to the Wall Street Journal (http://tinyurl.com/2el9kgw).

Although the press was surprised that the efficient foreign exchange market was susceptible to trading errors, it is just as vulnerable as equities or futures. In FX, trades are often sent directly to a trading destination such as EBS, Reuters or Currenex – and in many institutions they are sent without adequate pre-trade checking or risk management applied.

As my colleague, Deputy CTO Dr. Giles Nelson, told the Wall Street Journal: “The consensus in the market is that this was a computer-based trading error, but ultimately there would have been a human involved somewhere.”

Human error is part of being human. The reality of highly automated trading is that the programs are built by humans and run by super-fast machines. And unless there are robust computerized checking mechanisms that vet trades before they hit the markets, errors can wreak havoc in the blink of an eye.

Deutsche Bank's algorithms generated around 180 automated sell orders worth up to 16 trillion yen ($183 billion) and about 50 billion yen's worth were executed before the problem was addressed. The Rabobank mistake could have dumped £3 billion worth of sterling into the market in one lump, rather than splitting it up to lower market impact - but luckily the bank spotted the error and stopped the trade before it was fully completed. The Swedish krona mistake sank the krona against the euro by 14% before it was spotted. 

Pre-trade risk checks would help to prevent errors, trade limit breaches, or even fraudulent trading from occurring. And pre-trade risk controls need not be disruptive. Ultra-low-latency pre-trade risk management can be achieved by trading institutions without compromising speed of access. One option is a low-latency "risk firewall" built around complex event processing, with checks that can be benchmarked in microseconds.

With a real-time risk solution in place, a message can enter through an order management system, be run through the risk hurdles and checks, and leave for the destination a few microseconds later. The benefits of being able to pro-actively monitor trades before they hit an exchange or ECN or FX platform far outweigh any microscopic latency hops. They include catching fat fingered errors, preventing trading limits from being breached, and even warning brokers and regulators of potential fraud - all of which cost brokers, traders and regulators money. 
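To make the "risk firewall" idea concrete, here is a minimal sketch of the kind of pre-trade vetting described above: every order passes through a few limit checks before it leaves for the venue. All class names, thresholds and the check logic are illustrative assumptions of mine, not Apama's API or any venue's actual rules.

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str       # "buy" or "sell"
    quantity: int
    price: float

class RiskFirewall:
    """Hypothetical pre-trade checks run before an order is routed out."""

    def __init__(self, max_order_value, max_quantity, price_bands):
        self.max_order_value = max_order_value  # notional cap per order
        self.max_quantity = max_quantity        # size cap (fat-finger guard)
        self.price_bands = price_bands          # symbol -> (low, high) collar

    def check(self, order):
        """Return (ok, reason) for a single inbound order."""
        if order.quantity > self.max_quantity:
            return False, "fat-finger quantity limit breached"
        if order.quantity * order.price > self.max_order_value:
            return False, "notional value limit breached"
        band = self.price_bands.get(order.symbol)
        if band and not (band[0] <= order.price <= band[1]):
            return False, "price outside collar"
        return True, "accepted"
```

In use, a sane order passes and a fat-fingered one is rejected before it can reach the market: `RiskFirewall(1_000_000, 10_000, {"XYZ": (9.0, 11.0)}).check(Order("XYZ", "buy", 50_000, 10.0))` fails the quantity check.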

India – big potential for algorithmic trading

Posted by Giles Nelson

I spent last week in India, a country that, by any standards, is growing fast.  Its population has doubled in the last 40 years to 1.2B and economic growth has averaged more than 7% per year since 1997.  It’s projected to grow at more than 8% in 2010. By some measures, India has the 4th biggest economy in the world. 

Progress has a significant presence in India. In fact, people-wise, it’s the biggest territory for Progress outside the US with over 350 people. Hyderabad is home to a big development centre and Mumbai (Bombay) has sales, marketing and a professional services team.

The primary purpose of my visit was to support an event Progress organised in Mumbai on Thursday of last week on the subject of algorithmic trading. It was also our first real launch of Progress and Apama, our Complex Event Processing (CEP) platform, into the Indian capital markets. We had a great turnout of over 100 people. I spoke about what we do in capital markets and then took part in a panel session, where I was joined by the CTO of the National Stock Exchange, the biggest in India, a senior director of SEBI, the regulator, and representatives from Nomura and Citigroup. A lively debate ensued.

The use of algorithmic trading is still fairly nascent in India, but I believe it has a big future. I’ll explain why soon, but I’d like first to give some background on the Indian electronic trading market, particularly the equities market, which is the largest.
 

The market
India has several competing markets for equities, futures and options, commodities and foreign exchange. In equities, the biggest turnover markets are run by the National Stock Exchange (NSE) and the Bombay Stock Exchange (BSE), with market shares (by number of trades) of 74% and 26% respectively. Two more equity exchanges plan to go live soon – the Delhi Stock Exchange is planning to relaunch and MCX is currently awaiting a licence to launch. This multi-market model, only recently adopted in Europe, for example, has been in place in India for many years.

It was only two years ago that direct market access (DMA) to exchanges was allowed. Although official figures don't exist, the consensus opinion is that about 5% of volume in equities is traded algorithmically, and between 15% and 25% in futures and options. Regulation in India is strong – no exchange allows naked access, and the BSE described to me some of the strongest pre-trade risk controls I've come across: collateral checks on every order before it is matched. The NSE has throttling controls which impose a limit on the number of orders a member organisation can submit per second; members can be suspended from trading intra-day if this is exceeded. The NSE also forces organisations that want to use algorithms to go through an approval process – I'll say more about this later. Controversially, the NSE will not allow multi-exchange algorithmic strategies, so cross-exchange arbitrage and smart order routing cannot take place. Lastly, a securities transaction tax (STT) is levied on all securities sales.
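The per-second order throttle described above can be sketched as a sliding-window counter: orders within the cap pass, and an order that would breach the cap is rejected and the member flagged. The sliding-window design and all names here are my own illustration, not the NSE's actual mechanism.

```python
from collections import deque

class OrderThrottle:
    """Hypothetical per-member cap on orders per second."""

    def __init__(self, max_orders_per_second):
        self.limit = max_orders_per_second
        self.timestamps = deque()  # send times within the trailing second

    def allow(self, now):
        """Return True if an order sent at time `now` (seconds) is within the cap."""
        # Drop sends that fell out of the one-second window.
        while self.timestamps and now - self.timestamps[0] >= 1.0:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.limit:
            return False  # cap exceeded: candidate for intra-day suspension
        self.timestamps.append(now)
        return True
```

With a cap of 2 orders per second, a third order 0.2 seconds after the first is rejected, but a fresh order 1.5 seconds in is allowed again once the window has emptied.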

So, with the above restrictions, why do I think that the Indian market for algorithmic trading has massive potential?
 

The potential
The Indian market is very big – surprisingly so to many people. Taking figures from the World Federation of Exchanges (thus not counting trading on alternative equity venues such as European multilateral trading facilities), the Indian market in dollar value may still be relatively modest – it's the 10th largest. However, when you look at the number of trades, India is the 3rd largest market, beaten only by the US and China. The NSE, for example, processes 10 times the number of trades of the London Stock Exchange. So why isn't more traded in dollar terms? Because trade sizes on Indian exchanges are very small: the median figure worldwide is about $10K per trade, while the figure in India is about $500 per trade – a 20th of the size. Surely the task of taming this number of trades, and the orders that go with them, is ideal for algorithmic trading to give an edge? To compare with another emerging "BRIC" economy, Brazil – where the number of firms using Apama has gone from zero to over 20 in as many months – the dollar market size is fairly similar, but the number of equity trades in India is 33 times greater. The potential in India is therefore enormous.

India is already there in other ways. All exchanges offer co-location facilities to their members, and debate has already moved on to the question, familiar in more developed markets, of whether co-location gives certain firms an unfair advantage and whether its provision should be regulated.

 

The challenges
There are some difficulties. The STT is seen by some as an inhibitor. However, its effect is offset somewhat by the fact that securities traded on exchange are not subject to capital gains tax. 

The NSE's process for approving algorithms is more controversial. Firms that want to trade algorithmically must show the NSE that certain risk safeguards are in place and "demonstrate" the algorithm to the exchange. As the biggest exchange, the NSE wields considerable power, and its decision to vet algorithms thus puts a brake on market development. I believe this process to be unsustainable, for the following reasons:

  1. As the market develops there will simply be too many algorithms for the NSE to deal with in any reasonable timeframe. Yes, India is a low-cost economy, but you need highly trained people to analyse algorithmic trading systems; you can't simply throw more people at the problem. Firms will want to change the way their algorithms work on a regular basis, and they cannot do that with this process in place.
  2. It raises intellectual property issues. Brokers will increasingly object to revealing parts of their algorithms, and their clients, who may want to run their alpha-seeking algorithms on a broker-supplied co-location facility, will most definitely object.
  3. It puts the NSE in an invidious position. Eventually an algo will "pass" the process and then go wrong, perhaps adversely affecting the whole market, and the NSE will have to take some of the blame.
  4. Competition will force the NSE's hand. The BSE is trying aggressively to take back market share, and other exchanges are launching which will not have these restrictions.

It strikes me that the NSE should put its effort into ensuring that it protects itself better. Perhaps a reasonable comparison is a website protecting itself from hacking and denial-of-service attacks. If websites can do it, so can an exchange – and it would offer much better protection for the exchange and the market in general.
 

In conclusion
I’m convinced of the growth potential in India for algo trading. The market is large, the user base is still relatively small and many of the regulatory and technical prerequisites are in place. There are some inhibitors, outlined above, but I don’t think they’ll hold the market back significantly. And finally, why should India not adopt algo trading when so many other, and diverse, markets have?

Progress has its first customers already in India. I look forward to many more. 

Wednesday, July 21, 2010

Defending Against the Algo Pirates

Posted by John Bates

It was an honor to sit on the CFTC Technology Advisory Committee (TAC) last week. I was very impressed with the presentations and discussion, chaired ably by Commissioner Scott O’Malia. I was also impressed by the other Commissioners and with my fellow committee members. This week the CFTC has been discussing new rules to handle disaster recovery and has also received further coverage on one topic discussed at the TAC – that of pirate algos attacking algos going about their normal trading business and aiming to manipulate the market.

 

Further coverage can be seen in this article “CFTC posits new disaster recovery rules as regulators probe 'algo price pirates'”

 

http://www.finextra.com/news/fullstory.aspx?newsitemid=21610

 

The CFTC has a sensible proposal on the table to require exchanges and clearing houses to have effective disaster recovery plans in order to quickly recover from any market-wide disruption. After 9/11 it became clear that many NYC-based financial services firms were not prepared for a disaster of that magnitude, and subsequently took disaster recovery (or business continuity as it came to be known) very seriously. Now it is time for those virtual businesses - exchanges and ECNs - to do the same.

 

Operational risk is a very real issue in today's fast-moving markets, where anything can go wrong. Being able to recover and quickly start trading again – across all exchanges and destinations – is paramount. The May 6th 'flash crash' gave us a glimpse of what can happen when something goes wrong at one exchange and the rules across other exchanges are not harmonious.

 

The flash crash was a man-made event exacerbated by machines. Algorithms are programmed to do as they are told, and if one destination is not responding they will hunt down and ping, scrape and trade on whatever others they can find. Sometimes this can have unfortunate consequences for the market as a whole. This is why there must be consistency across trading venues in how they respond to crises.

 

At the CFTC's Technology Advisory Committee meeting last week, there were several interesting observations about high frequency trading and algos. We heard new analysis of the flash crash from trade database developer Nanex LLC. The Nanex report suggested that predatory practices such as "quote stuffing", where algos try to prevent other high-frequency traders from executing their strategies, may have contributed to the crash. Commissioner Chilton of the CFTC (whom I had the pleasure of sitting next to at the TAC last week), the TAC and the SEC are taking these claims very seriously. Commissioner Chilton expressed his concern that there are algorithms out there hunting down and interfering with other algorithms, calling them 'algo price pirates' and suggesting they may trigger a new enforcement regime. Now, I believe that firms and their algos are going to monitor the market with the goal of figuring out how your algos work and devising a strategy to capitalize – that's just the natural order of capitalism. However, that's different from using algo terrorism to bully the market into behaving a particular way. That's something we need to watch for and prevent, for it causes damage.

 

If such 'pirates' are to be policed and caught, the regulators will have to sail with the pirates in shark-infested high frequency waters. Surveillance and monitoring are critical, as is the need for speed. The speed at which algorithms can pump quotes into a destination is daunting, so the policemen will also need to work at ultra high velocity. I was a little concerned when Commissioner Chilton said at the TAC meeting: "Just because you can go fast it doesn't mean you should." I know where he’s coming from but would modify the statement to say that in HFT it is critical to go fast to be competitive – but you need the proper best practices, training and safety precautions. High frequency trading, if properly monitored, need not be scary or evil. It can contribute to liquidity and market efficiency, and provide alpha generation. To truly address the HFT issue, real time market surveillance technology must be adopted to monitor and detect patterns that indicate potential market abuse such as insider trading or market manipulation. Or pirate algorithms trying to board your strategy ship and take off with the gold doubloons. 
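One surveillance heuristic for the quote-stuffing pattern discussed above is to flag a participant whose quote rate in a short window is extreme while almost none of those quotes result in executions. This is a toy sketch under my own assumptions – the thresholds and function are hypothetical, and a real deployment would run in-stream on a CEP engine rather than on batch counts.

```python
def quote_stuffing_suspect(quotes_in_window, trades_in_window,
                           max_quote_rate=1000, min_fill_ratio=0.001):
    """Flag a one-second window as suspicious quote-stuffing activity.

    Suspicious means: an extreme number of quotes, of which almost
    none actually traded (a vanishingly small fill ratio).
    """
    if quotes_in_window < max_quote_rate:
        return False                       # quoting rate is unremarkable
    fill_ratio = trades_in_window / quotes_in_window
    return fill_ratio < min_fill_ratio     # huge quoting, almost no trading
```

For example, 5,000 quotes in a second with a single execution trips the flag, while the same quote burst with 50 executions does not – heavy but genuine quoting is left alone.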

Tuesday, July 13, 2010

CFTC Launches Technology Advisory Committee

Posted by John Bates

Yesterday the CFTC, the regulator in charge of Futures and Options markets, announced details of a new Technology Advisory Committee (TAC), chaired by the very capable Commissioner Scott O’Malia. See article here:

 

http://www.reuters.com/article/idUSN126207520100712

 

I am absolutely delighted to be included in the group of experts that the CFTC has called together to form the TAC. I am joined by an extraordinary group of some of the industry's top executives from banks, brokers, trading firms, exchanges and clearing firms as well as some very impressive academics. On Wednesday, July 14th (tomorrow as I write) we will meet to discuss the impact of high frequency and algorithmic trading on the markets, including whether algorithms may be implicated in the May 6th 'flash crash'. From this, we’ll discuss what recommendations we have for regulation of and/or best practices for algorithmic and high frequency trading.

 

High frequency and algorithmic trading are essential for efficient execution and alpha generation in a complex, multi-asset, fast-moving world. However, there are a number of accusations that have been made against these forms of trading, including that they may aggravate volatility and may even have caused the ‘flash crash’. I believe evidence from the TAC participants will exonerate the accused.

 

I am hoping that our meetings will result in solutions that not only head off future 'flash crashes', but also help exchanges, banks and brokers to better monitor and police trades. The proactive use of real-time monitoring systems can alert regulators to problems before they become a crisis. Monitoring technology can 'see' major price and volume spikes in particular instruments, how often they happen and maybe even why, and whether a pattern in market behavior caused them. It can also tell how much trading is potentially market abuse; for example, insider trading might be detected by correlating unusual trading incidents with news releases and market movements. (The FSA, for example, thinks that 30% of trading around acquisitions is insider.)
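The price-spike monitoring described above can be sketched very simply: compare each new price move against recent volatility and flag extreme outliers for an analyst. The window length and threshold below are illustrative assumptions, not a production surveillance rule.

```python
import statistics

def spike_alerts(prices, window=20, threshold=4.0):
    """Return indices into `prices` where the one-step return is an
    extreme outlier versus the standard deviation of returns over the
    trailing window (a crude real-time spike detector)."""
    returns = [(b - a) / a for a, b in zip(prices, prices[1:])]
    alerts = []
    for i in range(window, len(returns)):
        recent = returns[i - window:i]
        sigma = statistics.pstdev(recent)
        if sigma > 0 and abs(returns[i]) > threshold * sigma:
            alerts.append(i + 1)  # index of the spiking tick in `prices`
    return alerts
```

A series that oscillates gently around 100 and then drops to 90 in a single tick produces one alert at that final tick; a real system would also correlate such alerts with volume, news and order-flow patterns.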

 

It is now possible to apply high frequency techniques to not just trading – but also to market monitoring, surveillance and pre-trade risk checks – for regulators, exchanges and brokers. The technology is out there (with proven approaches built on next generation platforms such as CEP) and it needn't be expensive. The CFTC's TAC is a positive step in the right direction. I look forward to the meeting and will let you know how it goes! Follow me on Twitter @drjohnbates where I'll Tweet when possible.