Business Event Processing

Friday, June 17, 2011

Still big room for growth in Japan

Posted by Giles Nelson

And from Mumbai on to Tokyo. In so many ways, a bigger contrast between cities is difficult to imagine. 

Japan has, of course, had a tough time of it recently. Not only have the recent earthquake and tsunami knocked the country back, but Japan has also had a long period of relative economic stagnation compared to other Asian economies. It also stands apart from other developed economies in algorithmic trading, where the proportion of trading done algorithmically remains relatively low.

There’s little consensus on what this proportion is, however. In a recent report, Celent, the financial market analyst firm, put the figure at around 25% of trading in 2010. Figures from the Tokyo Stock Exchange (TSE) show that around 35% of orders submitted to the exchange come from co-location facilities – it is reasonable to assume that nearly all of these could be regarded as “algorithmic”. From my conversations in Tokyo with people who work at exchanges, sell-side firms and our customers and partners, I’m going to put the figure at between 40% and 50%.

That means there’s a lot of room for growth when you consider that the proportion of securities traded algorithmically, in one way or another, in the US and Europe nears 100%. One inhibitor to growth has now been removed. In 2010, the TSE launched a new exchange platform, Arrowhead, which reduced latency from up to two seconds down to around 5 milliseconds. In other words, the TSE is now “fit for purpose” for algorithmic trading and, in particular, for high frequency trading. Previously, with latencies so long, high frequency firms that wanted, for example, to market-make on the TSE simply weren’t prepared to take on the exposure and uncertainty that such high latencies bring. Since Arrowhead's launch in January 2010, and according to the TSE’s own figures, the total number of orders on the TSE has risen by a modest 25%, but the proportion of orders submitted from co-location facilities has more than doubled.

Progress exhibited and spoke at Tradetech Japan, and we were joined on our stand by our partner, Tosho Computer Systems, with which we’re working on a service in Japan that we will be launching later this year. There’ll be more news on that nearer the time. Attendance-wise, Tradetech was down on its 2008 peak in Japan, but up on previous years – a reflection of the renewed interest in trading technology generally.

Market surveillance was one of the key topics that came up in Tradetech Q&A and panel discussions. This is common across pretty much any market now, with the essential question being: how can markets be kept safe as they get faster and more complex? Some say there should be restrictions, and whilst circuit breakers, insistence on pre-trade risk checks and similar measures are important, over-emphasis on “the dangers” can hold markets back. Progress’ view is that regulators and exchanges should all move towards real-time market surveillance.

There’s a lot of emphasis at the moment in European and US markets on OTC derivatives regulation and the move of trading in such instruments onto exchanges. Japan is relatively advanced in this regard with regulation requiring many domestic OTC derivatives to be cleared, a trend which is happening elsewhere in Asia too more quickly than in Europe and the US. 

Regionally, Japan is the biggest developed market in terms of share trading volume – twice as big in dollar terms as the next biggest, Hong Kong. But Japan is itself now dwarfed by China, and I’ll be writing about that next.


Monday, February 07, 2011

The Trouble with Algorithms: Wild Children or Reckless Parents?

Posted by Dan Hubscher

Algorithms and high frequency trading have been blamed for everything from the credit crisis to the May 6th flash crash and high speed market abuse, and have attracted unwanted interest from regulators on both sides of the pond. But questions remain as to whether these tools are really computer models gone wild, or whether they are the spoiled children of reckless parents – regulation.

The definition of reckless is to be utterly unconcerned about the consequences of an action. One could argue that Regulation National Market System (Reg NMS) was designed without regard to some of the consequences down the line. Blaming the wild children, algorithms, is to ignore that the parents – Reg NMS – were somewhat reckless in designing the system.

In a blog on the TABB Forum on January 24th,  Steve Wunsch of Wunsch Auction Associates explained that the system was working the way it had been designed.

"What really went wrong in the stock market on May 6? Prices aside, all of the plumbing was working fine. Not only were there no fat fingers, rogue algos, manipulators or terrorists at work, there were no significant breakdowns of order routing systems or data systems or any other elements of the stock trading infrastructure," wrote Wunsch.

Meanwhile, the National Commission on the Causes of the Financial and Economic Crisis in the United States released its report (Jan. 27th) and HFT was not mentioned at all. Nor were algorithms, as such, but 'computer models' were vindicated. The report said: "The crisis was the result of human action and inaction, not of Mother Nature or computer models gone haywire."

And it criticized regulators for not doing their jobs: “Widespread failures in financial regulation and supervision proved devastating to the stability of the nation’s financial markets.”

The result of the credit crisis and market meltdown of September 2008 was the Dodd-Frank Act, which attempts to prevent another September 2008. But the flash crash insinuated itself into the picture, revealing that no one had baked that possibility into the market reforms. And, ironically, the market reforms set the stage for more flash crashes.

At the Tabb Forum Derivatives Reform Event a couple of weeks ago, many commented that Dodd-Frank puts in place a market structure that injects the equities and futures market model – fragmentation, price transparency, streaming quotes – into other asset classes. This theoretically invites algorithmic and high frequency trading, and with them the threat of more flash crashes. At the event, Peter Fisher of BlackRock said that what keeps him up at night is a flash crash in the interest rate market, citing the market structure argument, and specifically pointed out that this possibility was not envisioned in Dodd-Frank.

With more and more asset classes becoming tradable electronically, partly thanks to mandated swap execution facilities (SEFs), the possibility of truly wild or rogue algos and market abuse becomes increasingly inevitable. And, as we pointed out last week, the very real possibility of a flash crash splashing across asset classes - we call it a "Splash Crash" - rears its ugly head.

Although the evidence against algos gone wild is thus far mostly anecdotal, the belief that they can and will go wrong permeates the industry. Market abuse such as insider trading and manipulation is undoubtedly more prevalent. Fat finger errors are easier to prove, and are a fact of life in a high speed, high stress electronic marketplace.

Stay Calm and Remain Vigilant

The antonym of recklessness is vigilance. The regulatory parents must be more vigilant when it comes to their arguably brighter and naughtier children - algorithms and HFT. With algorithms and HFT come the possibility of mistakes and abuse. Many more firms outside of the equities world are embracing HFT, and their inexperience can cause market disruptions. A flash crash in oil or other commodities - or even foreign exchange - is not to be scoffed at. In fact, many commodities markets are much less liquid and homogeneous than equities, and can be even more vulnerable to mistakes or manipulation.

There are a number of best practices that can be used to mitigate the risk of algos going wild:

  • Diligent backtesting – using historic data and realistic simulation to ensure many possible scenarios have been accounted for. A backtesting process needs to be streamlined, of course, as short time to market for new algos is key.
  • Real-time risk monitoring - building a real-time “risk firewall” into your algo environment. Just like a network firewall stops anomalous network packets reaching your computer, so a risk firewall should stop anomalous trades getting to trading venues.
  • Real-time market surveillance. Even if trades do not breach risk parameters, they may breach compliance rules, regulations or may be perceived by a regulator as market manipulation.
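To make the second of these concrete, the core of a pre-trade "risk firewall" can be sketched in a few lines of Python. The order fields, limit values and class names below are invented for illustration only – this is not Apama's actual API, just the shape of the idea: vet every order against limits before it may be forwarded to a venue.

```python
# Illustrative pre-trade "risk firewall" sketch. Limits and fields are
# hypothetical; a real system would check many more conditions.
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str        # "buy" or "sell"
    quantity: int
    price: float

class RiskFirewall:
    def __init__(self, max_order_qty=10_000, max_notional=1_000_000.0):
        self.max_order_qty = max_order_qty
        self.max_notional = max_notional
        self.open_notional = 0.0   # running exposure of accepted orders

    def check(self, order: Order) -> tuple:
        """Return (accepted, reason); only accepted orders reach the venue."""
        notional = order.quantity * order.price
        if order.quantity > self.max_order_qty:
            return False, "order size exceeds per-order limit"
        if self.open_notional + notional > self.max_notional:
            return False, "order would breach notional exposure limit"
        self.open_notional += notional   # accept and track exposure
        return True, "ok"

firewall = RiskFirewall()
firewall.check(Order("ABC", "buy", 500, 100.0))     # passes
firewall.check(Order("ABC", "buy", 50_000, 100.0))  # blocked: fat finger
```

The point is that each check is a simple comparison against streaming state, which is why such a firewall can sit in the order path with only microseconds of added latency.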

An algorithm is a tool in a trader's toolkit, not a naughty wild child. If the regulator parents are vigilant, and algos are subject to practical controls and monitored constantly for performance and for errors, market participants can sense and respond to market patterns before the aberrations or errors have a chance to move prices.


Thursday, January 20, 2011

Red Flags in Morning, Firms Take Warning

Posted by John Bates

A pattern is emerging within new financial services regulations whereby regulators and financial services firms deploy monitoring technology to "red flag" potential issues such as risk, position limits, errors and manipulation. The "red flags" raised would then alert the relevant personnel or authorities. See the full post here.

Monday, December 13, 2010

Calming 'Regulation Anxiety'

Posted by Dan Hubscher

There is a new kind of emotional disorder going around the financial markets - the previously unnamed fear of something ominous now that new financial rules have been laid down. Let's call it regulation anxiety.

Regulation anxiety has led to all sorts of new types of behavior in banks such as laying off proprietary trading staff, hiring ex-SEC lawyers, and laying on extra lobbyists to besiege Capitol Hill. The syndrome is so widespread that it has finally attacked the foreign exchange market - the market that performed the best during the financial crisis despite a lack of almost any regulation. And although the FX market 'ain't broke' it will undoubtedly get 'fixed' under new rules. It is these fixes that are causing panic attacks in the FX industry.

A survey of FX professionals at the Bloomberg FX10 conference in October showed marked anxiety over the impact of regulation and also possible changes to market structure.  More than 80 percent of those polled said they were concerned about the impact of recent regulations on their businesses.  They were also against structural reform and at odds as to which industry model is best for the future.  According to Bloomberg, the majority of the respondents were opposed to an exchange-traded model or a clearing house model, with only 19% believing the FX markets should have both clearing houses and exchange-traded requirements.

FX is a unique asset class in many respects, being (to date) almost totally free from regulation and benefiting from high liquidity on a global scale. Traders – wholesale, institutional and retail – are attracted by the ease and convenience of online currency buying and trading. The statistics bear this out, with an average turnover of around $1.5 trillion per day – a clear indication of the strength of the market.

FX liquidity and volatility are growing day by day, and trading foreign exchange in fast-moving, highly volatile times carries a high level of risk. As such, it may not be suitable for all types of investors, institutions and buy-side firms. Sell-side organizations serving the quickly growing needs of hedge funds, proprietary traders and other firms that take on these risks therefore take on additional risk of their own, and need to manage it intelligently without erasing their competitive advantages.

At the same time increased automated order volumes from the buy-side represent revenue opportunities for sell-side firms. But attracting that order flow away from competitors requires unique services, aggressive pricing and the ability to find the best prices in a highly fragmented market - not to mention the speed and scale needed to keep up in a high-risk environment.

There are solutions available which enable sell-side institutions worldwide to rebuild their FX eCommerce platforms in line with the requirements of the most challenging customers and prospects, with a view to automating and customizing their trading operations to become more competitive. There are now technologies that combine FX trading venue connectivity with a real-time bird’s-eye view of the market, aggregating fragmented liquidity and including smart order routing algorithms, enabling every element of an eCommerce platform to automatically find and leverage the best prices.
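The aggregation idea itself is simple to illustrate. In the Python sketch below, the venue names and quote format are invented for the example (they are not any vendor's actual feed): collect the current quotes from each venue and compute the best bid and best offer across all of them.

```python
# Illustrative FX liquidity aggregation: find the best bid and offer
# across several venues. Venue names and prices are made up.
quotes = {
    "VenueA": {"bid": 1.3052, "ask": 1.3055},
    "VenueB": {"bid": 1.3053, "ask": 1.3056},
    "VenueC": {"bid": 1.3051, "ask": 1.3054},
}

def best_bid_offer(quotes):
    """Return ((venue, best bid), (venue, best ask)) across all venues."""
    best_bid_venue = max(quotes, key=lambda v: quotes[v]["bid"])
    best_ask_venue = min(quotes, key=lambda v: quotes[v]["ask"])
    return ((best_bid_venue, quotes[best_bid_venue]["bid"]),
            (best_ask_venue, quotes[best_ask_venue]["ask"]))

(bid_venue, bid), (ask_venue, ask) = best_bid_offer(quotes)
# Here the best bid is on VenueB and the best ask on VenueC, giving a
# tighter aggregated spread than any single venue offers on its own.
```

A smart order router then sends each slice of an order to whichever venue is showing the best price at that instant; the hard engineering is doing this continuously as thousands of quote updates stream in per second.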

And, a very few include a rich development framework for both business users and IT. The flexibility for the business user allows traders to create and rapidly deploy proprietary FX and cross-asset trading strategies that help them competitively engage with clients.

There have been numerous recent examples of banks looking to take advantage of these solutions. For example, Royal Bank of Canada (RBC) recently deployed a new FX aggregation solution to support its foreign exchange dealing operations. The Progress Apama FX Aggregation Solution Accelerator is completely customizable and has been modified for RBC to meet its specific requirements. RBC's new system has significantly increased the efficiency with which its traders obtain the best FX prices for their clients.

RBC is the latest in a growing list of global and regional banks that have deployed this type of platform as a foundation for eCommerce. Other organizations that have recently deployed FX solutions driven by technologies from Progress Software (namely its Apama product) include BBVA, UniCredit and ANZ, which can now access multiple sources of liquidity and dramatically improve their ability to handle increased trade volume.

The best way to deal with anxiety is to address the root cause. In this case, regulation. Regulation is coming, change is coming. Since the FX world is now facing looming regulations with dramatic impact, you’re going to need to adapt your business models and supporting applications quickly in order to survive – for instance by building flexible rules within your FX trading systems to identify and manage risks, whatever they may turn out to be.  If you do, you’ll be ahead of the pack and will be able to create competitive advantage.


Thursday, July 22, 2010

Beware the weight-challenged digits

Posted by John Bates

Fat fingers (or weight-challenged digits, to my politically correct friends) have had a good run lately. First we heard that Deutsche Bank had to close its quantitative trading desk in Japan after an automated trading system misread equities market data and generated a massive sell order that caused the Nikkei 225 Stock Average to dive. Then an unknown trader spiked the Swedish krona, and a computer glitch at Rabobank smacked sterling by about 1%, according to the Wall Street Journal.

Although the press was surprised that the efficient foreign exchange market was susceptible to trading errors, it is just as vulnerable as equities or futures. In FX, trades are often made directly with a trading destination such as EBS, Reuters or Currenex, and in many institutions without adequate pre-trade checking or risk management applied.

As my colleague, Deputy CTO - Dr. Giles Nelson, told the Wall Street Journal: “The consensus in the market is that this was a computer-based trading error, but ultimately there would have been a human involved somewhere.”

Human error is part of being human. The reality of highly automated trading is that the programs are built by humans and run by super-fast machines. And unless there are robust computerized checking mechanisms that vet trades before they hit the markets, errors can wreak havoc in the blink of an eye.

Deutsche Bank's algorithms generated around 180 automated sell orders worth up to 16 trillion yen ($183 billion) and about 50 billion yen's worth were executed before the problem was addressed. The Rabobank mistake could have dumped £3 billion worth of sterling into the market in one lump, rather than splitting it up to lower market impact - but luckily the bank spotted the error and stopped the trade before it was fully completed. The Swedish krona mistake sank the krona against the euro by 14% before it was spotted. 

Pre-trade risk checks would help to prevent errors, trade limit breaches, or even fraudulent trading from occurring. And pre-trade risk controls need not be disruptive. Ultra-low latency pre-trade risk management can be achieved by trading institutions without compromising speed of access. One option is a low latency "risk firewall" with complex event processing at its core, which can be benchmarked in microseconds.

With a real-time risk solution in place, a message can enter through an order management system, be run through the risk hurdles and checks, and leave for the destination a few microseconds later. The benefits of being able to pro-actively monitor trades before they hit an exchange or ECN or FX platform far outweigh any microscopic latency hops. They include catching fat fingered errors, preventing trading limits from being breached, and even warning brokers and regulators of potential fraud - all of which cost brokers, traders and regulators money. 
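One concrete fat-finger check of the kind such a firewall might run is to flag any order wildly out of line with recent activity in that instrument. The Python sketch below uses an arbitrary 10x-the-recent-average threshold purely for illustration; real systems tune such thresholds per instrument and combine many checks.

```python
# Illustrative fat-finger check: block an order whose size dwarfs the
# recent average order size. The window and multiple are arbitrary.
from collections import deque

class FatFingerCheck:
    def __init__(self, window=20, multiple=10.0):
        self.recent = deque(maxlen=window)  # recent accepted order sizes
        self.multiple = multiple

    def allow(self, qty: float) -> bool:
        """Return True if the order passes; record accepted sizes."""
        if self.recent:
            avg = sum(self.recent) / len(self.recent)
            if qty > self.multiple * avg:
                return False        # block: order dwarfs recent activity
        self.recent.append(qty)
        return True

check = FatFingerCheck()
for _ in range(5):
    check.allow(100)     # normal flow of 100-lot orders
check.allow(5_000)       # blocked: 50x the recent average
```

A check like this would have questioned a multi-trillion-yen sell order long before 50 billion yen's worth of it executed.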

India – big potential for algorithmic trading

Posted by Giles Nelson

I spent last week in India, a country that, by any standards, is growing fast.  Its population has doubled in the last 40 years to 1.2B and economic growth has averaged more than 7% per year since 1997.  It’s projected to grow at more than 8% in 2010. By some measures, India has the 4th biggest economy in the world. 

Progress has a significant presence in India. In fact, people-wise, it’s the biggest territory for Progress outside the US with over 350 people. Hyderabad is home to a big development centre and Mumbai (Bombay) has sales, marketing and a professional services team.

The primary purpose of my visit was to support an event Progress organised in Mumbai on Thursday of last week on the subject of algorithmic trading. It was also our first real launch of Progress and Apama, our Complex Event Processing (CEP) platform, into the Indian capital markets. We had a great turnout, with over 100 people turning up. I spoke about what we did in capital markets and then participated in a panel session where I was joined by the CTO of the National Stock Exchange, the biggest in India, a senior director of SEBI, the regulator, and representatives from Nomura and Citigroup. A lively debate ensued.

The use of algorithmic trading is still fairly nascent in India, but I believe it has a big future. I’ll explain why soon, but I’d like first to give some background on the Indian electronic trading market, particularly the equities market, which is the largest.

The market
India has several competing markets for equities, futures and options, commodities and foreign exchange. In equities, the biggest turnover markets are run by the National Stock Exchange (NSE) and the Bombay Stock Exchange (BSE), with market shares (by number of trades) of 74% and 26% respectively. Two more equity exchanges are planning to go live soon – the Delhi Stock Exchange is planning to relaunch and MCX is currently awaiting a licence to launch. This multi-market model, only recently adopted in Europe for example, has been in place in India for many years.

It was only two years ago that direct market access (DMA) to exchanges was allowed. Although official figures don’t exist, the consensus opinion is that about 5% of volume in equities is traded algorithmically, and between 15% and 25% in futures and options. Regulation in India is strong – no exchange allows naked access, and the BSE described to me some of the strongest pre-trade risk controls I’ve come across: collateral checks on every order before it is matched. The NSE has throttling controls which impose a limit on the number of orders a member organisation can submit per second; members can be suspended from trading intra-day if this is exceeded. The NSE also forces organisations that want to use algorithms to go through an approval process. I’ll say more about this later. Controversially, the NSE will not allow multi-exchange algorithmic strategies, so cross-exchange arbitrage and smart order routing cannot take place. Lastly, a securities transaction tax (STT) is levied on all securities sales.
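A per-second order throttle of the kind the NSE applies can be sketched with a sliding one-second window. The five-orders-per-second limit below is invented for the example (the NSE's actual limits vary by membership terms); the mechanism is what matters: reject any order that would exceed the member's allowance for the current second.

```python
# Illustrative exchange-side order throttle: at most N orders per
# rolling second per member. N=5 is arbitrary for the sketch.
from collections import deque

class OrderThrottle:
    def __init__(self, max_per_second=5):
        self.max_per_second = max_per_second
        self.timestamps = deque()   # times of recently accepted orders

    def allow(self, now: float) -> bool:
        """Return True if an order arriving at time `now` may be accepted."""
        # drop accepted-order timestamps older than one second
        while self.timestamps and now - self.timestamps[0] >= 1.0:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.max_per_second:
            return False            # over the per-second allowance
        self.timestamps.append(now)
        return True
```

An exchange would track one such window per member, and suspend members that repeatedly hit the limit, as described above.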

So, with the above restrictions, why do I think that the Indian market for algorithmic trading has massive potential?

The potential
The Indian market is very big – surprisingly so to many people. Taking figures from the World Federation of Exchanges (thus not counting trading on alternative equity venues such as European multi-lateral trading facilities), the Indian market in dollar value may still be relatively modest – it’s the 10th largest. However, when you look at the number of trades, India is the 3rd largest market, beaten only by the US and China. The NSE, for example, processes 10 times the number of trades of the London Stock Exchange. So why isn’t more traded in dollar terms? Because trade sizes on Indian exchanges are very small. The median figure worldwide is about $10K per trade; the figure in India is about $500 per trade, a 20th of the size. Surely, then, taming the complexity of this many trades, and the orders that go with them, is an ideal task for algorithmic trading? To compare with another emerging “BRIC” economy, Brazil – where the number of firms using Apama has gone from zero to over 20 in as many months – the dollar market size is fairly similar, but the number of equity trades in India is 33 times greater. The potential in India is therefore enormous.

India is already there in other ways. All exchanges offer co-location facilities for their members, and debate has already moved on to the question, familiar in more developed markets, of whether this gives certain firms an unfair advantage and whether co-location provision should be regulated.


The challenges
There are some difficulties. The STT is seen by some as an inhibitor. However, its effect is offset somewhat by the fact that securities traded on exchange are not subject to capital gains tax. 

The NSE process for approving algorithms is more controversial. Firms that want to trade algorithmically must show the NSE that certain risk safeguards are in place and “demonstrate” the algorithm to the exchange. As the biggest exchange, the NSE wields considerable power, and thus its decision to vet algorithms puts a brake on market development. I believe this process is unsustainable for the following reasons:

  1. As the market develops there will simply be too many algorithms for the NSE to deal with in any reasonable timeframe. Yes, India is a low-cost economy, but you need highly trained people to analyse algorithmic trading systems; you can’t simply throw more people at this. Firms will want to change the way their algorithms work on a regular basis. With this process in place, they can’t.
  2. It raises intellectual property issues. Brokers will increasingly object to revealing parts of their algorithms and their clients, who may want to run their alpha seeking algorithms on a broker-supplied co-location facility, will most definitely object. 
  3. It puts the NSE in an invidious position. Eventually an algo will “pass” the process and then go wrong, perhaps adversely affecting the whole market. The NSE will have to take some of the blame.
  4. Competition will force the NSE’s hand. The BSE is trying aggressively to take back market share, and other exchanges are launching which will not have these restrictions.

It strikes me that the NSE should direct its efforts into ensuring that it protects itself better. Perhaps a reasonable comparison is a website protecting itself from hacking and denial-of-service attacks. If websites can do it, so can an exchange. And it would offer much better protection for the exchange and the market in general.

In conclusion
I’m convinced of the growth potential in India for algo trading. The market is large, the user base is still relatively small and many of the regulatory and technical prerequisites are in place. There are some inhibitors, outlined above, but I don’t think they’ll hold the market back significantly. And finally, why should India not adopt algo trading when so many other, and diverse, markets have?

Progress has its first customers already in India. I look forward to many more. 

Wednesday, June 30, 2010

What do you do with the drunken trader?

Posted by John Bates

The news that Steven Perkins, (former) oil futures broker in the London office of PVM Oil Futures, has been fined £72,000 ($108,400) by the FSA and banned from working in the industry is no surprise.


It could have been worse, given that the broker, after a few days of heavy drinking, took on a 7.0 million barrel long position in crude oil in the middle of the night. The fine seems minuscule since the incident cost PVM somewhere in the vicinity of $10 million after unwinding the $500+ million position.


The surprising thing about this incident is that it happened at all. Perkins was a broker, not a trader. He acted on behalf of traders, placing orders on the Intercontinental Exchange among other places. That he could go into the trading system and sneak through 7.0 million barrels without a customer on the other side is unbelievable.


Heavy drinking is practically a job requirement in the oil industry, my sources tell me, so this kind of thing could be a real issue going forward. As algorithmic trading takes hold in the energy markets, trading may approach the ultra high speeds seen in equities markets.  This is a recipe for super high speed disaster, unless there are proper controls in place - especially if there were a way for the broker or trader in question to enrich himself in the process.


One powerful way to prevent this kind of accident or fraud is through the use of stringent pre-trade risk controls. The benefits of being able to pro-actively monitor trades include catching "fat fingered" errors, preventing trading limits from being breached, and even warning brokers and regulators of potential fraud - all of which cost brokers, traders and regulators money. PVM is a good example of this.


Ultra-low-latency pre-trade risk management can be achieved by brokers without compromising speed of access. One solution is a low latency "risk firewall" with complex event processing at its core, which can be benchmarked in the low microseconds. Errors can be caught in real time, before they reach the exchange – heaving that drunken trader right overboard, and his trades into the bin.


Monday, June 14, 2010

Rogue Trading Below the Radar

Posted by John Bates

Jerome Kerviel, the trader who allegedly lost Societe Generale nearly 5 billion euros, went on trial in Paris on Tuesday, June 8th. The bank alleges that Kerviel took "massive fraudulent directional positions" in 2007 and 2008, far beyond his trading limits.

It is interesting to note that Kerviel was not only experienced on the trading floor, but he also had a background in middle office risk management technology. It may have been this knowledge that enabled him to manipulate the bank's risk controls and thus escape notice for so long.

Still, it is perplexing that fraud on such a scale can go on without detection for so long, even if Kerviel did have an insider's knowledge of the firm's risk management systems. Internal risk controls are not something that a financial firm can take for granted, left to run unchecked or unchanged for months or years.

The detection of criminal fraud or market abuse is something that must happen in real time, before any suspicious behaviour has a chance to lose a firm money or to move the market. Pre-trade risk management is paramount, with trading limits specified and checked in real time. Internal controls should themselves be monitored for possible manipulation, again in real time. The good news is that technology does exist, in the form of real-time surveillance software that can analyse data transactions by the millisecond.
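The real-time limit check described above can be sketched very simply: as trades stream in, positions are updated and an alert is raised the moment a limit is breached, on the breaching trade itself rather than months later. The account names, fields and limit values in this Python sketch are invented for illustration.

```python
# Illustrative real-time limit monitor: alert the instant an account's
# running position exceeds its stated limit. All data is made up.
def check_limits(trades, limits):
    """trades: list of (account, signed_quantity) in arrival order.
    limits: dict mapping account -> maximum absolute position.
    Returns a list of (account, position) alerts, one per breach."""
    positions, alerts = {}, []
    for account, signed_qty in trades:
        positions[account] = positions.get(account, 0) + signed_qty
        if abs(positions[account]) > limits.get(account, float("inf")):
            alerts.append((account, positions[account]))
    return alerts

# A trader with a limit of 100 builds a position of 110 over two trades;
# the alert fires on the second trade, not at the end of the quarter.
check_limits([("trader_k", 60), ("trader_k", 50)], {"trader_k": 100})
```

The hard part in practice is not this comparison but capturing every trade, across every system a trader can touch, into one stream fast enough that the comparison can run before the damage compounds.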

Financial institutions need to start looking inward to improve standards, regardless of current regulation. Otherwise the culture of greed and financial gain at all costs will encourage more and more Kerviels.

Thursday, June 03, 2010

FSA Loses Insider Trading Case - but more to come...

Posted by John Bates

Today’s acquittals in the London insider trading cases are a big blow to the FSA, but its ability to detect and prosecute these market abusers cannot be overlooked. Without the technology to detect trading anomalies, alleged white collar criminals cannot be prosecuted in the first place.

It’s also clear that the FSA is sending a message to the investment community: shape up or be prepared to pay. The £33.32 million ($48.8 million) fine for JPMorgan is the largest in the FSA’s history. 

As the SEC and CFTC in the US look to adopt market surveillance technology, it will be interesting to see the potential rise in insider trading court cases, and in the size of fines, in the US.

I think we're going to see a lot more of this type of prosecution around the world. The FSA is currently prosecuting 11 people for alleged market abuse.

As you may have read this week, the FSA is using new technology to crack down on potential market abusers. In the UK, the FSA receives 6m-8m transaction reports daily. The FSA will soon even have a system in place that will automatically alert staff to potential abuse in “real time”. Alexander Justham, the FSA’s director of markets, says the use of such “complex event processing” technology will give the FSA “a more proactive, machine-on-machine approach” to surveillance.

Optimism in the world of financial services regulation

Posted by Giles Nelson

It seems that we’re finally making some progress on making the financial markets function more safely. 

After the “flash-crash” of 6 May, US equity market operators have agreed to bring in coordinated circuit-breakers to avoid a repeat of this extreme event. There is widespread agreement on this. Industry leaders from brokers and exchanges yesterday made supportive statements as part of submissions to the SEC.
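The single-stock circuit breakers agreed after the flash crash work on a simple rule – under the widely reported pilot, a move of more than 10% within a five-minute window pauses trading in that stock. The Python sketch below implements that rule against a stream of prices; the implementation itself is illustrative, not any exchange's actual logic.

```python
# Illustrative single-stock circuit breaker: trip if the price moves
# more than 10% within a rolling 5-minute window of observed prices.
from collections import deque

class CircuitBreaker:
    def __init__(self, move_pct=0.10, window_secs=300.0):
        self.move_pct = move_pct
        self.window_secs = window_secs
        self.prices = deque()   # (timestamp, price) within the window

    def on_price(self, ts: float, price: float) -> bool:
        """Record a price tick; return True if the breaker trips."""
        while self.prices and ts - self.prices[0][0] > self.window_secs:
            self.prices.popleft()
        self.prices.append((ts, price))
        reference = self.prices[0][1]   # oldest price still in window
        return abs(price - reference) / reference > self.move_pct

breaker = CircuitBreaker()
breaker.on_price(0.0, 100.0)   # baseline tick, no trip
breaker.on_price(60.0, 89.0)   # an 11% drop inside the window: trips
```

Coordination is the real point of the SEC agreement: every venue trading the stock must apply the same rule at the same moment, otherwise the sell pressure simply migrates to whichever venue is still open.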

Regulators are going public with their use of real-time monitoring technology. Alexander Justham, director of markets at the Financial Services Authority, the UK regulator, told the Financial Times that the use of complex event processing technology will give the FSA “a more proactive machine-on-machine approach” to market surveillance (the FSA is a Progress customer). Other regulators are at least admitting they have a lot of work to do. Mary Schapiro, the SEC chair, believes that the technology used for monitoring markets is “as much as two decades behind the technology currently used by those we regulate”. Scott O’Malia, a commissioner at the Commodity Futures Trading Commission, admitted that the CFTC continues to receive account data by fax, which then has to be manually entered.

The use of real-time pre-trade risk technology is likely to become much more widespread. “Naked” access, where customers of brokers submit orders directly to the market without any pre-trade checks, is likely to be banned. This is an important change, as late last year Aite Group, an analyst firm, estimated that naked access accounted for 38% of the average daily volume in US stocks. The SEC is also proposing that regulation of sponsored access is shored up – currently it has evidence that brokers rely upon oral assurances that the customer itself has pre-trade risk technology deployed. The mandated use of pre-trade risk technology will level the playing field and prevent a race to the bottom. Personally, I’ve heard of several instances of buy-side customers insisting that brokers turn pre-trade risk controls off, as they perceive that such controls add latency and will therefore adversely affect the success of their trading.

The idea of real-time market surveillance, particularly in complex, fragmented markets such as those in the US and Europe, is gaining credence. The SEC has proposed bringing in a “consolidated audit trail” which would enable all orders in US equity markets to be tracked in real time. As John Bates said in his previous blog post, it’s likely that the US taxpayer will not be happy paying the $4B that the publicly funded SEC estimates such a system would need to get up and running. Perhaps the US could look at the way the UK’s FSA is funded: the FSA reports to government but is paid for by the firms it regulates.

As I mentioned in my last blog post, our polling in April at Tradetech, a European equities trading event, suggests that market participants are ready for better market monitoring. 75% of respondents to our survey believed that creating more transparency with real-time market monitoring was preferable to the introduction of restrictive new rules.

CESR, the Committee of European Securities Regulators, is currently consulting on issues such as algorithmic trading and high frequency trading. It will be interesting to see the results of their deliberations in the coming months.

I’m so pleased the argument has moved on. This time last year saw a protracted period of vilifying “high frequency trading” and “algo trading”. Now there is recognition of the benefits, as well as the challenges, that high frequency trading has brought to equity markets. Regulators seem to understand that, to prevent both disastrous errors and deliberate market manipulation, it is better to get on board with new technology than to try to turn the clock back to mediaeval times.

New approaches are sorely needed. Yesterday saw the conclusion of another investigation into market manipulation when the FSA handed out a $150,000 fine and a five-year ban to a commodity futures broker.