The Event Processing Market

Thursday, July 19, 2012

Scooping FX Bubbles Out of a Boiling Pot

Posted by Ben Ernest-Jones

Foreign exchange trading appears to be moving from a beneath-the-radar, bank-dominated activity into the international trading limelight. There has been an explosion of new trading platforms and a wave of newer participants lately, partly thanks to new transparency afforded by Dodd-Frank and partly because of the relentless hunt for alpha. Increasingly automated, FX is also becoming the next go-to asset class for high frequency and algorithmic trading.

It wasn't always that way. As TABB Group's Larry Tabb said in an article in Wall Street & Technology: "FX has always been different. Be it that currency is a bank’s core product, be it that banks control the payments infrastructure, or be it that banks are critical in implementing Central Banks’ monetary policy, the banks have historically dominated FX."

Because FX is mainly traded via single-dealer platforms, multi-dealer platforms, and interdealer marketplaces, it is fragmented in a different way from equities. Traditional trading platforms, along with a couple of the sturdier newcomers like the multi-dealer platforms FXall (which Thomson Reuters is buying) and Currenex, have been the dominant destinations for electronic trading of FX.

But now that the SEC and CFTC have clarified that forex contracts will be treated as swaps, they will become part of the centrally cleared instrument pool. This means a whole new layer of banks, brokers, and venues is already popping up. FX will soon emulate the expansion, consolidation, and then contraction of destinations experienced by the equities markets.

There will be more opportunities for market participants to trade, hedge, arbitrage and manage risk. Algorithmic strategies will dominate, attracting more and more destination venues - and then fragmentation will be the mantra. So how are traders going to position themselves to scoop the profitable FX bubbles out? It is not as easy as you would think. In many cases, what appears to be an increase in liquidity is actually an increase in “phantom” orders, as institutions advertise the same underlying liquidity across an increasing number of locations. Trading algorithms will need to be smarter, and tuned over time to counteract this as the landscape changes.
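To make the phantom-liquidity problem concrete, here is a minimal sketch (in Python, with a made-up quote format - no particular vendor's feed is implied) of how an aggregation engine might discount quotes that look like the same underlying liquidity advertised on several venues:

```python
from collections import defaultdict

def estimate_real_depth(quotes):
    """Estimate genuine depth per price level, counting liquidity that the
    same dealer advertises identically on several venues only once.

    `quotes` is a list of (venue, dealer, price, size) tuples -- a
    hypothetical feed format, not any particular vendor's API.
    """
    seen = defaultdict(set)    # (dealer, price, size) -> venues quoting it
    depth = defaultdict(float)
    for venue, dealer, price, size in quotes:
        key = (dealer, price, size)
        if not seen[key]:
            depth[price] += size   # count the first sighting only
        seen[key].add(venue)
    return dict(depth)

quotes = [
    ("VenueA", "BankX", 1.22501, 5_000_000),
    ("VenueB", "BankX", 1.22501, 5_000_000),  # same liquidity re-advertised
    ("VenueA", "BankY", 1.22499, 3_000_000),
]
print(estimate_real_depth(quotes))
# {1.22501: 5000000.0, 1.22499: 3000000.0}
```

A real engine would of course look at fill probabilities and dealer behaviour over time rather than exact matching - which is exactly why such algorithms need continual tuning as the landscape changes.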

Bank traders are looking for ways to handle the new world order of FX. Because their clients want to be able to trade forwards, swaps, spot and even options on the same system, banks are having to do the once-unthinkable: merge their forwards desks with their spot desks.

In the old days of voice trading, forwards and spot traders ran completely separate books and dealt with (mostly) different customers. Today clients are asking to hedge forwards and spot on the same system at the same time. Some want to trade using forward-to-spot conversions against aggregated spot prices from several platforms and some want to use aggregated forward rates directly. Some want a blending of both. The opportunities for banks are plentiful, if they can harmonize FX products, trading and hedging across trading systems successfully.
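As a rough illustration of the forward-to-spot conversion mentioned above, the sketch below builds an outright forward two-way price from the best aggregated spot quote plus a dealer's forward points, following the standard convention that outright forward = spot + forward points. The quote values, pip convention and function name are hypothetical:

```python
def outright_forward(spot_quotes, forward_points_pips, pip=0.0001):
    """Build a hypothetical outright forward two-way price from aggregated
    spot quotes plus a dealer's forward points (quoted in pips)."""
    best_bid = max(bid for bid, ask in spot_quotes)  # best aggregated bid
    best_ask = min(ask for bid, ask in spot_quotes)  # best aggregated ask
    bid_pts, ask_pts = forward_points_pips
    return best_bid + bid_pts * pip, best_ask + ask_pts * pip

# EUR/USD spot from three venues, and illustrative 3-month forward points
spot = [(1.22498, 1.22503), (1.22499, 1.22505), (1.22497, 1.22504)]
print(outright_forward(spot, (12.1, 12.6)))
# approximately (1.22620, 1.22629)
```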

Many bank clients have seen what has happened in the equities markets, with high frequency trading and algorithmic strategies becoming problematic and largely vilified. When the world's largest interdealer broker said recently that it would tackle “disruptive” practices by high-frequency traders on its foreign exchange platform, it became clear that some of the lesser-loved equities issues were already creeping into FX markets.

The Wall Street Journal says that there are already fears of an FX "boom" reminiscent of the equities venue explosion in 2001. "Some market insiders fear the trend for highly specialized new systems aimed at separate pockets of clients could end up splitting the liquidity that underpins this $4 trillion-a-day market, making it harder to trade," said the paper. Pigeon-holing traders, whether it be by class of trader, asset class or by delivery date, only creates more fragmentation. This could equate to lower volumes (like equities), more volatility (like equities) and an increase in manipulative practices (like equities). Regulators will no doubt be watching, and new rules will be implemented even faster than has happened in equities.

In a discussion at the FX Week USA event in NYC recently, one FX trading platform provider said that you need a full market ecology to provide proper efficiency. This means having all market participants operate in the same liquidity pool. In the end, the unique self-regulating properties of the FX markets mean that the market will shift towards what is best for it - because it can. Preparation for this inevitability will determine who survives.

Wednesday, June 22, 2011

A foray into Beijing

Posted by Giles Nelson

Beijing was the last stop on my three city Asian tour and, from a personal perspective, the most exciting one as I’d never visited mainland China before.

China’s seemingly inexorable economic rise has been well documented. In the last 20 years, China’s GDP growth has averaged over 9%. As I travelled from the airport into Beijing’s central business district I saw few older buildings. Virtually everything, including the roads, looked as if it had been built in the last 10 years.

The Chinese stock market is big. In terms of the total number of dollars traded, the combined size of the two stock exchanges, Shanghai and Shenzhen, is approximately double that traded in the next biggest Asian market, Japan. The increase in stock trading has been very rapid. Trading volumes on Shanghai and Shenzhen have risen approximately 20-fold in the past 5 years, although there has been significant volatility in this rise.

The domestic algorithmic trading market is nascent. Currently, intra-day trading in company shares is not allowed. It is therefore in the recently established futures markets that algorithmic and high-frequency trading are taking place. No figures exist on the proportion of trading done algorithmically in China currently, but I'm going to estimate it at 5%.

I was in Beijing to participate in the first capital markets event Progress has held there. Although Shanghai is the finance capital of China, we chose to hold the event in Beijing to follow up on previous work we'd done there. In the end, about 60 people from domestic sell-side and buy-side firms attended, which was a great result considering the relatively low profile Progress has at present in this market. There was optimism and an expectation that algorithmic trading had a bright future in China.

I believe it's a practical certainty that the Chinese market will adopt algorithmic and high frequency trading. In every developed market a high, or very high, proportion of trading is done algorithmically and, although different regulations and dynamics make each market unique, nothing except an outright ban will prevent widespread adoption in every market in time. Liberalisation in China is occurring. For example, stock index futures are now traded, exchanges are supporting FIX, short-selling has been trialled and it is now easier for Chinese investors to access foreign markets. Also, earlier this year, the Brazilian exchange, BM&FBovespa, and the Shanghai exchange signed an agreement which may result in company cross listings. Only some of these changes support electronic trading growth directly but all are evidence that the liberalisation necessary to support such growth is happening. Inhibitors remain: no intra-day stock trading, restrictions on foreign firms trading on Chinese markets thus preventing competition and knowledge transfer from developed markets, and tight controls on trading in Renminbi. The Chinese regulators will continue to move cautiously. 

The question is not if, but when. We expect to sign our first Chinese customers soon. China is becoming a very important blip on the radar. 

 

Friday, June 17, 2011

Still plenty of room for growth in Japan

Posted by Giles Nelson

And from Mumbai on to Tokyo. In so many ways, a bigger contrast between cities is difficult to imagine. 

Japan has, of course, had a tough time of it recently. Not only have the recent earthquake and tsunami knocked Japan back, but the country has also had a long period of relative economic stagnation compared to other Asian economies. Japan also stands apart from other developed economies in the relatively low proportion of its trading that is done algorithmically.

There's little consensus on what this proportion is, however. In a recent report, Celent, the financial market analyst firm, estimated that around 25% of trading was algorithmic in 2010. Figures from the Tokyo Stock Exchange (TSE) show that around 35% of orders submitted to the exchange come from co-location facilities, and it is reasonable to assume that nearly all of these could be regarded as “algorithmic”. From conversations during my days in Tokyo with people at exchanges, sell-side firms and our customers and partners, I'm going to put the figure at between 40% and 50%.

That means there’s a lot of room for growth when you consider that the proportion of securities traded in the US and Europe algorithmically, in one way or another, nears 100%. One inhibitor to growth has now been removed. In 2010, the TSE launched a new exchange platform, Arrowhead, which reduced latency from up to two seconds down to around 5 milliseconds. In other words, the TSE is now “fit for purpose” for algorithmic trading and, in particular, for high frequency trading. Previously, with latencies being so long, high frequency firms who, for example, wanted to market-make on the TSE, simply weren’t prepared to take the risk of the exposure and uncertainty that such high latencies bring. Since Arrowhead's launch in January 2010, and according to the TSE’s own figures, the total number of orders on the TSE has risen by a modest 25%, but the proportion of orders submitted from co-location facilities has more than doubled. 

Progress exhibited and spoke at Tradetech Japan, and we were joined on our stand by our partner, Tosho Computer Systems, with whom we're working on a service in Japan that we will be launching later this year. There'll be more news on that nearer the time. Attendance-wise, Tradetech was down on its peak in Japan in 2008, but up on previous years - a reflection of the renewed interest in trading technology generally.

Market surveillance was one of the key topics that came up in Tradetech Q&A and panel discussions. This is common across pretty much any market now, with the essential question being: how can markets be kept safe as they get faster and more complex? Some say there should be restrictions, and whilst circuit breakers, insistence on pre-trade risk checks and the like are important, over-emphasis on “the dangers” can hold markets back. Progress’ view is that regulators and exchanges should all move towards real-time market surveillance. (Find more on this here and here.)

There's a lot of emphasis at the moment in European and US markets on OTC derivatives regulation and on moving trading in such instruments onto exchanges. Japan is relatively advanced in this regard, with regulation requiring many domestic OTC derivatives to be cleared - a trend which is also happening elsewhere in Asia more quickly than in Europe and the US.

Regionally, Japan is the biggest developed market in terms of share trading volume - twice as big in dollar terms as the next biggest, Hong Kong. But Japan is itself now dwarfed by China, and I'll be writing about that next.

 

Monday, December 13, 2010

Calming 'Regulation Anxiety'

Posted by Dan Hubscher

There is a new kind of emotional disorder going around the financial markets - the previously unnamed fear of something ominous now that new financial rules have been laid down. Let's call it regulation anxiety.

Regulation anxiety has led to all sorts of new behavior in banks, such as laying off proprietary trading staff, hiring ex-SEC lawyers, and laying on extra lobbyists to besiege Capitol Hill. The syndrome is so widespread that it has finally attacked the foreign exchange market - the market that performed best during the financial crisis despite having almost no regulation. And although the FX market 'ain't broke', it will undoubtedly get 'fixed' under new rules. It is these fixes that are causing panic attacks in the FX industry.

A survey of FX professionals at the Bloomberg FX10 conference in October showed marked anxiety over the impact of regulation and also possible changes to market structure.  More than 80 percent of those polled said they were concerned about the impact of recent regulations on their businesses.  They were also against structural reform and at odds as to which industry model is best for the future.  According to Bloomberg, the majority of the respondents were opposed to an exchange-traded model or a clearing house model, with only 19% believing the FX markets should have both clearing houses and exchange-traded requirements.

FX is a unique asset class in many respects; being (to date) almost totally free from regulation and benefiting from high liquidity on a global scale. Traders - wholesale, institutional and retail - are attracted by the ease and convenience of online currency buying and trading. The statistics bear this out with an average turnover of around $1.5 trillion per day – a clear indication of the strength of the market.

FX liquidity and volatility are growing day by day, and trading foreign exchange in fast-moving, highly volatile times carries a high level of risk. As such it may not be suitable for all types of investors, institutions and buy-side firms. As a result, sell-side organizations serving the quickly growing needs of hedge funds, proprietary traders, and other firms that take on these risks assume additional risk of their own - risk they need to manage intelligently without erasing their competitive advantages.

At the same time increased automated order volumes from the buy-side represent revenue opportunities for sell-side firms. But attracting that order flow away from competitors requires unique services, aggressive pricing and the ability to find the best prices in a highly fragmented market - not to mention the speed and scale needed to keep up in a high-risk environment.

There are solutions available which enable sell-side institutions worldwide to rebuild their FX eCommerce platforms in line with the requirements of the most challenging customers and prospects, with a view to automating and customizing their trading operations to become more competitive. There are now technologies that combine FX trading venue connectivity with a bird's eye view of the market in real time, aggregating fragmented liquidity and including smart order routing algorithms, enabling every element of an eCommerce platform to automatically find and leverage the best prices.
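A minimal sketch of the kind of best-price logic such aggregation implies: sweep a consolidated view of several venues' books in price order until an order is filled. The venue names, book format and greedy routing policy here are illustrative assumptions, not a description of any product:

```python
def route_order(side, quantity, books):
    """Greedy smart-order-routing sketch: sweep the best prices across a
    set of venue order books until the quantity is filled. `books` maps
    venue -> list of (price, size) levels -- a simplified, hypothetical
    model of an aggregated FX book, not a production router.
    """
    levels = [(price, size, venue)
              for venue, book in books.items() for price, size in book]
    # Buy orders lift the cheapest offers first; sells hit the highest bids.
    levels.sort(key=lambda l: l[0], reverse=(side == "sell"))
    fills, remaining = [], quantity
    for price, size, venue in levels:
        if remaining <= 0:
            break
        take = min(size, remaining)
        fills.append((venue, price, take))
        remaining -= take
    return fills

offers = {"VenueA": [(1.22503, 2_000_000), (1.22505, 5_000_000)],
          "VenueB": [(1.22504, 3_000_000)]}
print(route_order("buy", 4_000_000, offers))
# [('VenueA', 1.22503, 2000000), ('VenueB', 1.22504, 2000000)]
```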

And a very few include a rich development framework for both business users and IT. This flexibility allows traders to create and rapidly deploy proprietary FX and cross-asset trading strategies that help them competitively engage with clients.

There have been numerous recent examples of banks looking to take advantage of these solutions. For example, Royal Bank of Canada (RBC) recently deployed a new FX aggregation solution to support its foreign exchange dealing operations. The Progress Apama FX Aggregation Solution Accelerator is completely customizable and has been modified for RBC to meet its specific requirements. RBC's new system has significantly increased the efficiency with which its traders obtain the best FX prices for their clients.

RBC is the latest in a growing list of global and regional banks that have deployed this type of platform as a foundation for eCommerce. Other organizations that have recently deployed FX solutions driven by technologies from Progress Software (namely its Apama product) include BBVA, UniCredit and ANZ, which can now access multiple sources of liquidity and dramatically improve their ability to handle increased trade volume.

The best way to deal with anxiety is to address the root cause. In this case, regulation. Regulation is coming, change is coming. Since the FX world is now facing looming regulations with dramatic impact, you’re going to need to adapt your business models and supporting applications quickly in order to survive – for instance by building flexible rules within your FX trading systems to identify and manage risks, whatever they may turn out to be.  If you do, you’ll be ahead of the pack and will be able to create competitive advantage.

-Dan

Monday, August 23, 2010

Evacuate the Dancefloor

Posted by John Bates

Looking for all the world like someone yelled "fire" in a crowded nightclub, prop and quant traders are stampeding out of investment banks and headed for the hedge fund world. Some, mainly the prop traders, are being pushed gently out the door as banks prepare for the Volcker Rule (http://tinyurl.com/39ap28d). Others, like the quants (http://tinyurl.com/23c5h6d), are in search of the mega-bonuses that their prop trader or hedge fund manager compatriots are (or were) getting.

 

Impending changes in regulation are prompting banks to spin off proprietary trading activities, many by expanding their operations overseas where Messieurs Dodd and Frank cannot reach them. I’m very concerned about this “regulatory arbitrage” in which firms may move away from the US to find less strict regulatory regions. We don’t want to lose the lead in this important area of the economy.

 

Spin offs and regulatory arbitrage may well leave a herd of US traders looking for work and many may end up working at - or starting - hedge funds. The quants, having slaved over hot computers for the last few years to line bankers' pockets, are forming their own trading companies or joining prop trading firms with a profit-sharing deal.

 

Most of these traders will be in for a rude awakening when they sit down to work. Prop traders joining hedge funds will find that technology budgets may not be as generous as they were at their last bulge-bracket employer. The quants, who are essentially programmers, will face huge challenges in finding firms that have the kind of low latency, scalable architecture they need to design, tweak and trade with their algorithms. The level of trading freedom is different, too. Hedge fund managers will have something to say about a trader's profits - or lack thereof. Quants may find that designing an algorithm and handing it over to the trading desk is not quite the same as being responsible for the profits that the algo makes - or doesn't make.

 

Make no mistake, these prop traders and quants are highly intelligent and adaptable people. There will be many challenges to face going forward, but technology need not be one of them. There are instantly useable, scalable platforms that quants and hedge funds can use to build and deploy algorithms. These platforms, such as Progress Apama's Complex Event Processing Platform, offer a robust technology infrastructure to successfully create, test, deploy and manage their algorithmic strategies.

 

Algorithmic trading software is constantly transforming. As the volume of real-time market data continues to increase, algorithmic trading solutions demand an infrastructure that can respond to market data with near-zero latency. Trading effectively in competitive markets requires rapid, opportunistic response to changing market conditions before one's competition can seize those opportunities. The people running for the doors and into the arms of hedge funds or other trading firms will need this advantage. Competition is fierce, and their previous employers already have the technology advantage.

Thursday, July 22, 2010

India – big potential for algorithmic trading

Posted by Giles Nelson

I spent last week in India, a country that, by any standards, is growing fast.  Its population has doubled in the last 40 years to 1.2B and economic growth has averaged more than 7% per year since 1997.  It’s projected to grow at more than 8% in 2010. By some measures, India has the 4th biggest economy in the world. 

Progress has a significant presence in India. In fact, people-wise, it’s the biggest territory for Progress outside the US with over 350 people. Hyderabad is home to a big development centre and Mumbai (Bombay) has sales, marketing and a professional services team.

The primary purpose of my visit was to support an event Progress organised in Mumbai on Thursday of last week on the subject of algorithmic trading. It was also our first real launch of Progress and Apama, our Complex Event Processing (CEP) platform, into the Indian capital markets. We had a great turnout, with over 100 people turning up. I spoke about what we did in capital markets and then participated in a panel session where I was joined by the CTO of the National Stock Exchange, the biggest in India, a senior director of SEBI, the regulator, and representatives from Nomura and Citigroup. A lively debate ensued.

The use of algorithmic trading is still fairly nascent in India, but I believe it has a big future. I’ll explain why soon, but I’d like first to give some background on the Indian electronic trading market, particularly the equities market, which is the largest.
 

The market
India has several competing markets for equities, futures and options, commodities and foreign exchange. In equities, the biggest turnover markets are run by the National Stock Exchange (NSE) and the Bombay Stock Exchange (BSE), with market shares (in the number of trades) of 74% and 26% respectively. Two more equity exchanges are planning to go live soon: the Delhi Stock Exchange is planning to relaunch and MCX is currently awaiting a licence to launch. This multi-market model, only recently adopted in Europe for example, has been in place in India for many years.

It was only two years ago that direct market access (DMA) to exchanges was allowed. Although official figures don't exist, the consensus opinion is that about 5% of volume in equities is traded algorithmically, and between 15% and 25% in futures and options. Regulation in India is strong - no exchange allows naked access, and the BSE described to me some of the strongest pre-trade risk controls I've come across: collateral checks on every order before it is matched. The NSE has throttling controls which impose a limit on the number of orders a member organisation can submit per second; members can be suspended from trading intra-day if this is exceeded. The NSE also forces organisations who want to use algorithms to go through an approval process. I'll say more about this later. Controversially, the NSE will not allow multi-exchange algorithmic strategies, so cross-exchange arbitrage and smart-order routing cannot take place. Lastly, a securities transaction tax (STT) is levied on all securities sales.
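For illustration, a throttle in the spirit of the NSE rule just described could be implemented as a sliding one-second window per member. This is a minimal sketch under assumed semantics - the limit value and rejection behaviour are hypothetical, not the exchange's actual specification:

```python
import time
from collections import defaultdict, deque

class OrderThrottle:
    """Sliding-window throttle: each member may submit at most
    `limit_per_second` orders in any one-second window."""

    def __init__(self, limit_per_second):
        self.limit = limit_per_second
        self.history = defaultdict(deque)  # member -> recent order times

    def allow(self, member, now=None):
        now = time.monotonic() if now is None else now
        window = self.history[member]
        while window and now - window[0] >= 1.0:  # drop orders over 1s old
            window.popleft()
        if len(window) >= self.limit:
            return False  # reject; repeated breaches could mean suspension
        window.append(now)
        return True

throttle = OrderThrottle(limit_per_second=100)
accepted = sum(throttle.allow("member-42", now=0.5) for _ in range(150))
print(accepted)  # 100 -- the remaining 50 orders are rejected
```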

So, with the above restrictions, why do I think that the Indian market for algorithmic trading has massive potential?
 

The potential
The Indian market is very big - surprisingly so to many people. Taking figures from the World Federation of Stock Exchanges (thus not counting trading on alternative equity venues such as European multi-lateral trading facilities), the Indian market in dollar value may still be relatively modest: it's the 10th largest. However, when you look at the number of trades, India is the 3rd largest market, beaten only by the US and China. The NSE, for example, processes 10 times the number of trades of the London Stock Exchange. So why isn't more traded in dollar terms? Because trade sizes on Indian exchanges are very small. The median figure worldwide is about $10K per trade; in India it is about $500 per trade, a 20th of the size. Surely the task of taming the complexity of this number of trades, and the orders that go with them, is ideal for algorithmic trading to give an edge? Compare with another emerging “BRIC” economy, Brazil, where the number of firms using Apama has gone from zero to over 20 in as many months: the dollar market size is fairly similar, but the number of equity trades in India is 33 times higher. The potential in India is therefore enormous.

India is already there in other ways. All exchanges offer co-location facilities for their members, and debate has already moved on to the question, familiar in more developed markets, of whether this gives certain firms an unfair advantage and whether co-location provision should be regulated.

 

The challenges
There are some difficulties. The STT is seen by some as an inhibitor. However, its effect is offset somewhat by the fact that securities traded on exchange are not subject to capital gains tax. 

The NSE process for approving algorithms is more controversial. Firms that want to trade algorithmically must show the NSE that certain risk safeguards are in place and “demonstrate” the algorithm to the exchange. As the biggest exchange, the NSE wields considerable power, and thus its decision to vet algorithms puts a brake on market development. I believe this process to be unsustainable for the following reasons:

  1. As the market develops there will simply be too many algorithms for the NSE to deal with in any reasonable timeframe. Yes, India is a low-cost economy, but you need highly trained people to analyse algorithmic trading systems; you can't simply throw more people at this. Firms will want to change the way algorithms work on a regular basis, and they can't do that with this process in place.
  2. It raises intellectual property issues. Brokers will increasingly object to revealing parts of their algorithms, and their clients, who may want to run their alpha-seeking algorithms on a broker-supplied co-location facility, will most definitely object.
  3. It puts the NSE in an invidious position. Eventually an algo will “pass” the process and then go wrong, perhaps adversely affecting the whole market. The NSE will have to take some of the blame.
  4. Competition will force the NSE's hand. The BSE is aggressively trying to take back market share, and other exchanges are launching which will not have these restrictions.

It strikes me that the NSE should put its efforts into protecting itself better. Perhaps a reasonable comparison is a website protecting itself from hacking and denial-of-service attacks. If websites can do it, so can an exchange, and it would offer much better protection for the exchange and the market in general.
 

In conclusion
I’m convinced of the growth potential in India for algo trading. The market is large, the user base is still relatively small and many of the regulatory and technical prerequisites are in place. There are some inhibitors, outlined above, but I don’t think they’ll hold the market back significantly. And finally, why should India not adopt algo trading when so many other, and diverse, markets have?

Progress has its first customers already in India. I look forward to many more. 

Thursday, June 03, 2010

Optimism in the world of financial services regulation

Posted by Giles Nelson

It seems that we’re finally making some progress on making the financial markets function more safely. 

After the “flash-crash” of 6 May, US equity market operators have agreed to bring in coordinated circuit-breakers to avoid a repeat of this extreme event. There is widespread agreement on this. Industry leaders from brokers and exchanges yesterday made supportive statements as part of submissions to the SEC.

Regulators are going public with their use of real-time monitoring technology. Alexander Justham, director of markets at the Financial Services Authority, the UK regulator, told the Financial Times that the use of complex event processing technology will give the FSA “a more proactive machine-on-machine approach” to market surveillance (the FSA is a Progress customer). Other regulators are at least admitting they have a lot of work to do. Mary Schapiro, the SEC chair, believes that the technology used for monitoring markets is “as much as two decades behind the technology currently used by those we regulate”. Scott O'Malia, a commissioner at the Commodity Futures Trading Commission, admitted that the CFTC continues to receive account data by fax, which then has to be manually entered.

The use of real-time pre-trade risk technology is likely to become much more widespread. “Naked” access, where customers of brokers submit orders directly to the market without any pre-trade checks, is likely to be banned. This is an important change: late last year Aite Group, an analyst firm, estimated that naked access accounted for 38% of the average daily volume in US stocks. The SEC is also proposing that regulation of sponsored access is shored up; currently it has evidence that brokers rely upon oral assurances that the customer itself has pre-trade risk technology deployed. The mandated use of pre-trade risk technology will level the playing field and prevent a race to the bottom. Personally I've heard of several instances of buy-side customers insisting that brokers turn pre-trade risk controls off, as they perceive that such controls add latency and will therefore adversely affect the success of their trading.
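To give a feel for what a pre-trade risk gate involves, here is a minimal sketch with three common checks: an order size cap, a price collar against a reference price, and an exposure limit. The field names and thresholds are hypothetical illustrations, not a regulatory specification:

```python
def pre_trade_check(order, ref_price, limits, exposure):
    """Run a client order through simple pre-trade risk checks before it
    is allowed to reach the market. Returns (passed, list_of_errors)."""
    errors = []
    if order["qty"] > limits["max_order_qty"]:
        errors.append("order size exceeds limit")
    # Price collar: reject orders too far from the reference price
    if abs(order["price"] - ref_price) / ref_price > limits["max_price_dev"]:
        errors.append("price outside collar")
    # Credit check: projected exposure must stay within the client's limit
    if exposure + order["qty"] * order["price"] > limits["max_exposure"]:
        errors.append("exposure limit breached")
    return (not errors, errors)

limits = {"max_order_qty": 1_000_000,
          "max_price_dev": 0.01,      # 1% collar
          "max_exposure": 50_000_000}
order = {"price": 101.5, "qty": 300_000}
print(pre_trade_check(order, ref_price=100.0, limits=limits,
                      exposure=10_000_000))
# (False, ['price outside collar'])
```

The latency objection mentioned above comes from exactly this extra work per order; the engineering challenge is making such checks fast enough that turning them off buys nothing.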

The idea of real-time market surveillance, particularly in complex, fragmented markets such as exist in the US and Europe, is gaining credence. The SEC has proposed bringing in a “consolidated audit trail” which would enable all orders in US equity markets to be tracked in real time. As John Bates said in his previous blog post, it's likely that the US taxpayer will not be happy paying the $4B that the publicly funded SEC estimates such a system would need to get up and running. Perhaps the US could look at the way the UK's FSA is funded: the FSA reports to government but is paid for by the firms it regulates.

As I mentioned in my last blog our polling in April at Tradetech, a European equities trading event, suggests that market participants are ready for better market monitoring. 75% of respondents to our survey believed that creating more transparency with real-time market monitoring was preferable to the introduction of restrictive new rules.

CESR, the Committee of European Securities Regulators, is currently consulting on issues such as algorithmic trading and high frequency trading. It will be interesting to see the results of their deliberations in the coming months.

I'm so pleased the argument has moved on. This time last year saw a protracted period of vilifying “high frequency trading” and “algo trading”. Now there is recognition of the benefits, as well as the challenges, that high frequency trading has brought to equity markets, and regulators seem to understand that, to prevent both disastrous errors and deliberate market manipulation, it is better for them to get on board with new technology than to try to turn the clock back to mediaeval times.

New approaches are sorely needed. Yesterday saw the conclusion of another investigation into market manipulation when the FSA handed out a $150,000 fine and a five-year ban to a commodity futures broker.

Tuesday, April 27, 2010

Monitoring and surveillance: the route to market transparency

Posted by Giles Nelson

Again this week, capital markets are under the spotlight, with the SEC and Goldman standoff. Just a few weeks ago, the FSA and UK Serious Organised Crime Agency were making multiple arrests for insider trading. Earlier this year Credit Suisse were fined by the New York Stock Exchange for one of their algorithmic trading strategies damaging the market. Meanwhile, electronic trading topics such as dark pools and high frequency trading are being widely debated. The whole capital markets industry is under scrutiny like never before.

Technology can't solve all these problems, but one thing it can do is help give much more market transparency. We're of the view that to restore confidence in capital markets, organisations involved in trading need a much more accurate, real-time view of what's going on. That way, issues can be prevented, or at least identified much more quickly. I talked about this recently to the Financial Times, here.

Last week at the Tradetech conference in London, Progress announced the release of the second generation of its Market Monitoring and Surveillance Solution Accelerator. This is aimed at trading organisations that want to monitor trading behaviour, whether to ensure compliance with risk limits, for example, or to spot abusive patterns of trading. Brokers, exchanges and regulators are particularly relevant, but buy-side organisations can also benefit from it. Previously this solution accelerator just used Apama. Now it has been extended to use our Responsive Process Management (RPM) suite, which includes not only Apama but also Savvion Business Process Management, giving the accelerator powerful alert and case management capabilities. We know that monitoring and surveillance in capital markets is important now, and believe it will become more so, which is exactly why we've invested in building out the product. You can read the take on this from the financial services analyst Adam Honore here, and more from Progress about the accelerator and RPM. A video on the surveillance accelerator is here.

As all this is so relevant at the moment, and Tradetech is the largest trading event of its kind in Europe (although very equity-focused), we thought we'd conduct some research with the participants. We got exactly 100 responses in one day (which made calculating the percentages rather a breeze) to a survey which asked about attitudes to European regulation, high frequency and algorithmic trading, and dark pools. Some of the responses relating to market monitoring and surveillance are worth stating here. 75% of respondents agreed with the premise that creating more transparency with real-time trading monitoring systems was preferable to the introduction of new rules and regulations. 65% believe that European regulators should be sharing equity trading information in real time. And more than half believe that their own organisation would support regulators having open, real-time access to information about the firm's trading activity. To me, that's a pretty strong sign that the industry wants to open up, rather than be subjected to draconian new rules.

There will be substantial changes to the European equity trading landscape in the coming year. There will be post-MiFID regulatory change by the European Commission, acting on recommendations by the Committee of European Securities Regulators, which is taking industry evidence at the moment. Their mantra, as chanted last week, is "transparency, transparency, transparency". Let's hope that this transparency argument is expressed in opening up markets to more monitoring, rather than taking the perhaps politically expedient route of outlawing certain practices and restricting others.

Wednesday, April 21, 2010

Observations from Tradetech 2010

Posted by Giles Nelson

Day one of Tradetech Europe 2010 has nearly finished. I won't be here tomorrow, so here are some thoughts and take-aways from today's event.

It's fair to say that Tradetech is the premier European equities trading and technology event, and thus very relevant for Progress' business in capital markets, particularly customers using Apama. Progress has a substantial presence as always. It's a good event to meet brokers, hedge funds, exchanges and pretty much every one within the industry. Lots of old friends are here every year. Regarding the event itself, it's pretty well attended considering the recent issues with volcanic ash. It usually takes place in Paris, but I'm sure the organisers were pleased that they chose London this year as the London contingent was able to attend without disruption.

This year's big theme really seems to be market structure and regulation. In the third year after MiFID, an event which brought competition into European equity markets, and after the credit crunch, questions about how the market is working, the influence of alternative venues such as dark pools, and how high-frequency trading is affecting the market are front of mind.

What's interesting is how some things stay the same. Richard Balarkas, old Tradetech hand and CEO of Instinet Europe, talked about trading liberalisation in the late 19th and early 20th century. Then, vested interests were complaining about the rise of "bucket shops", which gave access to trading on the Chicago Board of Trade via telegraph to people who wouldn't previously have traded. In the view of some at the time, this led to speculation and "gambling". Regulators were wrestling with the fact that only 1% of CBOT trades resulted in actual delivery of goods - the rest were purely financial transactions and therefore arguably speculative. This reminds me of the current debate around the "social usefulness" of high frequency trading.

European equities trading has changed a lot. Vodafone, a UK-listed stock, now has only about 30% of its average European daily volume traded on the London Stock Exchange (LSE). The rest is traded on alternative trading venues across Europe. However, Xavier Rolet, CEO of the LSE, believes that there's a long way to go. He stated that "the European equities market remains anaemic when compared to the US". Volumes, adjusted for relative market capitalisation, are about 15% of those in the US.

Regulation of European markets is a thorny issue. Regulation is fragmented, together with the market itself. CESR - the Committee of European Securities Regulators, the nearest Europe has to a single regulator - is taking evidence on a whole range of issues and will recommend a set of reforms to the European Commission in July this year. These recommendations will relate to post-trade transparency and information quality and enhanced information about systematic internalisers and broker crossing systems. CESR is also looking at other issues such as algorithmic trading and co-location. Legislation will follow towards the end of 2010.

Equity markets are in a sensitive place. There's still more deregulation to do, more competition to be encouraged and yet, with sentiment as it is, regulators may decide to introduce more rules and regulations to prevent this taking place. The CESR proposals will be about "transparency, transparency, transparency" - as part of this we believe that more real-time market monitoring and surveillance by all participants is key to bringing back confidence in the markets and ensuring that draconian rules don't have to be introduced.

Emerging markets were talked about in one session, and Cathryn Lyall from BM&FBovespa in the UK talked about Brazil in particular. We've seen Brazil become a pretty significant market recently: not only has demand for all Progress products grown substantially, but Apama is now being used by 18 clients for algorithmic trading of both equities and derivatives. Brazil is the gorilla of the Latin American region, accounting for 90% of cash equities and 95% of derivatives business in Latin America, and 90% of Brazilian trading is on exchange. Brazil emerged largely unscathed from the credit crunch, and it's taken only 2-3 years to achieve the level of trading infrastructure that took perhaps 10-15 years to evolve in the US and Europe. More still needs to happen. Although the regulatory regime has an enviable reputation, it is moving slowly. Concerns regarding naked and sponsored access are holding up liberalisation that would lead to DMA and co-located access to the equities market, something already in place for derivatives.

So, that's what I saw as highlights from the day. Tradetech seems, still, to be the place the whole industry gathers.

Monday, March 08, 2010

Rumblings in the Cloud

Posted by Louis Lovas

Cloud computing... it's on everyone's mind these days. Personally I think it's a term that has attained such aggrandized acclaim that vendors, analysts, bloggers and anyone with marketing muscle have pulled and stretched its definition to such an extent that it could mean just about anything hosted. Cloud Computing Journal polled twenty-one experts to define Cloud Computing. The fact that they had to ask twenty-one experts is rather telling in itself. Well, I read what the experts had to say.

So, armed with my newly minted (yet fully stretched, though not of my own making) Cloud definition, I happened upon this commentary about CEP in the Cloud, or the lack thereof. There's a great quote in the article: "I don’t care where a message is coming from and I don’t care where it’s going". Correctly indicated, this in a sense defines a key aspect of CEP: event-based applications should be transparent to the origin and destination of messages (or the events into which messages are transformed), beyond a logical or virtual name. However, unlike the author, Colin Clark, I do believe the current crop of vendor products, most notably Progress Apama, maintain this separation of the physical from the virtual.

The rationale behind the lack of CEP-based applications in the Cloud (ok, there's that word again) is found in other factors. To explain my reasoning I'll start by dividing CEP-based applications into two categories. Of course there are many ways to categorize CEP-based applications, but for the sake of this discussion, I'll use these two:

CEP-based Application Categories
  1. Those that do things
  2. Those that observe other applications doing things
I'm not sure I could make a simpler, more layman-like description, but needless to say it warrants further explanation (or definition, in keeping with our theme).

CEP-based applications that do things
This category is best explained by example. Typical of event processing applications that do things are those in Capital Markets like algorithmic trading, pricing and market making. These applications perform some business function, often critical in nature, in their own right. Save connectivity to data sources and destinations, they are the key ingredient, or the only ingredient, of a business process. In the algo world CEP systems tap into the firehose of data, and the data rates in these markets (Equities, Futures & Options, etc.) are increasing at a dizzying pace. CEP-based trading systems are focused on achieving the lowest latency possible. Investment banks, hedge funds, and others in the arms race demand the very best in hardware and software platforms to shave microseconds off each trade. Anything that gets in the (latency) way is quickly shed.

In other verticals, an up-and-coming use of CEP is location-based services. This is one that leverages smart mobile devices (i.e. "don't care where the message is going") to provide promotions and offers.
    • Algo Trading, Pricing, Market Aggregation
    • Location Based Services (providing promotional offers and alerts)
CEP-based applications that observe other applications doing things
Conversely, event-based applications that observe other applications doing things are classified as providing visibility, or greater insight, into some existing business function. These event-based applications overlay business processes and take measures to improve their effectiveness. As is often the case, critical business applications provide little visibility, or the information is siloed. There is a need to provide a broader operational semantic across a heterogeneous mix of business applications and processes. Here are a few typical examples of event-based visibility applications observing other business systems:
    • Telco Revenue Assurance
    • Click Stream Analysis
    • Fraud Detection
    • Surveillance
Of course the demarcation line between these two classifications is not clear cut. Providing greater visibility is just a starting point; monitoring for opportunities to take action is just as important, such as kicking off a fraud watch if a suspected wash trade occurs (so in a sense they are doing things too).
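As a toy example of that wash-trade watch: a rule might flag an account that buys and sells the same instrument at the same price within a few seconds, so that no beneficial ownership really changes hands. In Apama this would be expressed as an event pattern, but a Python sketch conveys the idea; the trade fields and the 5-second window are illustrative assumptions:

```python
from collections import defaultdict

def flag_wash_trades(trades, window=5.0):
    """Flag cases where one account buys and sells the same instrument at
    the same price within `window` seconds -- a toy surveillance rule."""
    alerts = []
    recent = defaultdict(list)  # (account, symbol, price) -> (time, side)
    for t in sorted(trades, key=lambda t: t["time"]):
        key = (t["account"], t["symbol"], t["price"])
        opposite = "sell" if t["side"] == "buy" else "buy"
        for prev_time, prev_side in recent[key]:
            if prev_side == opposite and t["time"] - prev_time <= window:
                alerts.append((t["account"], t["symbol"],
                               t["price"], t["time"]))
        recent[key].append((t["time"], t["side"]))
    return alerts

trades = [
    {"time": 0.0, "account": "A1", "symbol": "XYZ",
     "side": "buy", "price": 10.0},
    {"time": 2.0, "account": "A1", "symbol": "XYZ",
     "side": "sell", "price": 10.0},
]
print(flag_wash_trades(trades))  # [('A1', 'XYZ', 10.0, 2.0)]
```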

Wherefore art thou, oh CEP
When considering the Cloud, an important point to consider is dependency. Specifically, observing CEP depends on the underlying applications and business processes themselves existing in the Cloud for it to overlay them. I would offer that Enterprise businesses have not yet migrated their key business processes to the Cloud on a widespread scale. Why not? What are the barriers? Security, regulatory compliance, DR, investment costs and limited skill sets are just a few of the challenges mentioned in this ITProPortal article. I suspect these barriers are far-reaching, keeping the pace of Cloud deployment in check to the point where it's not yet strategic to many.
 
One of the key things that makes the Cloud a reality is virtualization; it has clearly revolutionized Platform-as-a-Service. But virtualization comes at a cost: there is a latency penalty for the convenience, and however small, for some use-cases that cost is too great.

Make no mistake, I am certain the Cloud, with all its twenty-one definitions, is the future of computing. It's an imperative that will knock down the barriers and change the face of the Enterprise, and when it reaches critical mass CEP will be there.

Once again, thanks for reading. You can follow me on Twitter, here.
Louie