Giles Nelson

Wednesday, June 22, 2011

A foray into Beijing

Posted by Giles Nelson

Beijing was the last stop on my three-city Asian tour and, from a personal perspective, the most exciting one, as I’d never visited mainland China before.

China’s seemingly inexorable economic rise has been well documented. In the last 20 years, China’s GDP growth has averaged over 9%. As I travelled from the airport into Beijing’s central business district I saw few older buildings. Virtually everything, including the roads, looked as if it had been built in the last 10 years.

The Chinese stock market is big. In dollar terms, the combined turnover of the two stock exchanges, Shanghai and Shenzhen, is approximately double that of the next biggest Asian market, Japan. The growth in stock trading has been very rapid: trading volumes on Shanghai and Shenzhen have risen roughly 20-fold in the past 5 years, albeit with significant volatility along the way.

The domestic algorithmic trading market is nascent. Currently, intra-day trading in company shares is not allowed, so it is in the recently established futures markets that algorithmic and high-frequency trading are taking place. No figures exist on the proportion of trading currently done algorithmically in China, but I’m going to estimate it at 5%.

I was in Beijing to participate in the first capital markets event Progress has held there. Although Shanghai is the finance capital of China, we chose Beijing to follow up on previous work we'd done there. In the end, about 60 people from domestic sell-side and buy-side firms attended, which was a great result considering the relatively low profile Progress has in this market at present. There was optimism and an expectation that algorithmic trading had a bright future in China.

I believe it's a practical certainty that the Chinese market will adopt algorithmic and high-frequency trading. In every developed market a high, or very high, proportion of trading is done algorithmically and, although different regulations and dynamics make each market unique, nothing short of an outright ban will prevent widespread adoption in every market in time. Liberalisation in China is occurring. For example, stock index futures are now traded, exchanges are supporting FIX, short-selling has been trialled and it is now easier for Chinese investors to access foreign markets. Also, earlier this year the Brazilian exchange, BM&FBovespa, and the Shanghai exchange signed an agreement which may result in company cross-listings. Only some of these changes support electronic trading growth directly, but all are evidence that the liberalisation necessary to support such growth is happening. Inhibitors remain: no intra-day stock trading, restrictions on foreign firms trading on Chinese markets (preventing competition and knowledge transfer from developed markets) and tight controls on trading in the renminbi. The Chinese regulators will continue to move cautiously.

The question is not if, but when. We expect to sign our first Chinese customers soon. China is becoming a very important blip on the radar. 


Friday, June 17, 2011

Still plenty of room for growth in Japan

Posted by Giles Nelson

And from Mumbai on to Tokyo. In so many ways, a bigger contrast between cities is difficult to imagine. 

Japan has, of course, had a tough time of it recently. Not only have the recent earthquake and tsunami knocked the country back, but Japan has also had a long period of relative economic stagnation compared with other Asian economies. Its development in algorithmic trading also sets it apart from other developed economies: the proportion of trading done algorithmically is relatively low.

There’s little consensus on what this proportion is, however. In a recent report, Celent, the financial markets analyst firm, put algorithmic trading at around 25% in 2010. Figures from the Tokyo Stock Exchange (TSE) show that around 35% of orders submitted to the exchange come from co-location facilities, and it is reasonable to assume that nearly all of these could be regarded as “algorithmic”. From my conversations in Tokyo with people at exchanges, sell-side firms and our customers and partners, I’m going to put the figure at between 40% and 50%.

That means there’s a lot of room for growth when you consider that the proportion of securities traded algorithmically, in one way or another, in the US and Europe nears 100%. One inhibitor to growth has now been removed. In 2010, the TSE launched a new exchange platform, Arrowhead, which reduced latency from up to two seconds down to around 5 milliseconds. In other words, the TSE is now “fit for purpose” for algorithmic trading and, in particular, for high-frequency trading. Previously, with latencies so long, high-frequency firms that, for example, wanted to market-make on the TSE simply weren’t prepared to take on the exposure and uncertainty that such latencies bring. Since Arrowhead's launch in January 2010, and according to the TSE’s own figures, the total number of orders on the TSE has risen by a modest 25%, but the proportion of orders submitted from co-location facilities has more than doubled.
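
Why does latency matter so much to a market maker? Because a quote that cannot be updated for two seconds is exposed to whatever the market does in those two seconds. Under a simple random-walk assumption, the expected adverse move against a stale quote scales with the square root of the waiting time, so Arrowhead's 400-fold latency reduction cuts that exposure by a factor of about 20. Here is a minimal, illustrative Python sketch of that arithmetic; the volatility figure and session length are assumptions of mine, not TSE data:

```python
import math

# Illustrative only: why cutting latency from ~2 s to ~5 ms transforms
# market-making risk. Under a random-walk model, the expected price
# drift against a stale quote grows with the square root of the time
# the quote cannot be updated.

def stale_quote_risk_bps(daily_vol_bps: float, latency_s: float,
                         session_s: float = 4.5 * 3600) -> float:
    """Expected adverse move (in basis points) over one latency period."""
    return daily_vol_bps * math.sqrt(latency_s / session_s)

for label, latency in [("pre-Arrowhead (~2 s)", 2.0),
                       ("Arrowhead (~5 ms)", 0.005)]:
    risk = stale_quote_risk_bps(daily_vol_bps=150, latency_s=latency)
    print(f"{label:22s} ~{risk:.2f} bps exposure per stale quote")
```

On these assumed numbers the exposure falls from roughly 1.7 basis points per quote to under 0.1 – the square root of 400 is 20 – which is enough to change the economics of quoting entirely.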

Progress exhibited and spoke at Tradetech Japan, and we were joined on our stand by our partner, Tosho Computer Systems, with which we’re working on a service in Japan to be launched later this year. There’ll be more news on that nearer the time. Attendance-wise, Tradetech was down on its 2008 peak in Japan, but up on the years since – a reflection of renewed interest in trading technology generally.

Market surveillance was one of the key topics in the Tradetech Q&A and panel discussions. This is common across pretty much every market now, with the essential question being: how can markets be kept safe as they get faster and more complex? Some say there should be restrictions, and whilst circuit breakers, insistence on pre-trade risk checks and the like are important, over-emphasis on “the dangers” can hold markets back. Progress’ view is that regulators and exchanges should all move towards real-time market surveillance. (Find more on this here and here.)

There’s a lot of emphasis at the moment in European and US markets on OTC derivatives regulation and the move of trading in such instruments onto exchanges. Japan is relatively advanced in this regard, with regulation requiring many domestic OTC derivatives to be cleared – a trend that is unfolding elsewhere in Asia too, more quickly than in Europe and the US.

Regionally, Japan is the biggest developed market in terms of share trading volume – twice as big in dollar terms as the next biggest, Hong Kong. But Japan is itself now dwarfed by China, and I’ll be writing about that next.


Wednesday, June 15, 2011

The rise of algo trading in Asia - first stop Mumbai

Posted by Giles Nelson

I’ve just completed a three-city Asian tour taking in Mumbai, Tokyo and Beijing. The purpose was to promote Progress’s business in capital markets, in particular Apama and our Responsive Process Management suite. I’m going to give my impressions of those markets – first up is Mumbai and the Indian market for electronic trading.

Before I start with India, though, I’d like to share an interesting piece of academic work I came across. We hear a lot about the rise of Asia economically, particularly the “big two” of India and China. Many predict that China will become the world’s biggest economy within the next 10 years and that India’s population will exceed China’s by mid-century. Both economies are predicted to grow by between 8% and 10% in 2011. The predicted long-term shift in relative economic fortunes is nicely illustrated by some work done at the London School of Economics in 2010, a graphic from which is shown below.

[Figure: the shifting global “economic centre of gravity”, 1980 to 2040. © Danny Quah, London School of Economics, October 2010]

Professor Danny Quah calculated “the average location of economic activity across geographies on Earth”. Put another way, he plotted the world’s “economic centre of gravity”: mid-Atlantic in 1980, and predicted to sit in the middle of Asia by 2040. The economic power shift is on.

Back to my trip, and my first stop in Mumbai, the commercial and financial capital of India. I wrote on the growth of electronification and algorithmic trading in India last July. Since then the market has progressed. Progress has acquired its first capital markets clients in India using Apama for trading, and the market itself has evolved and liberalised. Last year I criticised the policy of the biggest stock exchange, the National SE, for its “algo approval process”. I thought this was a poorly structured attempt to protect the markets from “rogue algos” and would stymie development. I'm pleased to say that the NSE has now relaxed this policy (the latest version of which can be found here). A further significant development is that both equity exchanges, the NSE and the Bombay SE, now allow smart order routing from co-location facilities at each exchange to one another. This was previously not allowed by the NSE. There is also evidence that the adoption of algorithmic trading is changing the way the market works – manual arbitrageurs are heading towards extinction and arbitrage opportunities in general are becoming more difficult to find – evidence that information flow between markets is becoming more efficient.

Together with CNBC, Progress held a well-attended industry event in Mumbai. The highlight was a panel session featuring the deputy CEO of the BSE, a senior member of the Indian securities regulator and a practitioner from a buy-side firm. There was a consensus that the continued development of algo trading was welcome in India – bringing technological innovation, putting pressure on costs, and bringing liquidity and more competition. There is some caution, particularly when it comes to the unfettered use of algos. Recent talk in the US and Europe about introducing algo “inspection” was cited as justification for that caution. As I’ve said previously, algo inspection is inherently flawed – it is far better for markets to be protected through good pre-trade risk checks and real-time market surveillance (as discussed at length recently on this blog). The panel acknowledged that real-time surveillance is key for markets to operate well.

Despite this progress, challenges remain. There is obvious enmity between the two stock exchanges, which should lessen if and when the organisation behind the Multi Commodity Exchange (MCX) receives an equity trading licence, something expected soon. The granting of such a licence and the introduction of more competition into the market can only be a positive move. Trading costs for equities in India are still high. There are two equity clearers, one owned by each exchange, which do not interoperate, and foreign financial institutions need both a licence from the regulator and a local Indian legal entity to trade – requirements which inhibit foreign firms from entering the market and thus reduce competitive pressure on domestic firms.

In my view, one change India will see in the coming years is significant broker consolidation. Currently there are around 3,000 brokers in the Indian market. Many of these are small and, in my opinion, will not survive in a market where access to technology is needed to compete. Many will therefore go out of business or be forced to merge.

The market for algo trading in India is growing. Although it hasn’t reached a tipping point yet, it has every promise of becoming one of the most important markets in Asia.

Next stop was Tokyo for Tradetech Japan 2011. I’ll talk about that tomorrow.


Thursday, November 04, 2010

A postcard to Jeremy Grant

Posted by Giles Nelson

Jeremy Grant, editor of FT Trading Room at the Financial Times, recently asked for explanations "on a postcard" of why speed is a force for good in financial markets – or, put another way, of the benefits of high-frequency trading. I've just come back from Mexico, where I was addressing the Association of Mexican Brokers, and during my visit I thought I'd write that postcard. So here it is:


Dear Jeremy

I saw your request for postcards recently, and as I'm travelling I thought I'd drop you one. There's not a lot I like doing more than explaining the benefits of so-called "high frequency trading".

I would suggest that you think of high-frequency trading, or HFT, as just the latest stage in the evolution of electronic trading. And this, as you know, has evolved very rapidly over the last decade because of cheaper and faster computers and networks. It's led to many innovations and benefits: electronic crossing networks, algorithmic trading, online retail trading, smaller order sizes, an overall increase in trading volume, more price transparency, greater trader productivity, more accessible liquidity, tighter spreads between buy and sell prices, lower broker commissions, and competition between exchanges and therefore smaller exchange fees. None of these things would have happened without electronic trading. MiFID couldn't have happened; it simply wouldn't have been financially viable for the many alternative European equity-trading venues to launch without cheap access to networks and computers. Without these we would still have greedy, monopolistic exchanges with high transaction prices.

HFT is just the latest step in a technology-driven evolution. You can't look at it in isolation.

"Ah", you exclaim, "but high frequency trading is a step too far. Trades happening far faster than the blink of an eye. Surely that can't be right?"

So what if trades happen quickly? Things "going too fast" is a perennial concern. In 19th-century Britain, people worried about trains going faster than 30mph: they thought passengers would suffocate, or that a train reaching a corner would simply come off the rails! And to those who say trading happens too quickly: at what speed should it occur? If not micro- or milliseconds, should it be a second, a minute, an hour? Who's going to decide? Any choice is entirely arbitrary anyway; time is infinitely divisible.

There are plenty of things that happen too fast for humans to comprehend - human nerve impulses travel at more than 100m per second, yet we function successfully. Why? Because we have the monitoring systems in place that ensure the information from the nerves is processed correctly. Put a finger on a hot coal and it will be retracted immediately - quicker than we can consciously think. And if a 200mph train goes through a red light then warning bells will ring and the train will be automatically stopped.

And so to the main point. Trading speed, per se, is not the problem. But, yes, problems there are. Markets, particularly in Europe and the US, are now very complex: fast moving, multi-exchange, with different but closely interlinked asset classes. It is this complexity we find difficult to understand; speed is only one facet of it. We imagine that an armageddon incident could occur because we know that the markets are not being monitored properly. Regulators freely admit this – Mary Schapiro recently said that the SEC was up to two decades behind in its use of technology to monitor markets. And because we know that the people in charge don't know what's going on, we get scared.

It doesn't have to be like this. The same technological advances that led to the evolution of HFT can be used to ensure that the markets work safely, by ensuring that limits are not exceeded, that an algorithm "going crazy" can't bring down an exchange, that a drunken trader can't move the oil price and that traders are dissuaded from intentionally trying to abuse the markets.
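
To make that concrete, here's a minimal Python sketch of the kind of automated safeguard I mean: a pre-trade gate that rejects fat-finger orders, orders priced outside a collar around the last trade, and bursts from an algorithm gone haywire. Every limit, name and number here is invented for illustration – real systems apply far richer checks per client, per instrument and per venue:

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str        # "buy" or "sell"
    qty: int
    price: float

class PreTradeRiskGate:
    def __init__(self, max_order_value: float, max_orders_per_sec: int,
                 price_collar_pct: float):
        self.max_order_value = max_order_value
        self.max_orders_per_sec = max_orders_per_sec
        self.price_collar_pct = price_collar_pct
        self._recent: list[float] = []   # timestamps of accepted orders

    def check(self, order: Order, last_trade: float, now: float):
        # 1. Fat-finger check: cap the value of any single order.
        if order.qty * order.price > self.max_order_value:
            return False, "order value exceeds limit"
        # 2. Price collar: reject prices too far from the last trade.
        if abs(order.price - last_trade) > last_trade * self.price_collar_pct / 100:
            return False, "price outside collar"
        # 3. Rate limit: an algorithm "going crazy" hits this wall.
        self._recent = [t for t in self._recent if now - t < 1.0]
        if len(self._recent) >= self.max_orders_per_sec:
            return False, "order rate exceeded"
        self._recent.append(now)
        return True, "accepted"

gate = PreTradeRiskGate(max_order_value=1_000_000,
                        max_orders_per_sec=100,
                        price_collar_pct=5.0)
print(gate.check(Order("VOD.L", "buy", 1_000, 173.2),
                 last_trade=172.9, now=0.0))
```

A gate like this sits between the trading engine and the exchange, so even a runaway strategy can only do bounded damage.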

Doing things faster is a human instinct. Faster, higher, stronger. The jet engine, the TGV, the motorway. Would we really go back to a world without these?

Wednesday, October 13, 2010

New Wall Street film shows that technology never sleeps

Posted by Giles Nelson

After watching the latest Wall Street film, ‘Money Never Sleeps’, earlier this week, one thing is apparent: it’s not just the brick-sized mobile phones that have changed since the 1987 original.

The movie opens with Gordon Gekko, the man who so famously declared that “greed is good” in the first film, being released from jail. It’s a comical scene contrasting the technology of the ’80s with the tech of today, as the guard returns Gekko’s bulky mobile phone. Gekko is released into a world where the way the financial world is run has completely changed. Only after recently revisiting the original movie, though, do you realise just how much advances in technology have fundamentally changed the way the trading floor operates.

Take high-frequency trading (HFT), the use of technology to monitor and submit orders to markets extremely quickly, which has been receiving a lot of bad press recently and is sometimes described as “abusive”. It is no more inherently abusive than two traders making trades using only the telephone, as in a scene from the original Wall Street film.

Yes, it can be used for rogue trading by the likes of Gekko, but so can any other technology. Technology itself is morally neutral. Similarly, algorithmic trading is also seen by some as an industry curse. Credit Suisse was fined this year by an exchange after its algorithmic trading system went out of control and bombarded the exchange with hundreds of thousands of erroneous orders. But this wasn’t a deliberate attempt to manipulate the market. It was a mistake, albeit a careless one. There just weren’t proper controls in place to protect the market from what, ultimately, was human error – the algorithms hadn’t been tested sufficiently.

There is no doubting that technology has generated enormous benefits for trading – greater efficiency, more market liquidity, tighter spreads and better prices for all. To lose these benefits because of perception would be very dangerous. Having said this, technology has also made the markets faster and more complex. All market participants therefore need to up their game by deploying modern-day monitoring capabilities that spot trading anomalies and help catch the next generation of Gekkos.
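
What might such monitoring look like in its simplest form? Below is an illustrative Python sketch – not a description of any real surveillance product, Apama included – that flags a participant whose order rate suddenly jumps far outside its own recent history, one of the most basic anomaly checks a venue or broker can run in real time. Thresholds and data are invented:

```python
from collections import deque
import statistics

class OrderRateMonitor:
    def __init__(self, history_s: int = 60, z_threshold: float = 4.0):
        self.counts = deque(maxlen=history_s)  # orders/sec, recent window
        self.z_threshold = z_threshold

    def observe(self, orders_this_second: int) -> bool:
        """Record one second of activity; return True if anomalous."""
        alert = False
        if len(self.counts) >= 10:            # need some history first
            mean = statistics.fmean(self.counts)
            stdev = statistics.pstdev(self.counts) or 1.0
            alert = (orders_this_second - mean) / stdev > self.z_threshold
        self.counts.append(orders_this_second)
        return alert

monitor = OrderRateMonitor()
for n in [12, 9, 11, 10, 13, 8, 12, 11, 10, 9, 11, 12]:
    monitor.observe(n)                        # normal activity
print(monitor.observe(250))                   # sudden burst -> True
```

Real surveillance correlates far more than order rates – prices, cancellations, cross-venue activity – but the principle is the same: let machines watch the machines.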

Monday, October 04, 2010

No evil algo-trader behind the flash crash

Posted by Giles Nelson

The long-anticipated joint SEC and CFTC report on the 6 May 2010 flash-crash came out last Friday.

After reading much of the report and commentary around it, I'm feeling rather underwhelmed.

The root cause of the flash-crash, the most talked-about event in the markets this year, was a boring old percentage-of-volume execution algorithm used by a mutual fund to sell stock market index futures. How banal.

The algorithm itself was simple. It took into account volume only, not price, and it didn't time orders into the market. Many commentators have pejoratively described it as "dumb". It may be simple, but it's one of the most common ways that orders are worked: buy or sell a certain amount of an instrument as quickly as possible, but take only a certain percentage of the available volume so the market isn't impacted too much. The problem was the scale. It was the third largest intra-day order in the E-mini future in the previous 12 months, worth $4.1Bn. The two previous big orders were worked taking into account price and time, and were executed over 5 hours. The flash-crash order executed its 75,000 lots in only 20 minutes.
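
For readers who haven't met one, here is a deliberately simplified Python sketch of a percentage-of-volume (POV) execution algorithm. The 9% participation rate is the figure the SEC/CFTC report cites for the flash-crash order; the simulated market volumes and all other details are my own invention. Note what is absent, because that is the report's point: no price limit and no pacing in time.

```python
import random

def pov_sell(total_qty: int, participation: float, market_volumes):
    remaining = total_qty
    for minute, mkt_vol in enumerate(market_volumes):
        if remaining <= 0:
            break
        # Sell up to `participation` of the volume the market just traded.
        slice_qty = min(remaining, int(mkt_vol * participation))
        remaining -= slice_qty
        print(f"minute {minute:2d}: market {mkt_vol:6,d} lots, "
              f"sold {slice_qty:5,d}, remaining {remaining:6,d}")
    return total_qty - remaining

random.seed(1)
volumes = [random.randint(20_000, 60_000) for _ in range(30)]
done = pov_sell(total_qty=75_000, participation=0.09,
                market_volumes=volumes)
print(f"executed {done:,} of 75,000 lots")
```

In a panicking market the printed volume rises – partly because of the seller's own fills – so a volume-only algorithm accelerates exactly when it should slow down.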

It wasn't this order on its own, of course. Fear in the markets created by the risk of Greece defaulting was already causing volatility. Stub quotes (extreme-value quotes put in by market makers to fulfill their market-making responsibilities) appear to have contributed. There was the inter-linking between the futures market and equity markets. There was the very rapid throwing around of orders – described as the "hot potato" effect – certainly exacerbated by the many high-frequency traders in the market. There was the lack of coordinated circuit breakers across the many US equity markets. And there was the lack of any real-time monitoring of markets to help regulators identify issues quickly.

High-frequency and algorithmic trading have been vilified in many quarters over recent months. I think many were expecting the flash-crash cause to be a malignant algo, designed by geeks working in a predatory and irresponsible hedge fund, wanting to speculate and profit from "mom and pop" pension funds. It was nothing of the kind.

The flash crash has raised important issues about the structure of multi-exchange markets, the role of market makers, the lack of real-time surveillance and how a simple execution strategy could precipitate such events. I do hope that the findings in the flash-crash report will ensure a more balanced view on the role of high-frequency and algo trading in the future.

Thursday, July 22, 2010

India – big potential for algorithmic trading

Posted by Giles Nelson

I spent last week in India, a country that, by any standards, is growing fast.  Its population has doubled in the last 40 years to 1.2B and economic growth has averaged more than 7% per year since 1997.  It’s projected to grow at more than 8% in 2010. By some measures, India has the 4th biggest economy in the world. 

Progress has a significant presence in India. In fact, people-wise, it’s the biggest territory for Progress outside the US with over 350 people. Hyderabad is home to a big development centre and Mumbai (Bombay) has sales, marketing and a professional services team.

The primary purpose of my visit was to support an event Progress organised in Mumbai last Thursday on the subject of algorithmic trading. It was also our first real launch of Progress and Apama, our Complex Event Processing (CEP) platform, into the Indian capital markets. We had a great turnout, with over 100 people attending. I spoke about what we do in capital markets and then participated in a panel session, where I was joined by the CTO of the National Stock Exchange, the biggest in India, a senior director of SEBI, the regulator, and representatives from Nomura and Citigroup. A lively debate ensued.

The use of algorithmic trading is still fairly nascent in India, but I believe it has a big future. I’ll explain why soon, but I’d like first to give some background on the Indian electronic trading market, particularly the equities market, which is the largest.

The market
India has several competing markets for equities, futures and options, commodities and foreign exchange. In equities, the biggest markets by turnover are run by the National Stock Exchange (NSE) and the Bombay Stock Exchange (BSE), with market shares (by number of trades) of 74% and 26% respectively. Two more equity exchanges plan to go live soon – the Delhi Stock Exchange is set to relaunch, and MCX is currently awaiting a licence to launch. This multi-market model, only recently adopted in Europe for example, has been in place in India for many years.

It was only two years ago that direct market access (DMA) to exchanges was allowed. Although official figures don’t exist, the consensus is that about 5% of equities volume is traded algorithmically, and between 15% and 25% in futures and options. Regulation in India is strong – no exchange allows naked access, and the BSE described to me some of the strongest pre-trade risk controls I’ve come across: collateral checks on every order before it is matched. The NSE has throttling controls which impose a limit on the number of orders a member organisation can submit per second; members can be suspended from trading intra-day if this is exceeded. The NSE also forces organisations who want to use algorithms to go through an approval process – I’ll say more about this later. Controversially, the NSE will not allow multi-exchange algorithmic strategies, so cross-exchange arbitrage and smart order routing cannot take place. Lastly, a securities transaction tax (STT) is levied on all securities sales.
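
To illustrate the throttling idea, here is a minimal Python sketch of an exchange-side, per-member order throttle with intra-day suspension, along the lines described above. The NSE's actual limits and suspension rules aren't public in detail, so every number and name here is an invented placeholder:

```python
from collections import defaultdict, deque

class MemberThrottle:
    def __init__(self, max_per_sec: int, max_breaches: int):
        self.max_per_sec = max_per_sec
        self.max_breaches = max_breaches
        self.windows = defaultdict(deque)   # member -> recent timestamps
        self.breaches = defaultdict(int)
        self.suspended = set()

    def submit(self, member: str, now: float) -> str:
        if member in self.suspended:
            return "rejected: suspended intra-day"
        w = self.windows[member]
        while w and now - w[0] >= 1.0:      # slide a 1-second window
            w.popleft()
        if len(w) >= self.max_per_sec:      # over the per-second cap
            self.breaches[member] += 1
            if self.breaches[member] >= self.max_breaches:
                self.suspended.add(member)
                return "rejected: suspended intra-day"
            return "rejected: throttled"
        w.append(now)
        return "accepted"

throttle = MemberThrottle(max_per_sec=5, max_breaches=2)
for i in range(8):                          # a burst well over the cap
    print(i, throttle.submit("BROKER42", now=0.001 * i))
```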

So, with the above restrictions, why do I think that the Indian market for algorithmic trading has massive potential?

The potential
The Indian market is very big. Surprisingly so to many people. Taking figures from the World Federation of Stock Exchanges (thus I’m not counting trading on alternative equity venues such as European multi-lateral trading facilities), the Indian market, in dollar value, may still be relatively modest – it’s the 10th largest. However, when you look at the number of trades, India’s the 3rd largest market, only beaten by the US and China. The NSE, for example, processes 10 times the number of trades as the London Stock Exchange. So why isn’t more traded in dollar terms? That’s because trade sizes on Indian exchanges are very small. The median figure worldwide is about $10K per trade. The figure in India is about $500 per trade, a 20th of the size. In summary, surely the task of taming the complexity of this number of trades and the orders that go with them is ideal for algorithmic trading to give an edge? To compare to another emerging, “BRIC”, economy, that of Brazil, where the number of firms using Apama has gone from zero to over 20 in as many months, the dollar market size is fairly similar but the number of equity trades in India is 33 times more. The potential in India is therefore enormous.
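
The arithmetic behind that claim is worth making explicit: for a given dollar turnover, the number of trades (and the order flow behind them) scales inversely with trade size, so India's $500 average implies roughly 20 times the message traffic of a market at the $10K worldwide median. A back-of-the-envelope sketch, with a purely hypothetical turnover figure:

```python
# The $10K worldwide median and the $500 Indian figure come from the
# text above; the turnover number is a hypothetical placeholder.

turnover_usd = 1.0e12                 # assume $1tn traded in a period
median_trade_worldwide = 10_000       # ~$10K per trade
avg_trade_india = 500                 # ~$500 per trade

trades_typical = turnover_usd / median_trade_worldwide
trades_india = turnover_usd / avg_trade_india
print(f"typical market: {trades_typical:,.0f} trades")
print(f"India:          {trades_india:,.0f} trades "
      f"({trades_india / trades_typical:.0f}x as many to process)")
```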

India is already there in other ways. All the exchanges offer co-location facilities to their members, and debate has already moved on to the questions familiar in more developed markets: whether co-location gives certain firms an unfair advantage, and whether its provision should be regulated.


The challenges
There are some difficulties. The STT is seen by some as an inhibitor. However, its effect is offset somewhat by the fact that securities traded on exchange are not subject to capital gains tax. 

The NSE’s process for approving algorithms is more controversial. Firms that want to trade algorithmically must show the NSE that certain risk safeguards are in place and “demonstrate” the algorithm to the exchange. As the biggest exchange, the NSE wields considerable power, and its decision to vet algorithms thus puts a brake on market development. I believe this process to be unsustainable for the following reasons:

  1. As the market develops there will simply be too many algorithms for the NSE to deal with in any reasonable timeframe. Yes, India is a low-cost economy, but you need highly trained people to analyse algorithmic trading systems; you can’t simply throw more people at this. Firms will want to change the way their algorithms work on a regular basis, and they can’t do that with this process in place.
  2. It raises intellectual property issues. Brokers will increasingly object to revealing parts of their algorithms, and their clients, who may want to run their alpha-seeking algorithms on a broker-supplied co-location facility, will object even more strongly.
  3. It puts the NSE in an invidious position. Eventually an algo will “pass” the process and then go wrong, perhaps adversely affecting the whole market. The NSE will have to take some of the blame.
  4. Competition will force the NSE’s hand. The BSE is aggressively trying to take back market share, and other exchanges are launching which will not have these restrictions.

It strikes me that the NSE should put its efforts into protecting itself better. A reasonable comparison is perhaps a website protecting itself from hacking and denial-of-service attacks. If websites can do that, so can an exchange – and it would offer much better protection for the exchange and the market in general.

In conclusion
I’m convinced of the growth potential in India for algo trading. The market is large, the user base is still relatively small and many of the regulatory and technical prerequisites are in place. There are some inhibitors, outlined above, but I don’t think they’ll hold the market back significantly. And finally, why should India not adopt algo trading when so many other, and diverse, markets have?

Progress has its first customers already in India. I look forward to many more. 

Thursday, June 03, 2010

Optimism in the world of financial services regulation

Posted by Giles Nelson

It seems that we’re finally making some progress in getting the financial markets to function more safely.

After the “flash-crash” of 6 May, US equity market operators have agreed to bring in coordinated circuit-breakers to avoid a repeat of this extreme event. There is widespread agreement on this. Industry leaders from brokers and exchanges yesterday made supportive statements as part of submissions to the SEC.

Regulators are going public with their use of real-time monitoring technology. Alexander Justham, director of markets at the Financial Services Authority (FSA), the UK regulator, told the Financial Times that the use of complex event processing technology will give the FSA “a more proactive machine-on-machine approach” to market surveillance (the FSA is a Progress customer). Other regulators are at least admitting they have a lot of work to do. Mary Schapiro, the SEC chair, believes that the technology used for monitoring markets is “as much as two decades behind the technology currently used by those we regulate”. Scott O’Malia, a commissioner at the Commodity Futures Trading Commission, admitted that the CFTC continues to receive account data by fax, which then has to be entered manually.

The use of real-time pre-trade risk technology is likely to become much more widespread. “Naked” access, where customers of brokers submit orders directly to the market without any pre-trade checks, is likely to be banned. This is an important change: late last year Aite Group, an analyst firm, estimated that naked access accounted for 38% of the average daily volume in US stocks. The SEC is also proposing that regulation of sponsored access is shored up – currently it has evidence that brokers rely upon oral assurances that the customer itself has pre-trade risk technology deployed. The mandated use of pre-trade risk technology will level the playing field and prevent a race to the bottom. Personally, I’ve heard of several instances of buy-side customers insisting that brokers turn off pre-trade risk controls because they perceive that such controls add latency and will therefore adversely affect the success of their trading.

The idea of real-time market surveillance, particularly in complex, fragmented markets such as those in the US and Europe, is gaining credence. The SEC has proposed a “consolidated audit trail” which would enable all orders in US equity markets to be tracked in real time. As John Bates said in his previous blog post, the US taxpayer is unlikely to be happy paying the $4B that the publicly funded SEC estimates such a system would need to get up and running. Perhaps the US could look at the way the UK’s FSA is funded: the FSA reports to government but is paid for by the firms it regulates.
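
At its technical core, a consolidated audit trail is a merge problem: take per-venue, time-ordered streams of order events and fuse them into one stream a regulator can watch as it happens. A toy Python sketch of that step, with venues, events and timestamps all invented:

```python
import heapq

nyse = [(0.001, "NYSE", "new order AAPL buy 100 @ 210.05"),
        (0.004, "NYSE", "cancel order AAPL buy 100 @ 210.05")]
bats = [(0.002, "BATS", "new order AAPL sell 300 @ 210.10"),
        (0.003, "BATS", "trade AAPL 100 @ 210.07")]

# heapq.merge lazily merges already-sorted streams by timestamp,
# which is how this would scale to streams that never end.
for ts, venue, event in heapq.merge(nyse, bats):
    print(f"{ts:.3f}s  {venue:4s}  {event}")
```

In practice the hard parts are clock synchronisation across venues and the sheer data volume, not the merge itself.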

As I mentioned in my last blog, our polling in April at Tradetech, a European equities trading event, suggests that market participants are ready for better market monitoring: 75% of respondents to our survey believed that creating more transparency with real-time market monitoring was preferable to the introduction of restrictive new rules.

CESR, the Committee of European Securities Regulators, is currently consulting on issues such as algorithmic trading and high frequency trading. It will be interesting to see the results of their deliberations in the coming months.

I’m so pleased the argument has moved on. This time last year saw a protracted vilification of “high frequency trading” and “algo trading”. Now there is recognition of the benefits, as well as the challenges, that high frequency trading has brought to equity markets, and regulators seem to understand that, to prevent both disastrous errors and deliberate market manipulation, it is better to get on board with new technology than to try to turn the clock back to mediaeval times.

New approaches are sorely needed. Yesterday saw the conclusion of another investigation into market manipulation when the FSA handed out a $150,000 fine and a five-year ban to a commodity futures broker.

Wednesday, April 21, 2010

Observations from Tradetech 2010

Posted by Giles Nelson

Day one of Tradetech Europe 2010 has nearly finished. I won't be here tomorrow, so here are some thoughts and take-aways from today's event.

It's fair to say that Tradetech is the premier European equities trading and technology event, and thus very relevant for Progress' business in capital markets, particularly customers using Apama. Progress has a substantial presence, as always. It's a good event for meeting brokers, hedge funds, exchanges and pretty much everyone within the industry; lots of old friends are here every year. The event itself is pretty well attended considering the recent disruption from volcanic ash. It usually takes place in Paris, but I'm sure the organisers were pleased they chose London this year, as the London contingent was able to attend without disruption.

This year's big theme seems to be market structure and regulation. In the third year after MiFID, the directive which brought competition into European equity markets, and after the credit crunch, questions about how the market is working, the influence of alternative venues such as dark pools, and how high-frequency trading is affecting the market are front of mind.

What's interesting is how some things stay the same. Richard Balarkas, old Tradetech hand and CEO of Instinet Europe, talked about trading liberalisation in the late 19th and early 20th centuries. Then, vested interests complained about the rise of "bucket shops", which gave access to trading on the Chicago Board of Trade, via telegraph, to people who wouldn't previously have traded. In the view of some at the time, this led to speculation and "gambling". Regulators were wrestling with the fact that only 1% of CBOT trades resulted in actual delivery of goods - the rest were purely financial transactions and therefore arguably speculative. It reminds me of the current debate around the "social usefulness" of high-frequency trading.

European equities trading has changed a lot. Vodafone, a UK-listed stock, now has only about 30% of its average European daily volume traded on the London Stock Exchange (LSE); the rest is traded on alternative venues across Europe. However, Xavier Rolet, CEO of the LSE, believes there's a long way to go. He stated that "the European equities market remains anaemic when compared to the US". Volumes, adjusted for relative market capitalisation, are about 15% of those in the US.

Regulation of European markets is a thorny issue. Regulation is fragmented, as is the market itself. CESR - the Committee of European Securities Regulators, the nearest Europe has to a single regulator - is taking evidence on a whole range of issues and will recommend a set of reforms to the European Commission in July this year. These recommendations will relate to post-trade transparency, information quality, and enhanced information about systematic internalisers and broker crossing systems. CESR is also looking at other issues such as algorithmic trading and co-location. Legislation will follow towards the end of 2010.

Equity markets are in a sensitive place. There's still more deregulation to do and more competition to be encouraged, and yet, with sentiment as it is, regulators may decide to introduce more rules and regulations that prevent this from happening. The CESR proposals will be about "transparency, transparency, transparency" - as part of this, we believe that more real-time market monitoring and surveillance by all participants is key to bringing back confidence in the markets and ensuring that draconian rules don't have to be introduced.

Emerging markets were discussed in one session, in which Cathryn Lyall of BM&FBovespa in the UK talked about Brazil in particular. We've seen Brazil become a pretty significant market recently: not only has demand for all Progress products grown substantially, but Apama is now being used by 18 clients for algorithmic trading of both equities and derivatives. Brazil is the gorilla of the Latin American region, accounting for 90% of cash equities and 95% of derivatives business in Latin America, with 90% of Brazilian trading done on exchange. Brazil emerged largely unscathed from the credit crunch, and it's taken only 2-3 years to achieve a level of trading infrastructure that took perhaps 10-15 years to evolve in the US and Europe. More still needs to happen. Although the regulatory regime has an enviable reputation, it is moving slowly: concerns regarding naked and sponsored access are holding up the liberalisation that would bring DMA and co-located access to the equities market, something which is already in place for derivatives.

So, that's what I saw as highlights from the day. Tradetech seems, still, to be the place the whole industry gathers.

Tuesday, April 20, 2010

Predictions for increased transparency in Capital Markets

Posted by Giles Nelson

It is my view that one of the most significant causes of the global financial crisis was a lack of transparency in financial markets. Put simply, no one - not regulators, not market participants - knew the size of certain derivatives markets (such as credit default swaps), who held what positions, or what the consequences of holding those positions could be. If financial reform brings nothing else, it should at least hold banks accountable for the business they conduct, and that means full disclosure and constant monitoring by responsible regulators.

This would help provide the basis for preventing future crises. No matter how inventive financial products may become, if regulators have complete and detailed information about financial markets and banks’ activities there, better assessments of risk can be made. This means that, if necessary, banks’ activities can be reined in through higher capital requirements or similar measures. Simply limiting banks’ ability to conduct certain business is a blunt instrument that does not resolve the lack of transparency and will likely hamper economic growth.

Market transparency takes many forms, and the form related to electronic trading is particularly relevant here. I therefore predict that regulators will require banks to implement stronger pre-trade risk mechanisms. Regulators such as the FSA and SEC will ultimately bring in new rules to mitigate, for example, the risk of algorithms ‘going mad’. This is exemplified by Credit Suisse, which was fined $150,000 by the NYSE earlier this year for “failing to adequately supervise development, deployment and operation of proprietary algorithms.”

Furthermore, volumes traded via high frequency trading will increase, although at a much slower pace than last year, and at the same time the emotive debates about high frequency trading creating a two-tier system and an unfair market will die down.

In addition, with regard to mid-market MiFID monitoring, greater responsibility for compliance will be extended from exchanges to the banks themselves. Banks and brokers will soon be mandated to implement more trade monitoring and surveillance technology. There will be no leeway on dark pools either: they will simply have to change, being mandated to show that they have adequate surveillance processes and technology in place, and to expose more pricing information to the market and regulators.

This year will see a definite shift to an increasingly transparent – and therefore improved – working environment within capital markets. The ongoing development of market surveillance technologies and changes in attitudes to compliance will drive this forward, creating a more open and fairer marketplace for all.