Thursday, March 29, 2012

Today in Event Processing

Posted by The Progress Guys

In “Therapy for Toxic FX Order Flow”, Dan Hubscher discusses the challenges that foreign exchange markets are facing due to high frequency and algorithmic trading. He explains that the rise in order cancellation rates has led venues to discourage excessive cancellations through increased controls, as well as to offer incentives to those who have high fill ratios.


Read Dan’s full commentary here.

 

Thursday, December 01, 2011

Today in Event Processing

Posted by The Progress Guys

Dr. John Bates lists his top nine predictions for the financial markets in 2012. No. 1: a financial institution will take a billion-dollar hit. The post focuses particularly on the effects of regulation. To find out what else is in store for regulation, fraud and market manipulation, check out the full post.


Friday, November 11, 2011

Can market surveillance help to keep traders on track?

Posted by Richard Bentley

By Richard Bentley, Vice President, Capital Markets, Progress Software

There’s no doubt that today’s high-speed capital markets and cross-product, cross-market trade volumes mean regulation struggles to keep up with changes in the market. MiFID II is an example of a financial regulatory directive that is seen by many as lacking real detail and remaining open to interpretation - and misinterpretation. In a panel discussion at the TABB Group Trading Surveillance event in London last Wednesday evening, industry experts agreed that, in Europe at least, few financial services firms are afraid of regulators.

So, as many new regulations remain woolly, ignored or yet to be implemented - or, in the case of ESMA (the European Securities and Markets Authority), the regulation is simply statements of clarification - the panel was asked how surveillance and risk are going to be managed moving forward. Questions were also raised about the regulatory burden in the future and whether those outside of the "Big Five" would be able to resource the demands for growing compliance departments. Will this lead to an uneven playing field?

According to TABB Group, new compliance costs are estimated at between €512 and €732 million, with ongoing costs of between €312 and €586 million. But while regulators are still determining what regulation will look like, the need for market surveillance is undiminished. Traders made about €13.3 billion ($18.2 billion) from market manipulation and insider dealing on EU equity markets in 2010, according to an EU Commission study. With some arguing that firms can only do so much to survey markets themselves as trades cross multiple brokers and gateways, the panel discussed the need for fragmented market data to be brought together in a consolidated tape and surveillance performed at an aggregate, market-wide level.
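To make the consolidated tape idea concrete, here is a minimal Python sketch (not any particular vendor’s implementation) that merges two already time-ordered venue feeds into a single, timestamp-ordered stream suitable for market-wide surveillance; the venue names and trades are invented:

    import heapq

    # Illustrative trade prints: (timestamp, venue, symbol, price, quantity).
    feed_a = [(1.000, "VenueA", "XYZ", 10.01, 100), (1.007, "VenueA", "XYZ", 10.02, 50)]
    feed_b = [(1.003, "VenueB", "XYZ", 10.01, 200), (1.005, "VenueB", "XYZ", 10.00, 75)]

    # heapq.merge lazily interleaves the time-ordered feeds into one
    # consolidated, timestamp-ordered tape.
    for trade in heapq.merge(feed_a, feed_b, key=lambda t: t[0]):
        print(trade)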

With respect to high frequency trading, there was discussion and agreement that pre-trade checks should be built in and that regulators should be feared, as they are in some Asian markets, where market participants adopt a mindset that constantly asks "will I be allowed to trade today?" That "fear factor" is key, and there isn’t fear of regulation yet in Europe.

The timeliness of market surveillance was discussed with the panel suggesting that transactions should be monitored retrospectively, but also in real-time as they happen. Clearly, there’s still a role for historic analysis of the market as some abuse takes place over an extended period of time and new abuse scenarios are discovered which can then be applied to historical data. It’s a little like having your DNA stored on file for a time in the future when forensic techniques improve. But there is also no doubt that the need for real-time surveillance to spot manipulation as it happens can be a significant factor for organisations looking to protect themselves and the market, which is one of the reasons it is mandated by Dodd-Frank and MiFID II.
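As a rough illustration of the real-time half of that story, the Python sketch below flags an unusually fast price move within a rolling one-second window. The thresholds are invented, and a real surveillance platform would run many such rules (alongside the historical analysis described above) side by side:

    from collections import deque

    class RealTimeSurveillance:
        def __init__(self, window_seconds=1.0, max_move_pct=2.0):
            self.window = window_seconds
            self.max_move = max_move_pct
            self.ticks = deque()  # (timestamp, price) pairs within the window

        def on_trade(self, timestamp, price):
            self.ticks.append((timestamp, price))
            # Drop ticks that have aged out of the rolling window.
            while timestamp - self.ticks[0][0] > self.window:
                self.ticks.popleft()
            oldest_price = self.ticks[0][1]
            move_pct = abs(price - oldest_price) / oldest_price * 100
            if move_pct > self.max_move:
                return f"ALERT: {move_pct:.1f}% move within {self.window}s"
            return None

    monitor = RealTimeSurveillance()
    monitor.on_trade(0.10, 100.00)
    print(monitor.on_trade(0.75, 97.50))  # a 2.5% move in 0.65s raises an alert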

Finally, the panel discussed how turbulent markets and highly publicised breaches of banking controls have demonstrated the importance of protecting market integrity. So while an increase in the complexity of market surveillance inevitably leads to an increase in cost, the panel felt that the punitive and reputational risks associated with surveillance failures justify the business case for improving compliance training, processes and technology.  After all, just as you wouldn’t expect the police to prevent all crime by themselves, it’s clear that investment is needed in surveillance technology to give the regulators a helping hand.

Wednesday, September 14, 2011

Is Revolution the Path to Transparency?

Posted by Dan Hubscher

Revolutions are proliferating. When you watch a revolution happening elsewhere, political or otherwise, it’s a good time to contemplate the revolution in your own history, or in your future. There are few among us who can’t point to one or the other. One of the common drivers is the fear that something is happening where we can’t see it happen, and we want transparency - of process, of government - of whatever seems to be wrong.

The capital markets globally are experiencing a similar revolution now with regulatory change, and the current climate threatens to create a revolt as well. Market participants may push back on reforms to the point of creating a new state of stress. Either way, the future presents very real threats to companies that aren’t prepared. We’re observing a vast expansion of global rulemaking, and a coming deluge of data - especially in the derivatives markets. It’s very expensive and distracting to fix problems after the fact, so we need to act now. “Hope is not a strategy” - as famed (American) football coach Vince Lombardi is often said to have put it.

In an open letter to Barack Obama published on January 23, 2009, Benjamin Ola Akande advised, "Yet, the fact remains that hope will not reduce housing foreclosures. Hope does not stop a recession. Hope cannot create jobs. Hope will not prevent catastrophic failures of banks. Hope is not a strategy."

Now we have the Dodd-Frank Act in the U.S., and MiFID II and EMIR in Europe, all preceded by the de Larosière Report (EC, 2009), Turner Review (FSA, 2009), Volcker Report (G30, 2009), the G20 declarations of February 2009, Financial Stability Forum Report (FSF, 2009), IMF Report (2009), Walker Review (UK, 2009), Basel/IOSCO reviews… the list goes on. And the rest of the world is watching, waiting, for another revolution. The intended scope of the most recent reforms seems almost to be a panacea, and transparency is the first step.

The next Revolution is happening in Boston, fittingly.  Progress Revolution 2011, from September 19th through the 22nd, offers the chance to learn from industry innovators on how they have successfully tackled these challenges within the capital markets.  Customers including PLUS Markets and Morgan Stanley will be there to share success stories.  And Kevin McPartland, Principal at the TABB Group, will be there too.  I’ve included a sneak peek into Kevin’s “Path to Transparency” below.

According to the New York Times, at the Republican Convention in 2008 Rudy Giuliani said, while contemplating Barack Obama’s candidacy, “… ‘change’ is not a destination ... just as ‘hope’ is not a strategy.” Rudy will be speaking at our Revolution too. Will you be there? It will be a lively conference - I hope that you can join us!

-Dan

The Path to Transparency

By Kevin McPartland, Principal, TABB Group

Managing the vast quantities of data born into existence by the Dodd-Frank Act and related regulation will present a challenge in the post-DFA environment, but collecting and producing the required data is just the tip of the iceberg. The ability to analyze and act on that data is what will separate the survivors from the winners. This is already true in many other parts of the global financial markets, but the complexities inherent in swaps trading, coupled with the speed at which these changes will take place, create unique challenges. Spread this across all five major asset classes and three major geographies, and the complexities become more pronounced.

Margin calculations are proving to be one of the biggest concerns for those revamping their OTC derivatives infrastructure. In a non-cleared world, dealers determine collateral requirements for each client and collect variation margin on a periodic schedule - in some cases once a month, and in other cases once a year. When those swaps are moved to a cleared environment, margin calculations will need to occur at least daily. The result is an upgrade of the current batch process with dozens of inputs to a near-real-time process with hundreds of inputs. Whereas before major dealers could perform margin analysis, client reporting and risk management in a single system, those systems now need to operate independently within an infrastructure that provides the necessary capacity and speed.
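A hedged sketch of that shift, in Python: instead of a periodic batch run, each new valuation input re-marks the affected position immediately. The single-factor margin formula, fund names and numbers are deliberately simplistic placeholders for the hundreds of real inputs mentioned above:

    # account -> (notional, last mark); illustrative cleared swap positions.
    positions = {
        "fund_a": (100_000_000, 101.25),
        "fund_b": (-40_000_000, 101.25),
    }

    def on_new_mark(account, new_mark):
        """Recompute variation margin for one account as soon as a mark arrives."""
        notional, last_mark = positions[account]
        # Simplified: the margin call is the change in mark scaled by notional.
        variation_margin = notional * (new_mark - last_mark) / 100
        positions[account] = (notional, new_mark)
        return variation_margin

    print(on_new_mark("fund_a", 101.40))  # 150000.0 due on this price move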

The trading desk will require a similar seismic shift, as flow businesses will provide liquidity across multiple trading venues to an expanding client base. Most major dealers are at some stage of developing liquidity aggregation technology intended to provide a single view of liquidity across multiple swap execution venues. Creating this type of virtual order book requires receiving multiple real-time data feeds and aggregating the bids and offers in real time.
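A minimal sketch of such a virtual order book, assuming three invented venues and static quotes standing in for real-time feeds:

    # Per-venue bids/offers as (price, size); venue names are invented.
    books = {
        "SEF1": {"bids": [(99.50, 10)], "offers": [(99.60, 5)]},
        "SEF2": {"bids": [(99.52, 8)], "offers": [(99.58, 12)]},
        "SEF3": {"bids": [(99.49, 20)], "offers": [(99.61, 7)]},
    }

    def virtual_book(books):
        """Merge per-venue quotes into one view, best-priced first."""
        bids, offers = [], []
        for venue, book in books.items():
            bids += [(price, size, venue) for price, size in book["bids"]]
            offers += [(price, size, venue) for price, size in book["offers"]]
        bids.sort(key=lambda q: -q[0])   # highest bid first
        offers.sort(key=lambda q: q[0])  # lowest offer first
        return bids, offers

    bids, offers = virtual_book(books)
    print("best bid:", bids[0], "best offer:", offers[0])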

Furthermore, rather than comparing model-derived prices to the last trade price to produce quotes, inputs from SEFs, CCPs, SDRs, internal models, third-party models and market data providers will be required inputs to real-time trading algorithms once reserved for exchange-traded derivatives.

Providing clients with execution services presents other challenges. Executing on multiple platforms also means tracking and applying commission rates per client per venue in real time. Trade allocations also complicate the execution process. In the bilateral world a big asset manager can do a $100 million interest rate swap and spread that exposure across multiple funds as it sees fit. Under the DFA, the executing broker must know which funds are getting how much exposure. Account allocation in and of itself is not new, but cost-averaging multiple swap trades and allocating the right exposure at the right price to the proper account presents complex challenges, especially in a near-real-time environment.
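To illustrate the cost-averaging problem, here is a toy Python example: a block swap executed in three clips at different prices, allocated across funds at the volume-weighted average price. The fills, shares and fund names are all invented:

    # Fills for one block swap: (notional, price); all numbers invented.
    fills = [(40_000_000, 100.10), (35_000_000, 100.14), (25_000_000, 100.12)]
    allocations = {"fund_1": 0.50, "fund_2": 0.30, "fund_3": 0.20}  # target shares

    total_notional = sum(n for n, _ in fills)
    # Volume-weighted average price across the clips.
    avg_price = sum(n * p for n, p in fills) / total_notional

    for fund, share in allocations.items():
        print(f"{fund}: {total_notional * share:,.0f} @ {avg_price:.4f}")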

Risk management, compliance and back-testing data will also require huge increases in processing power, often at lower latencies. Risk models and stress tests, for example, are much more robust than they were before the financial crisis, requiring a considerably higher amount of historical data.

Compliance departments now must store the requisite seven years of data so they can reconstruct any trade at any moment in the past. This is complicated enough in listed markets, where every market data tick must be stored, but for fixed-income securities and other swaps, storing the needed curves means that billions of records must not only be filed away but be retrievable on demand. Similar concerns exist for quants back-testing their latest trading strategies: it is not only the new data being generated that must be dealt with. Existing data, too, is about to see a huge uptick in requirements.

In the end these changes should achieve some of the goals set forth by Congress when it enacted Dodd-Frank - increased transparency and reduced systemic risk. The road there will be bumpy and expensive, but the opportunities created by both the journey and the destination will outweigh any short-term pain.

This perspective was taken from the recent TABB Group study Technology and Financial Reform: Data, Derivatives and Decision Making.

Wednesday, August 31, 2011

Progress Revolution Session Sneak Peek: Transforming Your Bank to Become Operationally Responsive

Posted by Richard Bentley

Within the banking industry today, we’re observing a significant expansion of the choices available to customers, and consequently loyalty to the primary bank is eroding. What specifically is driving attrition?

Results from a 2011 customer survey conducted by Capgemini show that “quality of service” and “ease of doing business” are the factors that most strongly influence a customer to both initially select AND leave a bank.

What does this mean for a bank conducting business in the current competitive landscape? It means that connecting with your customer at the right time and right place is critically important. The question that remains is exactly how this seamless connection between the organization and the customer is achieved in the real world.

Immediacy is the critical overlay, and immediacy is achieved through real-time data capture and reporting translated into highly targeted, more relevant offers. Real-time data capture and responsiveness put you in a position to move fluidly with your customer base. There’s no lag time in which a lucrative window of opportunity closes before you can act.

It’s all about connecting with customers on their terms, and this is best achieved through automating the process.  For instance, if you have a customer that has deposited a large check outside the parameters of their typical deposit cycles, you can present an offer for a high yield savings account. This type of offer works well when you connect with your customer immediately because the large deposit is still top of mind, and other choices for investment have likely not yet been identified.
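As a sketch of how such a rule might be automated, consider a deposit event that falls well outside the customer’s usual pattern; the thresholds, names and offer below are invented:

    import statistics

    def on_deposit(customer, amount, history):
        """history: the customer's recent deposit amounts."""
        typical = statistics.mean(history)
        spread = statistics.stdev(history)
        if amount > typical + 3 * spread:  # unusually large deposit
            send_offer(customer, "high_yield_savings")

    def send_offer(customer, product):
        print(f"offer {product} to {customer} now, while the deposit is top of mind")

    on_deposit("alice", 25_000, [900, 1_100, 1_050, 980, 1_020])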

Essentially you’re able to capitalize on your preemptive knowledge that this customer is likely in the market for a savings account product BEFORE the customer is able to research competitive options. 

By anticipating your customers’ needs you’re delivering on the most important factor in retention – ease of doing business.

When you have the tools in place to serve up the right offering to the right customer at the right time, everyone wins.  You don’t want your customer to jump ship because they think you don’t have what they need.  And your customer really doesn’t want to spend their valuable time and energy searching for a solution that they could quickly and easily secure from you.

The beauty of Responsive Customer Engagement is that it evolves operational efficiency into operational agility.  It’s not enough to streamline processes and collect data – it’s about how quickly and effectively you can translate data capture and analysis into real-time, relevant communication with your customers. Believe me, if you’re not focused on real-time customer engagement, it’s highly likely that your competitors are.

I hope that you can join me in Boston on September 21st at Progress Revolution Boston 2011 to discuss how to leverage software solutions to not only improve operational efficiency but also increase customer engagement and loyalty. I’m going to be covering cross-sell and up-sell marketing, payment management, customer on-boarding and other timely topics, and I look forward to a lively exchange of ideas!

Wednesday, August 17, 2011

DON'T PANIC! Maybe a Depressed Robot is to Blame

Posted by Dan Hubscher

The market frenzy that whiplashed global markets last week has been - unsurprisingly - blamed on high frequency trading. Headlines such as these fuelled the fire:

  • "The Trading Game is Causing Market Panic" - The Atlantic.
  • "High speed traders are exacerbating volatility" -  the Financial Times.
  • "High Frequency Traders Win in Market Bloodbath" - the Wall Street Journal.
  • "High Frequency Trading May Be Making Things Worse On Stock Markets" - HuffPo.
  • "Black box trading is the real hazard to markets, says Lord Myners" - The Guardian

As you can see, the media, brokers, and investors were quick to point fingers at HFT and, in the meantime, the SEC has sent out subpoenas to high-frequency trading firms in relation to last year's flash crash probe, according to the Wall Street Journal.

But does HFT cause panic? Panic, which is an emotional reaction to fear, is an inflammatory word. Panic is contagious, especially in markets. Because panic is ultimately a human emotion, it may not be the most accurate word to describe the moves of cold-blooded trading algorithms. But when you see trading robots hit stop-loss after stop-loss, triggering buying or selling at light speed over and over again, the voluminous activity is bound to mimic actual panic.
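A toy simulation makes that cascade mechanic visible: each triggered stop sells into the market, pushing the price down far enough to trip the next stop. All the numbers here are invented:

    price = 100.0
    stops = sorted([99.5, 99.0, 98.4, 97.9], reverse=True)  # resting stop-loss levels
    impact_per_stop = 0.6  # assumed price impact of each forced sale

    price -= 0.6  # an initial shock takes the price to 99.4
    for stop in stops:
        if price <= stop:
            print(f"stop at {stop} fires: {price:.1f} ->", end=" ")
            price -= impact_per_stop
            print(f"{price:.1f}")

    print(f"a 0.6 point shock became a {100.0 - price:.1f} point drop")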

The markets today make the struggle between human investors versus high frequency trading firms look like an intergalactic battle. But while the humans are sitting behind the wheel of a broken-down old car, HFTs are soaring on a spaceship equipped with an Infinite Improbability Drive. 

And the algorithms that dominate the markets - along with their HFT owners - are getting a bad reputation. They are being portrayed as the villains in this uber-sensitive economic environment.

The current dialog casts them as Vogons, the bad guys in the brilliant Hitchhiker’s Guide to the Galaxy (BBC series and books by Douglas Adams). The Vogons are described as "one of the most unpleasant races in the galaxy. Not actually evil, but bad-tempered, bureaucratic, officious, and callous."

High frequency trading is not actually evil either; it is the natural development of quantitative and algorithmic trading, which grew out of the new market structure and automation. Although quantitative trading takes the emotion OUT of trading, something has gone wrong. The effect on volatility is a matter of debate; perceptions around this question are creating panic and upsetting investors.

Investors boycotted HFT in the week ended August 10th, dragging $50 or $60 billion out of the stock market and walking it over into safer money market accounts. So what can be done to restore investor confidence? One bemused market participant said last week that what was needed was an algorithm with a built-in panic button. Instead of tumbling after the market when a stop is hit, the algo could pause and take a breath and think about what it is doing.  But would that lessen volatility? Because, again, thinking is a human activity.
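That panic button might look something like the hedged Python sketch below: when a stop is hit, the algo pauses, re-checks the market, and stands aside if prices are still falling fast. The pause length and threshold are purely illustrative:

    import time

    def on_stop_triggered(symbol, get_price, pause_seconds=2.0, max_drop_pct=1.0):
        """Pause and re-check instead of reacting to a stop instantly."""
        price_at_stop = get_price(symbol)
        time.sleep(pause_seconds)           # take a breath
        price_now = get_price(symbol)
        drop_pct = (price_at_stop - price_now) / price_at_stop * 100
        if drop_pct > max_drop_pct:
            return "HOLD"                   # still falling fast: stand aside
        return "SELL"                       # market has steadied: proceed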

Ford Prefect, one of the protagonists in Hitchhiker’s, said about Vogons: "They don't think, they don't imagine, most of them can't even spell, they just run things." Like the Vogons, algorithms are not great thinkers. And, with upwards of 70% of equities trading being done by algorithms, they do pretty much run things.

But blaming HFT (or Vogons) for market swings is like blaming social media for the riots in England. Yes, it was easier to get everyone together for a looting-and-burning session by using Twitter and Facebook, which made the situation worse.  And social media also helped people organize the clean-up. Would the same have happened, differently, without social media?  Likely yes.  HFT and social media have another thing in common - because they are automated they can easily be monitored and controlled, if regulations allow it. What if authorities could recognize the electronic early signs of a riot, send a warning to citizens, and arrive at the scene faster?  In the UK, the government is considering blocking Twitter and Facebook during a major national emergency. 

I am not suggesting that the government or the regulators should block HFT when markets run riot. And our markets are still tinkering with just the right recipe for actual panic buttons; witness the dislocations of the May 6th flash crash and the ensuing coordinated circuit breaker and other limit debates.

There are a number of government investigations - including in the US and the UK - into high frequency trading practices. Some are trying to shine a light on algorithms and determine whether their behavior is perhaps predatory or disruptive. Lifting the veil of secrecy on algorithmic and high frequency trading could be a way of regaining investor confidence. But that is a double-edged sword; if firms offer up too much about the methodologies and strategies behind their algos, they could also be giving away their secret formulae.

Giving regulators a balanced peek under the covers might help. Regulators already can and do deploy market surveillance and monitoring tools to help prevent market abuse at high speed. That may be a difficult pill to swallow, but the Hitchhiker’s Guide would tell us – DON’T PANIC.  As many damning studies as there are, others come to a different conclusion ("Study gives backing to high-speed trading" – Financial News).  If something isn't done proactively - and soon - then investors, politicians, regulators and other nay-sayers are going to be calling for an end to HFT completely.

As Marvin the depressed robot in Hitchhikers said: "I've calculated your chance of survival, but I don't think you'll like it." 

-Dan

 

Wednesday, July 20, 2011

HFT Volume: Cool Liquidity or Just Hot Air?

Posted by Dan Hubscher

There have been some heated public debates recently about the kind of liquidity that is being provided by high frequency trading. In an earlier post on Tabb Forum, Candyce Edelen from Propel Growth asks whether HFTs provide liquidity or just volume.

Liquidity is bread and butter for firms associated with financial markets; it provides an easy way to get in and out of markets while - hopefully - making money. It is essential to investors who want to know that they can get in or out of a market when they choose. Therefore liquidity is good. (The icing on the cake would be over the counter derivatives markets where firms can use opacity to their profitable advantage.)

Liquidity, from the word liquid, depicts a flow of activity that is smooth and constant.  Investopedia says that liquidity is characterized by a high level of trading activity.

It is clear that HFT provides that high level of trading activity. What is less clear is whether it is really liquidity. High quote-to-cancel rates, with pre-programmed cancellations sometimes happening in 2 to 5 milliseconds, lead to perceived rather than actual volume. Liquidity only occurs when buyer and seller can get together and deal. If a sell quote disappears before the buyer is even aware of it, what advantage does that offer the buyer?
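One crude way to separate perceived from actual volume is to compare fill ratios across participants, as in this illustrative Python fragment (the firms, events and 50% threshold are invented):

    from collections import Counter

    events = [  # (participant, event_type), trimmed for brevity
        ("firm_x", "quote"), ("firm_x", "cancel"), ("firm_x", "quote"),
        ("firm_x", "cancel"), ("firm_x", "quote"), ("firm_x", "fill"),
        ("firm_y", "quote"), ("firm_y", "fill"), ("firm_y", "quote"), ("firm_y", "fill"),
    ]

    counts = Counter(events)
    for firm in ("firm_x", "firm_y"):
        fill_ratio = counts[(firm, "fill")] / counts[(firm, "quote")]
        label = "volume, not liquidity?" if fill_ratio < 0.5 else "real liquidity"
        print(f"{firm}: fill ratio {fill_ratio:.0%} -> {label}")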

I applaud Edelen for asking outright whether HFTs increase liquidity or just volume.  The “just volume” replies were no surprise, but some thoughtful counter-arguments appeared in the comments to support the liquidity argument as well.  Interesting, but no consensus.  So how we expect regulators to figure it out is anybody’s guess.

There is also a lack of clarity as to whether HFTs should be market makers, thereby ensuring liquidity even in volatile markets. Edelen asks if we need new obligations for the new breed of market makers. The May 6th flash crash demonstrated what happens when uncertainty and extreme price movements occur - the market makers flee. Computerized trading algorithms quit the marketplace even when their owners were designated as market makers, thus steepening the fall. But, since traditional market makers have all but disappeared, HFT market makers are what we have left. If you take them away, what remains?

In the flurry of comments to Edelen's article, it seems that on this at least the pundits are stumped. They remarked on what the obligations and incentives are or are not, but could not seem to find an answer to what they should be. Instead of focusing on market makers, they drifted towards trying to redefine what the market itself ought to be - with little success. This may give us a clue as to why regulators and legislators can’t figure it out either.

And do we need a level playing field as some are calling for? If everyone gets the same data at the same time and can only trade at the same speed, will that solve the problems? Intuitively people seem to feel the answer is “yes” but the commentators could not even agree what the playing field or fields would look like.  Most seemed to shy away from trying to put the genie back in the bottle and going back to the way things were.  Dark pools are another sore subject, with some claiming that the growing level of volume traded in dark pools (possibly 30% of volume) is due to the fear that HFTs are creating in the lit markets.

High frequency trading is here to stay, and debate over its role and future is healthy. But are all HFTs the same, asks Edelen? Here finally we seem to have some consensus from the pundits, who appear to have argued themselves into this position:  There is Good HFT and there is Bad HFT.  We can’t define it, but we think we know it when we see it. This is the same as with any tool or technology or trading model – there are beneficial and irresponsible or harmful uses; the line between them is blurry, and the distinction springs from the user of the tool, not the tool itself.

So what are we to do with the “Bad HFT”? This seems to be the real question. So far we’ve heard:

  • The regulators haven’t figured it out and can’t - regulators worldwide have managed to define a small number of abusive trading scenarios, and demand compliance, but they can’t hope to cover it all.
  • The markets themselves haven’t figured it out and don’t have much incentive to do so.
  • The market participants can't come to a consensus - and probably won’t.
  • Wishing HFT away won’t fix it - either by slowing it down with a Tobin Tax, or a liquidity fee, or by outlawing it altogether - that is throwing out the baby with the bath water, and someone will always find a way to abuse what remains.

What we are left with is the capitalist mantra of “buyer beware.”  But there are ways that the buyer can equip himself to join the 21st Century and - at the same time - protect himself from high frequency errors, fraud or predators.  

As we’ve often said, high speed trading needs high speed controls. Like a car racing down the highway, some controls will come from the lurking cop (the regulator). But the driver needs to make sure the brakes work as well as the gas pedal in order to avoid an accident. Of course, the speed itself raises the question of whether HFT needs to have a speed limit in order to stay under control. We don't think a speed limit is necessary.

Instead, the issue is detecting abuse and mistakes, regardless of speed or source.  If you have the market surveillance and monitoring tools that can alert you to problems occurring at high speed - if you can monitor your own car, and the cars on the road around you and see a problem coming - an accident can be avoided. And if there is no accident, despite how fast the car is going, then the cops won't have to be called out.
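For a flavour of what those brakes might look like, here is a minimal pre-trade check in Python; the limits are invented, and a production system would check far more (credit, positions, duplicate orders and so on):

    # Sketch of the "brakes": simple pre-trade checks applied to every order.
    MAX_ORDER_QTY = 10_000
    MAX_NOTIONAL = 1_000_000
    PRICE_COLLAR_PCT = 5.0  # reject orders too far from the last traded price

    def pre_trade_check(order, last_price):
        if order["qty"] > MAX_ORDER_QTY:
            return "REJECT: quantity exceeds limit (fat finger?)"
        if order["qty"] * order["price"] > MAX_NOTIONAL:
            return "REJECT: notional exceeds limit"
        if abs(order["price"] - last_price) / last_price * 100 > PRICE_COLLAR_PCT:
            return "REJECT: price outside collar"
        return "ACCEPT"

    print(pre_trade_check({"qty": 500, "price": 101.0}, last_price=100.0))  # ACCEPT
    print(pre_trade_check({"qty": 500, "price": 120.0}, last_price=100.0))  # REJECT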

-Dan

 

Wednesday, June 22, 2011

A foray into Beijing

Posted by Giles Nelson

Beijing was the last stop on my three city Asian tour and, from a personal perspective, the most exciting one as I’d never visited mainland China before.

China’s seemingly inexorable economic rise has been well documented. In the last 20 years, China’s GDP growth has averaged over 9%. As I travelled from the airport into Beijing’s central business district I saw few older buildings. Virtually everything, including the roads, looked as if it had been built in the last 10 years.

The Chinese stock market is big. In terms of the total number of dollars traded, the combined size of the two stock exchanges, Shanghai and Shenzhen, is approximately double that traded in the next biggest Asian market, Japan. The increase in stock trading has been very rapid. Trading volumes on Shanghai and Shenzhen have risen approximately 20-fold in the past 5 years, although there has been significant volatility in this rise.

The domestic algorithmic trading market is nascent. Currently, intra-day trading in company shares is not allowed. It is therefore in the recently established futures markets that algorithmic and high-frequency trading are taking place. No figures exist on the proportion of trading done algorithmically in China currently, but I’m going to estimate it at 5%.

I was in Beijing to participate in the first capital markets event Progress has held there. Although Shanghai is the finance capital of China, we chose to hold the event in Beijing to follow up on previous work we'd done there. In the end, we had about 60 attendees from domestic sell-side and buy-side firms, which was a great result considering the relatively low profile Progress has at present in this market. There was optimism and an expectation that algorithmic trading has a bright future in China.

I believe it's a practical certainty that the Chinese market will adopt algorithmic and high frequency trading. In every developed market a high, or very high, proportion of trading is done algorithmically and, although different regulations and dynamics make each market unique, nothing except an outright ban will prevent widespread adoption in every market in time. Liberalisation in China is occurring. For example, stock index futures are now traded, exchanges are supporting FIX, short-selling has been trialled and it is now easier for Chinese investors to access foreign markets. Also, earlier this year, the Brazilian exchange, BM&FBovespa, and the Shanghai exchange signed an agreement which may result in company cross listings. Only some of these changes support electronic trading growth directly but all are evidence that the liberalisation necessary to support such growth is happening. Inhibitors remain: no intra-day stock trading, restrictions on foreign firms trading on Chinese markets thus preventing competition and knowledge transfer from developed markets, and tight controls on trading in Renminbi. The Chinese regulators will continue to move cautiously. 

The question is not if, but when. We expect to sign our first Chinese customers soon. China is becoming a very important blip on the radar. 

 

Friday, June 17, 2011

Still big room for growth in Japan

Posted by Giles Nelson

And from Mumbai on to Tokyo. In so many ways, a bigger contrast between cities is difficult to imagine. 

Japan has, of course, had a tough time of it recently. Not only have the recent earthquake and tsunami knocked Japan back, but Japan has also had a long period of relative economic stagnation compared to other Asian economies. Its development in algorithmic trading also sets it apart from other developed economies, due to the relatively low proportion of trading which is done algorithmically.

There’s little consensus on what this proportion is, however. In a recent report, Celent, the financial market analyst firm, reported that around 25% of trading was algorithmic in 2010. Figures from the Tokyo Stock Exchange (TSE) report that around 35% of orders submitted to the exchange come from co-location facilities - it is reasonable to assume that nearly all of these could be regarded as “algorithmic”. From my conversations during the days I was in Tokyo with people working at exchanges, sell-side firms and our customers and partners, I’m going to put the figure at between 40% and 50%.

That means there’s a lot of room for growth when you consider that the proportion of securities traded algorithmically, in one way or another, in the US and Europe nears 100%. One inhibitor to growth has now been removed. In 2010, the TSE launched a new exchange platform, Arrowhead, which reduced latency from up to two seconds down to around 5 milliseconds. In other words, the TSE is now “fit for purpose” for algorithmic trading and, in particular, for high frequency trading. Previously, with latencies being so long, high frequency firms who, for example, wanted to market-make on the TSE simply weren’t prepared to take the risk of the exposure and uncertainty that such high latencies bring. Since Arrowhead's launch in January 2010, and according to the TSE’s own figures, the total number of orders on the TSE has risen by a modest 25%, but the proportion of orders submitted from co-location facilities has more than doubled.
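A back-of-envelope calculation suggests why that latency drop matters so much to market makers. Assuming, purely for illustration, that short-horizon volatility scales with the square root of time and runs at 0.05% per second, the expected adverse move against a resting quote shrinks roughly twenty-fold:

    import math

    sigma_per_second = 0.05  # % per sqrt(second); an invented, illustrative figure

    for latency_s in (2.0, 0.005):  # pre- and post-Arrowhead latencies
        expected_move = sigma_per_second * math.sqrt(latency_s)
        print(f"latency {latency_s * 1000:.0f} ms -> ~{expected_move:.4f}% adverse move")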

Progress exhibited and spoke at Tradetech Japan, and we were joined on our stand by our partner, Tosho Computer Systems, with whom we’re working on a service in Japan that we will be launching later this year. There’ll be more news on that nearer the time. Attendance-wise, Tradetech was down on its peak in Japan in 2008, but up on previous years - a reflection of the renewed interest in trading technology generally.

Market surveillance was one of the key topics that came up in Tradetech Q&A and panel discussions. This is common across pretty much any market now, with the essential question being: how can markets be kept safe as they get faster and more complex? Some say there should be restrictions, and whilst circuit breakers, insistence on pre-trade risk checks and the like are important, over-emphasis on “the dangers” can hold markets back. Progress’ view is that regulators and exchanges should all move towards real-time market surveillance. (Find more on this here and here.)

There’s a lot of emphasis at the moment in European and US markets on OTC derivatives regulation and the move of trading in such instruments onto exchanges. Japan is relatively advanced in this regard with regulation requiring many domestic OTC derivatives to be cleared, a trend which is happening elsewhere in Asia too more quickly than in Europe and the US. 

Regionally, Japan is the biggest developed market in terms of share trading volume - twice as big in dollar terms as the next biggest, Hong Kong. But Japan is itself dwarfed now by China, and I’ll be writing about that next.

 

Wednesday, June 15, 2011

The rise of algo trading in Asia - first stop Mumbai

Posted by Giles Nelson

I’ve just completed a three city Asian tour, in Mumbai, Tokyo and Beijing. The purpose was to promote Progress’s business in capital markets, in particular both Apama and our Responsive Process Management suite. I’m going to give my impressions on those markets – first up is Mumbai and the Indian market for electronic trading.

Before I start with India though, I’d like to share an interesting piece of academic work I came across. We hear a lot about the rise of Asia economically, particularly the “big two” of India and China. Many predict that China will become the world’s biggest economy in the next 10 years and that India’s population will exceed China’s by mid-century. Both economies are predicted to grow by 8-10% in 2011. The predicted long-term shift in relative economic fortunes is nicely illustrated by some work done at the London School of Economics in 2010, a graphic from which is shown below.

  [Figure: the world’s shifting “economic centre of gravity”, from the mid-Atlantic in 1980 to the middle of Asia by 2040]
© Danny Quah, London School of Economics, October 2010

 

Professor Danny Quah calculated “the average location of economic activity across geographies on Earth”. Put another way, this is the “economic centre of gravity”; the graphic shows its mid-Atlantic position in 1980 and its predicted position in the middle of Asia by 2040. The economic power shift is on.

Back to my trip, and my first stop in Mumbai, the commercial and financial capital of India. I wrote on the growth of electronification and algorithmic trading in India last July. Since then the market has progressed. Progress has acquired its first clients in capital markets in India using Apama for trading, and the market itself has evolved and liberalised. Last year I criticised the policy of the biggest stock exchange, the National SE, for its “algo approval process”. I thought this was a poorly structured attempt to protect the markets from “rogue algos” and would stymie development. I'm pleased to say that the NSE has now relaxed this policy (the latest version of which can be found here). A further significant development is that both equity exchanges, the NSE and the Bombay SE, now allow smart order routing from co-location facilities at each exchange to one another. This was previously not allowed by the NSE. There is also evidence that the adoption of algorithmic trading is changing the way the market works - manual arbitragers are heading towards extinction and arbitrage opportunities in general are becoming more difficult to find - evidence that information flow between markets is becoming more efficient.
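Smart order routing itself is conceptually simple, as this minimal Python sketch shows: a marketable buy order goes to whichever exchange displays the better offer. The quotes here are invented:

    # Best quotes per venue; NSE and BSE are the exchanges discussed above.
    quotes = {
        "NSE": {"bid": 250.10, "offer": 250.20},
        "BSE": {"bid": 250.05, "offer": 250.15},
    }

    def route_buy(quotes):
        """Pick the venue with the lowest offer for a marketable buy order."""
        return min(quotes, key=lambda venue: quotes[venue]["offer"])

    print("route buy to:", route_buy(quotes))  # BSE, offering 250.15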

Together with CNBC, Progress held a well-attended event for the industry in Mumbai. The highlight was a panel session with the deputy CEO of the BSE, a senior member of the Indian securities regulator and a practitioner from a buy-side firm participating. There was a consensus that the continued development of algo trading is welcome in India - bringing technological innovation, putting pressure on costs, bringing liquidity and more competition. There is some caution, particularly when it comes to the unfettered use of algos. Reference was made to recent talk in the US and Europe about introducing algo “inspection” as justification for that caution. As I’ve said previously, algo inspection is inherently flawed - it is far better for markets to be protected through good pre-trade risk checks and real-time market surveillance (as discussed at length recently on this blog). The panel acknowledged that real-time surveillance is key for markets to operate well.

Despite this progress, the Indian market still faces challenges. There is obvious enmity between the two stock exchanges, which should lessen if and when the organisation behind the Multi Commodity Exchange (MCX) receives an equity trading license, something which is expected soon. The granting of such a license and the introduction of more competition into the market can only be a positive move. Trading costs for equities in India are still high. There are two equity clearers, each owned by one of the two exchanges, which do not interoperate; and foreign financial institutions need a license from the regulator to trade, as well as a local Indian legal entity - requirements which inhibit foreign firms entering the market and thus reduce competitive pressure on domestic firms.

In my view one change that India will see in the coming years is significant broker consolidation. Currently there are around 3000 brokers in the Indian market. Many of these are small and, in my opinion, will not survive in a market where access to technology will be needed to compete. Many will therefore go out of business or be forced to merge.

The market for algo trading in India is growing. Although it hasn’t reached a tipping point yet, it has all the promise to be one of the most important markets in Asia.

Next stop was Tokyo for Tradetech Japan 2011. I’ll talk about that tomorrow.
