Event Processing Technology

Tuesday, July 24, 2012

John Bates Recognized as Financial Technology Leader by Institutional Investor

Posted by Richard Bentley

Today we’d like to congratulate our own CTO John Bates on his inclusion in Institutional Investor’s Tech 50: The Difference Makers. This is the second year that John’s work with Progress Apama and Complex Event Processing has earned him a place among Institutional Investor’s top financial technology leaders. The award is especially exciting given that Progress was the highest-ranking technology provider on this year’s list.

Institutional Investor recognizes each leader for vision and agility in translating innovation into operational and competitive advantage. It cites John’s membership on the U.S. Commodity Futures Trading Commission's technology advisory committee and Progress’s contribution to performance gains. Given the economic outlook, technologists today have to “do more with less”, says Institutional Investor, and John’s work with Progress Apama and Complex Event Processing contributes to exactly that.

They also recognize Progress’s work in two geographical hot spots, China and Brazil, which has been a hot topic in the capital markets world lately. They quote Bates saying, "Countries like China and Brazil are starting to accelerate their adoption of algo trading.”

Congratulations John, we’re proud to have you on our team!



Thursday, July 19, 2012

Scooping FX Bubbles Out of a Boiling Pot

Posted by Ben Ernest-Jones

Foreign exchange trading appears to be moving from a beneath-the-radar, bank-dominated activity into the international trading limelight. There has been an explosion of new trading platforms and a wave of newer participants lately, partly thanks to new transparency afforded by Dodd-Frank and partly because of the relentless hunt for alpha. Increasingly automated, FX is also becoming the next go-to asset class for high frequency and algorithmic trading.

It wasn't always that way. As TABB Group's Larry Tabb said in an article in Wall Street & Technology: "FX has always been different. Be it that currency is a bank’s core product, be it that banks control the payments infrastructure, or be it that banks are critical in implementing Central Banks’ monetary policy, the banks have historically dominated FX."

Because FX is mainly traded via single-dealer platforms, multi-dealer platforms and interdealer marketplaces, it is fragmented in a different way from equities. Traditional trading platforms, along with a couple of the sturdier newcomers such as the multi-dealer platforms FXall (which Thomson Reuters is buying) and Currenex, have been the dominant destinations for electronic FX trading.

But now that the SEC and CFTC have clarified that forex contracts will be treated as swaps, they will join the pool of centrally cleared instruments. A whole new layer of banks, brokers and venues is already popping up as a result. FX will soon emulate the expansion, consolidation and eventual contraction of destinations that the equities markets experienced.

There will be more opportunities for market participants to trade, hedge, arbitrage and manage risk. Algorithmic strategies will dominate, attracting more and more destination venues - and then fragmentation will be the mantra. So how are traders going to position themselves to scoop the profitable FX bubbles out? It is not as easy as you would think. In many cases, what appears to be an increase in liquidity is actually an increase in “phantom” orders, as institutions advertise the same underlying liquidity across an increasing number of locations. Trading algorithms will need to be smarter, and tuned over time to counteract this as the landscape changes.
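As an illustration of the “phantom” liquidity problem, the sketch below (plain Python, not Apama code; all venue and dealer names are hypothetical) aggregates top-of-book bids across venues while counting quotes that look like the same underlying liquidity only once:

```python
# Illustrative sketch: de-duplicating "phantom" FX liquidity across venues.
# Quotes with the same dealer, price and size on several venues are assumed
# to advertise the same underlying liquidity and are counted once.
from collections import namedtuple

Quote = namedtuple("Quote", "venue dealer side price size")

def aggregate_bid_liquidity(quotes):
    """Sum bid size across venues, counting duplicated advertisements once."""
    seen = set()
    total = 0.0
    for q in quotes:
        if q.side != "bid":
            continue
        # Same dealer, price and size on another venue: likely the same liquidity.
        key = (q.dealer, q.price, q.size)
        if key not in seen:
            seen.add(key)
            total += q.size
    return total

quotes = [
    Quote("VenueA", "Bank1", "bid", 1.2501, 5_000_000),
    Quote("VenueB", "Bank1", "bid", 1.2501, 5_000_000),  # re-advertised liquidity
    Quote("VenueB", "Bank2", "bid", 1.2500, 3_000_000),
]
print(aggregate_bid_liquidity(quotes))  # naive sum would be 13m; de-duplicated: 8m
```

A real aggregator would need richer heuristics (dealer identity is rarely visible across venues), but the point stands: naively summing displayed size overstates what can actually be traded.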

Bank traders are looking for ways to handle the new world order of FX. Because their clients want to be able to trade forwards, swaps, spot and even options on the same system, banks are having to do the once-unthinkable: merge their forwards desks with their spot desks.

In the old days of voice trading, forwards and spot traders ran completely separate books and dealt with (mostly) different customers. Today clients are asking to hedge forwards and spot on the same system at the same time. Some want to trade using forward-to-spot conversions against aggregated spot prices from several platforms and some want to use aggregated forward rates directly. Some want a blending of both. The opportunities for banks are plentiful, if they can harmonize FX products, trading and hedging across trading systems successfully.

Many bank clients have seen what has happened in the equities markets, where high frequency trading and algorithmic strategies have become problematic and largely vilified. When the world’s largest interdealer broker said recently that it would tackle “disruptive” practices by high-frequency traders on its foreign exchange platform, it became clear that some of the lesser-loved equities issues were already creeping into FX markets.

The Wall Street Journal says that there are already fears of an FX "boom" reminiscent of the equities venue explosion in 2001. "Some market insiders fear the trend for highly specialized new systems aimed at separate pockets of clients could end up splitting the liquidity that underpins this $4 trillion-a-day market, making it harder to trade," said the paper. Pigeon-holing traders, whether it be by class of trader, asset class or by delivery date, only creates more fragmentation. This could equate to lower volumes (like equities), more volatility (like equities) and an increase in manipulative practices (like equities). Regulators will no doubt be watching, and new rules will be implemented even faster than has happened in equities.

In a discussion at the FX Week USA event in NYC recently one FX trading platform provider said that you need a full market ecology to provide proper efficiency. This means having all market participants operate in the same liquidity pool.  In the end the unique self-regulating properties of the FX markets mean that the market will shift towards what is best for the market - because it can. Preparation for this inevitability will determine who survives. 

Wednesday, June 22, 2011

A foray into Beijing

Posted by Giles Nelson

Beijing was the last stop on my three city Asian tour and, from a personal perspective, the most exciting one as I’d never visited mainland China before.

China’s seemingly inexorable economic rise has been well documented. In the last 20 years, China’s GDP growth has averaged over 9%. As I travelled from the airport into Beijing’s central business district I saw few older buildings. Virtually everything, including the roads, looked as if it had been built in the last 10 years.

The Chinese stock market is big. In terms of the total number of dollars traded, the combined size of the two stock exchanges, Shanghai and Shenzhen, is approximately double that traded in the next biggest Asian market, Japan. The increase in stock trading has been very rapid. Trading volumes on Shanghai and Shenzhen have risen approximately 20-fold in the past 5 years, although there has been significant volatility in this rise.

The domestic algorithmic trading market is nascent. Intra-day trading in company shares is currently not allowed, so it is in the recently established futures markets that algorithmic and high-frequency trading are taking place. No figures exist on the proportion of trading done algorithmically in China today, but I’m going to estimate it at 5%.

I was in Beijing to participate in the first capital markets event Progress has held there. Although Shanghai is the finance capital of China, we chose to hold the event in Beijing to follow up on previous work we'd done there. In the end, we had about 60 people along from domestic sell-side and buy-side firms attending which was a great result considering the relatively low profile Progress has at present in this market. There was optimism and an expectation that algorithmic trading had a bright future in China. 

I believe it's a practical certainty that the Chinese market will adopt algorithmic and high frequency trading. In every developed market a high, or very high, proportion of trading is done algorithmically and, although different regulations and dynamics make each market unique, nothing except an outright ban will prevent widespread adoption in every market in time. Liberalisation in China is occurring. For example, stock index futures are now traded, exchanges are supporting FIX, short-selling has been trialled and it is now easier for Chinese investors to access foreign markets. Also, earlier this year, the Brazilian exchange, BM&FBovespa, and the Shanghai exchange signed an agreement which may result in company cross listings. Only some of these changes support electronic trading growth directly but all are evidence that the liberalisation necessary to support such growth is happening. Inhibitors remain: no intra-day stock trading, restrictions on foreign firms trading on Chinese markets thus preventing competition and knowledge transfer from developed markets, and tight controls on trading in Renminbi. The Chinese regulators will continue to move cautiously. 

The question is not if, but when. We expect to sign our first Chinese customers soon. China is becoming a very important blip on the radar. 


Friday, June 17, 2011

Still big room for growth in Japan

Posted by Giles Nelson

And from Mumbai on to Tokyo. In so many ways, a bigger contrast between cities is difficult to imagine. 

Japan has, of course, had a tough time of it recently. Not only has the recent earthquake and tsunami knocked Japan back, but also Japan has had a long period of relative economic stagnation, compared to other Asian economies.  Its development in algorithmic trading also sets it apart from other developed economies due to the relatively low proportion of trading which is done algorithmically.

There’s little consensus on what this proportion is, however. Celent, the financial markets analyst firm, recently estimated that around 25% of trading was algorithmic in 2010. Figures from the Tokyo Stock Exchange (TSE) indicate that around 35% of orders submitted to the exchange come from co-location facilities – it is reasonable to assume that nearly all of these could be regarded as “algorithmic”. From my conversations in Tokyo with people at exchanges, sell-side firms and our customers and partners, I’m going to put the figure at 40-50%.

That means there’s a lot of room for growth when you consider that the proportion of securities traded in the US and Europe algorithmically, in one way or another, nears 100%. One inhibitor to growth has now been removed. In 2010, the TSE launched a new exchange platform, Arrowhead, which reduced latency from up to two seconds down to around 5 milliseconds. In other words, the TSE is now “fit for purpose” for algorithmic trading and, in particular, for high frequency trading. Previously, with latencies being so long, high frequency firms who, for example, wanted to market-make on the TSE, simply weren’t prepared to take the risk of the exposure and uncertainty that such high latencies bring. Since Arrowhead's launch in January 2010, and according to the TSE’s own figures, the total number of orders on the TSE has risen by a modest 25%, but the proportion of orders submitted from co-location facilities has more than doubled. 

Progress exhibited and spoke at Tradetech Japan and we were joined on our stand by our partner, Tosho Computer Systems, with whom we’re working on a service in Japan which we will be launching later this year. There’ll be more news on that nearer the time. Attendance-wise, Tradetech was down on its peak in Japan in 2008, but up on previous years – a reflection of the renewed interest in trading technology generally.

Market surveillance was one of the key topics in the Tradetech Q&A and panel discussions. This is common across pretty much every market now, with the essential question being: how can markets be kept safe as they get faster and more complex? Some say there should be restrictions, and whilst circuit breakers, insistence on pre-trade risk checks and the like are important, over-emphasis on “the dangers” can hold markets back. Progress’ view is that regulators and exchanges should all move towards real-time market surveillance. (Find more on this here and here).

There’s a lot of emphasis at the moment in European and US markets on OTC derivatives regulation and the move of trading in such instruments onto exchanges. Japan is relatively advanced in this regard with regulation requiring many domestic OTC derivatives to be cleared, a trend which is happening elsewhere in Asia too more quickly than in Europe and the US. 

Regionally, Japan is the biggest developed market in terms of share trading volume – twice as big in dollar terms as the next biggest, Hong Kong. But Japan is itself dwarfed now by China and I’ll be writing about that next.


Wednesday, June 15, 2011

The rise of algo trading in Asia - first stop Mumbai

Posted by Giles Nelson

I’ve just completed a three city Asian tour, in Mumbai, Tokyo and Beijing. The purpose was to promote Progress’s business in capital markets, in particular both Apama and our Responsive Process Management suite. I’m going to give my impressions on those markets – first up is Mumbai and the Indian market for electronic trading.

Before I start with India though, I’d like to share an interesting piece of academic work I came across. We do hear a lot about the rise of Asia economically, particularly the “big two” of India and China. Many predict that China will become the world’s biggest economy in the next 10 years and that India’s population will exceed China’s by mid-century. Both economies are predicted to grow by 8-10% in 2011. The predicted long-term shift in relative economic fortunes is nicely illustrated by some work done at the London School of Economics in 2010, a graphic from which is shown below.

[Graphic: the world's shifting economic centre of gravity, 1980-2040]
© Danny Quah, London School of Economics, October 2010


Professor Danny Quah calculated the “average location of economic activity across geographies on Earth”. Put another way, this shows the “economic centre of gravity”: its mid-Atlantic position in 1980 and its predicted position in the middle of Asia by 2040. The economic power shift is on.
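To make the idea concrete, a weighted-centroid calculation of this kind can be sketched as follows. This is a minimal illustration with made-up weights, not Professor Quah's data or method: locations are averaged as weighted unit vectors and converted back to latitude/longitude.

```python
# Sketch of an "economic centre of gravity": a GDP-weighted mean of
# locations on the sphere. Weights below are illustrative, not real data.
import math

def centre_of_gravity(points):
    """points: iterable of (lat_deg, lon_deg, weight). Returns (lat, lon)."""
    x = y = z = 0.0
    for lat, lon, w in points:
        la, lo = math.radians(lat), math.radians(lon)
        x += w * math.cos(la) * math.cos(lo)
        y += w * math.cos(la) * math.sin(lo)
        z += w * math.sin(la)
    # Convert the summed vector back to spherical coordinates.
    lat = math.degrees(math.atan2(z, math.hypot(x, y)))
    lon = math.degrees(math.atan2(y, x))
    return lat, lon

# Hypothetical weights: as the Asian weight grows, the centre drifts east.
west_heavy = [(40, -100, 10), (50, 10, 8), (35, 105, 4)]
east_heavy = [(40, -100, 6), (50, 10, 6), (35, 105, 12)]
print(centre_of_gravity(west_heavy)[1] < centre_of_gravity(east_heavy)[1])  # True
```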

Back to my trip, and my first stop in Mumbai, commercial and financial capital of India. I wrote on the growth of electronification and algorithmic trading in India last July. Since then the market has progressed. Progress has acquired its first capital markets clients in India using Apama for trading, and the market itself has evolved and liberalised. Last year I criticised the policy of the biggest stock exchange, the National SE, for its “algo approval process”. I thought this was a poorly structured attempt to protect the markets from “rogue algos” and would stymie development. I'm pleased to say that the NSE has now relaxed this policy (the latest version of which can be found here). A further significant development is that both equity exchanges, the NSE and the Bombay SE, now allow smart order routing from co-location facilities at each exchange to one another. This was previously not allowed by the NSE. There is also evidence that the adoption of algorithmic trading is changing the way the market works – manual arbitragers are heading towards extinction and arbitrage opportunities in general are becoming more difficult to find – evidence that information flow between markets is becoming more efficient.

Together with CNBC, Progress held a well attended event for the industry in Mumbai. The highlight was a panel session which had the deputy CEO of the BSE, a senior member of the Indian securities regulator and a practitioner from a buy-side firm participating. There was a consensus that the continued development of algo trading was welcome in India – bringing technological innovation, putting pressure on costs, bringing liquidity and more competition. There is some caution, particularly when it comes to the unfettered use of algos.  Reference was made to recent talk in the US and Europe about introducing algo “inspection” to provide justification for caution. As I’ve said previously, algo inspection is inherently flawed – it is far better for markets to be protected through good pre-trade risk checks and real-time market surveillance (as discussed at length recently on this blog). The panel acknowledged that real-time surveillance was key for markets to operate well.

Despite the Indian market progressing, there are still challenges. There is obvious enmity between the two stock exchanges that should lessen if and when the organisation behind the Multi Commodity Exchange (MCX) receives an equity trading license, something which is expected soon. The granting of such a license and the introduction of more competition into the market can only be a positive move. Trading costs for equities in India are still high. There are two equity clearers, each owned by one of the two exchanges, and they do not interoperate. Furthermore, foreign financial institutions need both a license from the regulator and an Indian local legal entity to trade – a process which inhibits foreign firms entering the market and thus reduces competitive pressure on domestic firms.

In my view one change that India will see in the coming years is significant broker consolidation. Currently there are around 3000 brokers in the Indian market. Many of these are small and, in my opinion, will not survive in a market where access to technology will be needed to compete. Many will therefore go out of business or be forced to merge.

The market for algo trading in India is growing. Although it hasn’t reached a tipping point yet, it has all the promise to be one of the most important markets in Asia.

Next stop was Tokyo for Tradetech Japan 2011. I’ll talk about that tomorrow.



Monday, June 13, 2011

Tickets Please: Technology to Keep You on the Train

Posted by Dan Hubscher

The ticket to preventing and deterring rogue trading could well be technology.  Although most financial services firms have some form of surveillance and monitoring technology in place, it isn't good enough to keep them from getting kicked off the regulation train.

Financial services firms risk running afoul of new regulations because their technology is not the "right" technology anymore. The burning question now is: what will the “right” technology be, in an unpredictable future?

Detecting, preventing and deterring market abuse can only be effective when it permeates financial services activities from pre-trade to settlement. The number of different places that trading activity occurs is constantly increasing. Trading can be done at the office, or via cell phone. Or a trader can begin to work a deal at the office, go for lunch and finish it via instant messaging with his broker.

Surveillance is necessary in order to provide transparency in trading activity, whether it is via formal trading platforms, using an instant messenger platform, e-mails, Twitter or other social media sites, or even old-fashioned phone conversations. Compliance officers need to have full visibility in order to spot and prevent abusive trading activity - and that vision has to encompass it all; every message, every trade, every conversation, every Tweet has to be recorded, taped and downloaded into a database for on-the-spot or future scrutiny.

The technology of yesterday will not be able to cope with the audit trail of today. Plus those audit trails need to occur in real-time, not just looking back over history. This means that current methods, employing historical analysis of already-old data just won't do. Analysis has to be done both in real-time and historically in order to make sense. It has to span asset classes including cash equities, interest rates, swaps, commodities, OTC derivatives - cleared or not. Silos can no longer exist in terms of monitoring; trading today is truly democratic, crossing borders, asset classes and currencies.
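As a toy example of what the real-time half of such analysis involves, the sketch below keeps a sliding window over an order-event stream and flags a trader whose cancel-to-order ratio spikes (a crude quote-stuffing signal). The event shape, thresholds and the rule itself are illustrative assumptions, not any vendor's actual surveillance logic:

```python
# Toy real-time surveillance rule: flag a trader whose cancel-to-order
# ratio over a sliding time window exceeds a threshold. All parameters
# and the event format are illustrative assumptions.
from collections import deque

class CancelRatioMonitor:
    def __init__(self, window_secs=10, max_ratio=0.8, min_orders=20):
        self.window_secs = window_secs
        self.max_ratio = max_ratio
        self.min_orders = min_orders   # ignore traders with little activity
        self.events = deque()          # (timestamp, trader, kind)

    def on_event(self, ts, trader, kind):
        """kind is 'order' or 'cancel'; returns True if the rule is breached."""
        self.events.append((ts, trader, kind))
        # Evict events that have fallen out of the window.
        while self.events and self.events[0][0] < ts - self.window_secs:
            self.events.popleft()
        orders = sum(1 for _, tr, k in self.events if tr == trader and k == "order")
        cancels = sum(1 for _, tr, k in self.events if tr == trader and k == "cancel")
        return orders >= self.min_orders and cancels / orders > self.max_ratio

monitor = CancelRatioMonitor()
alert = False
for i in range(25):                                  # 25 orders...
    alert = monitor.on_event(i * 0.1, "T1", "order")
for i in range(22):                                  # ...then 22 rapid cancels
    alert = monitor.on_event(2.5 + i * 0.01, "T1", "cancel") or alert
print(alert)  # True: cancel ratio exceeded 0.8 within the window
```

A production system would of course evaluate many such rules in parallel over much richer events, and combine the real-time alerts with the historical analysis described above.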

New market abuses seemingly proliferate by the day. Some are really the old ones - only done faster (like front running), but there are fresh ones too. Just last week the SEC suspended trading in 17 OTC microcap stocks because of doubts over the publicly available information on the companies.  Here, investigators from different offices and working groups used “a coordinated, proactive approach to detecting and deterring fraud.”

Packaged applications cannot handle new rules or monitor new types of market abuses. Add flash crashes, mini flash crashes, cross-asset crashes (we call these "splash crashes") to the mix and a picture starts to reveal itself. In this picture there are Chief Compliance and Technology Officers handing the regulatory conductors their tickets to prove that they have the right technology, and then getting kicked off the train because they have the wrong tickets.

Flexible, extensible surveillance and monitoring technology is the top-up fare needed to stay on the train. If you can see every move your traders make today, you can take control. If you can see every move your traders make down the line, you will stay in control.  A real-time platform that can handle the massive, increasing volumes of transactions and events in today's electronic marketplaces, and handle the rules of tomorrow’s, is imperative to staying on top of rapidly changing regulations. 



Thursday, June 09, 2011

All Change: When to Prepare for New Regulations

Posted by Dan Hubscher

As financial institutions bemoan the uncertainty still hovering over Dodd-Frank implementation and possible delays, there are steps they can take to prepare even before the ink dries. Otherwise, compliance can cause major disruptions to their business operations.

Brokers, in particular, cannot afford to wait to protect themselves and their clients from market risks, and from running afoul of shifting market regulations. Attracting new clients - and retaining existing clients - will depend increasingly upon whether a broker has measures in place to protect a client's interests.

Dipping an unprotected toe into a market where a flash crash can happen at any moment can be frightening. This is not scaremongering, it is the market today; insider trading and market manipulation can happen within the best of trading firms.  And algorithms allowed to run without proper controls can take a company's balance sheet from black to red, or worse. The debate around the notion of regulators reviewing algos before they go to market (see Larry Tabb’s article and resulting commentary here) is a clear indication of nervous market sentiment (no Twitter trading analytics required). 

Regulators are laying miles of new tracks (rules) for high-speed and heavy freight trains running through the electronic trading frontier, and they are preparing to make sure the trains stay on the tracks. Trade monitoring, auditing, and abuse prevention requirements abound throughout the proposed rules, including recent efforts to detect market manipulation. But there is little reason for financial firms to wait for strict definitions of what “sufficient” measures are in a market that continuously gives us examples of what to detect, prevent, and deter.  

By gaining real-time visibility to potentially abusive and erroneous trading activities, brokers can quickly pinpoint threats.   But brokers need to constantly adapt detection scenarios to new threats; and they also need to tailor their responses and modify them without disrupting their trading operations.  Responsiveness is more than lightning-quick reflexes.  Flexibility is also key; and is what will transform regulatory compliance into competitive advantage now and into the future.

Despite complaints, media interest and suggestions to the contrary, algorithms and high frequency trading are not going away. When problems such as crashes or abuse occur it is partly because regulators have not yet had the chance to get a uniform, industry-wide grip on how long-standing underlying market practices manifest in the new, high-speed environment, and how compliance departments should monitor them.

HFT is not necessarily the culprit; market structure must also be questioned. High frequency trading shops often act as de-facto market makers. During the flash crash some HFTs’ algorithms sensed a problem and pulled out. If HFTs have replaced traditional market makers, why have the market-making obligations not carried over? And should they? Now we must ask who - if anyone - should have an obligation to stay in the game when things go wrong, and the eventual answer will change market participants’ business models.

Human beings continue to possess attributes that computers cannot. But the human intelligence to slow things down in times of stress, or provide real liquidity in a two-sided market - not just volume - is a decision that can be automated. This is necessary, in fact, because humans can't step in fast enough in today's hyper-speed markets. Therefore, financial organizations have no choice but to use technology and applications that ensure compliance, though not at the expense of efficient trading operations. This kind of technology enables firms to be compliant in new ways, including transparency and reporting.

The ability to detect abuse and operational errors in real time, along with the flexibility to modify scenarios in response to new conditions, while staying ahead as regulations change, differentiates a broker competitively. The buy side wants safety in the marketplace, and it is up to the sell side to make sure their buy side customers feel secure.

New regulations bring new headaches to organisations as they have to add or change both applications and operations to comply. Having the ability to sense threats, and to respond in real-time, is just the compliance “price of entry” to the market today.  Being able to comply with new mandates quickly is the differentiator, and there will be no shortage of surprises in store as the regulatory trains rumble towards an unknown destination. 




Wednesday, April 13, 2011

Can Anything Be Done To Prevent A Second Flash Crash?

Posted by Dan Hubscher

May 6th, 2010 lives in infamy in financial markets; a precipitous dive in the US stock market wiped a trillion dollars off the value of equities in just under 30 minutes and stunned the industry.

The speed at which such an astonishing crash took place was partly due to the ever-shrinking timeframes of electronic trading methods such as algorithmic and high frequency trading (HFT). These techniques have found a welcome home in many markets thanks to lower costs, smaller spreads, and the ability to capture alpha. But the damage that results when automation goes awry - or is used improperly - shows that high speed trading does require high speed controls.

As the days and weeks wore on after the flash crash, regulators realized that they had neither the tools nor the authority to look closely enough and explain what had happened. HFT and algorithmic trading were so embedded in daily market transactions that the volume of trades simply could not be tracked easily.

Since the flash crash, US regulators have been traveling two roads.  On one road, the regulators remain focused on cleaning up domestic equities trading, to prevent another flash crash.  On the other road, regulators are reforming OTC derivatives trading in response to the global financial crisis, by mimicking the market structure and transparency of the equities market, as if the flash crash had never happened. Meanwhile, markets have continued to evolve as HFT continues to spread out across exchanges around the world.

Not only are algorithmic and high frequency trading taking a growing share of activity in equities, but they are also spreading into other asset classes, such as commodities, currencies, and a wide variety of complex derivatives. The proliferation of HFT in new asset classes raises the question: are we now headed for a second flash crash?

The honest answer is: probably. And not just in equities. The interdependence between some of these asset classes also means that the crash could 'splash' across them. The use of high frequency trading strategies has already achieved dominance in stock exchanges, and migration into other asset classes and other national markets causes concern. For example, we have already seen algorithmic trading spreading beyond equities and commodities to currencies and fixed income.

A mini flash crash in Japanese yen in March was eerily reminiscent of the May 6th equities crash. If liquidity in the FX market, which trades trillions of dollars daily, could instantly dry up then there is clearly a need for better monitoring and controls in that asset class.

Measures needed to prevent a second flash crash from taking place include both regulation and technology. Market surveillance and monitoring across the board has to include regulators, trading venues, and brokers in order to be effective.

Being responsive - and responsible - means being proactive; measures have to be put into place before the next flash crash. To make this happen, sophisticated real-time surveillance is a must for monitoring anomalous market patterns, including abuses and errors; for checking sponsored access clients' credit risk; and for balancing market position limits. None of these techniques replaces the need for standard compliance tools such as historical analysis and material penalties for those who stray from regulatory mandates. But the flash crash has shown that there’s an additional need to protect our markets from the damage that can’t be fixed at the end of the day. Don't wait until it is too late; that's like driving while looking in the rear-view mirror.
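A minimal sketch of the kind of pre-trade check mentioned above, with illustrative order and client fields of our own invention: each order is screened against a fat-finger price collar, a position limit and remaining credit before it is allowed to reach the market.

```python
# Sketch of a pre-trade risk gate. Field names and limits are
# illustrative assumptions, not any firm's actual controls.
def pre_trade_check(order, client, last_price, collar=0.10):
    """Return (accepted, reason). order = dict(side, qty, price)."""
    # Fat-finger guard: reject prices more than `collar` away from the last trade.
    if abs(order["price"] - last_price) > collar * last_price:
        return False, "price outside collar"
    # Position limit: reject if the fill would breach the client's limit.
    signed = order["qty"] if order["side"] == "buy" else -order["qty"]
    if abs(client["position"] + signed) > client["position_limit"]:
        return False, "position limit breached"
    # Credit check: reject if the notional exceeds remaining credit.
    if order["qty"] * order["price"] > client["credit_remaining"]:
        return False, "insufficient credit"
    return True, "ok"

client = {"position": 9_000, "position_limit": 10_000, "credit_remaining": 500_000}
ok, why = pre_trade_check({"side": "buy", "qty": 500, "price": 100.0}, client, 100.0)
bad, why2 = pre_trade_check({"side": "buy", "qty": 500, "price": 150.0}, client, 100.0)
print(ok, bad)  # True False (the second order trips the price collar)
```

The value of such checks lies less in any single rule than in being able to add or tighten rules quickly as regulations and market conditions change.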


Monday, April 11, 2011

How Fast Is Fast Enough?

Posted by Dan Hubscher

When thinking about technology and speed, the usual issue that springs to mind is how fast a transaction can take place.  Gone already are the days when every millisecond mattered; now we count in microseconds.  Or nanoseconds.  And debates already rage about the possibility of picoseconds, as if that’s all that matters.  Regardless of where you fall on the low latency spectrum, you can’t ignore the other side of speed – rapid customization. 

A new trading idea may work on the microsecond timescale, or nanosecond, or wherever the realistic limit is for any particular firm.  But the idea won’t be unique for long.  The question is, then, how to deploy that idea to the market first?  Specifically, how quickly can a new technology itself be implemented?  How quickly can a trading strategy be modified – the first time, the second time, the fifteenth time, to suit new market opportunities as they emerge?  How quickly can the expression of a trader’s intellectual property start delivering operational benefits and competitive advantage?  And, how quickly is that advantage eroded by others who are similarly responsive?
The customization frontier of algorithmic trading competition is especially pertinent when new regulatory measures are announced.  Today’s traders face dramatic change across asset classes and geographical boundaries.  Those who can implement compliant strategies the fastest not only stay on the right side of the law, but can attract new business by entering new markets ahead of the competition.
TradeTech 2011 starts tomorrow in London. There, technology providers, market operators, and market participants will meet to discuss the latest and greatest developments in our industry. The first question from buyers is always ‘how fast?’ instead of ‘how much.’   But as pent-up demand starts to accelerate, as budget starts to be released for new technology projects, traders and CIOs alike need to know their new ideas can be executed quickly.  The necessity is more than just quick return on investment; it is also getting ahead of rivals. Urgency is the modus operandi.
To meet this need, businesses need two kinds of responsiveness. The ability to respond to opportunities in real time is imperative. But businesses also need agility ‘built in’ to their technical infrastructure. With new rules being laid out in both the US and Europe, buying something that was pre-defined in a previous era and closed to further change won’t work as deadlines draw closer. Agility will become the watchword of purchasing decisions – and, undoubtedly, a major theme of conversations at TradeTech this week.


Thursday, March 24, 2011

Detection, Prevention, Deterrence (oh my!)

Posted by Dan Hubscher

It is gratifying to see the serious attention that regulators, traders, brokers and the buyside are paying to market surveillance today. That was not always the case. The credit crisis set off a domino-effect market crash that rang alarm bells and woke them up; then the flash crash jolted them like a three-shot espresso on an empty stomach. But despite the attention, and the well-intentioned attempts to put monitoring and surveillance in place globally, the changing face of regulation can produce fragmented and inconsistent results.

Market surveillance today resembles a patchwork quilt: some programs monitor in real time while others look backward, said Miranda Mizen, Principal at TABB Group, in a paper entitled Dynamic Surveillance: Detection, Prevention, and Deterrence. Brokers' internal risk controls differ from those they apply to their clients - at least for the brokers that actually impose pre-trade risk controls on clients, which not all do yet. Exchanges run varying surveillance programs and have operated in silos, with disparate procedures and processes creating disconnects between them.

Mizen said that the drive to create more dynamic, comprehensive programs and techniques boils down to three core objectives: detection, prevention, and deterrence.

Detection of market abuse as, after, or even before it happens will help to bring back investor confidence. Preventing fat-finger trades and rogue algorithms from entering the market will go a long way toward avoiding more flash crashes - or even ‘splash crashes’ that spill across asset classes. Prevention is rapidly becoming the domain of brokers, who sit squarely between order flow and trading venues.
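The kind of pre-trade check a broker might apply to catch fat-finger orders can be sketched very simply: reject any order whose size or price sits far outside normal bounds before it reaches the venue. The thresholds and field names below are illustrative assumptions, not any particular vendor's implementation; a real system would calibrate limits per instrument and per client.

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    quantity: int
    price: float

def is_fat_finger(order: Order, reference_price: float,
                  max_quantity: int = 10_000,
                  max_price_deviation: float = 0.10) -> bool:
    """Return True if the order's size or price looks like a typo.

    Thresholds are placeholders: a hypothetical 10,000-share size cap
    and a 10% band around the current reference (e.g. last traded) price.
    """
    if order.quantity > max_quantity:
        return True  # size far beyond the normal clip for this client
    if reference_price > 0:
        deviation = abs(order.price - reference_price) / reference_price
        if deviation > max_price_deviation:
            return True  # price wildly off-market, likely a keying error
    return False
```

For example, a 500,000-share order, or one priced 20% away from the market, would be blocked before it could move the tape; normal-sized, near-market orders pass through untouched.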

Deterrence via real-time monitoring, fines and convictions should help to clamp down on market abuse. Although abuse will never be completely stamped out as long as it remains profitable, new policing techniques will up the ante, so abusers will have to up their game too, said Mizen.

To be effective, consistent and viable market surveillance has to include regulators, exchanges and ECNs, brokers and buyside firms across the board. Sophisticated real-time surveillance is crucial for monitoring market patterns, checking sponsored access clients' credit risk, and enforcing position limits. It needs to be coupled with historical surveillance in order to compare market movements and flag potentially abusive patterns.

The demand for sophisticated real-time surveillance adds to, rather than replaces, historical views, and the two increasingly overlap. Both are essential. For example, a client's credit risk needs constant monitoring (real-time) for sponsored access, but the global credit risk may not be verified until the overnight processes (historical) have run, said Mizen in her paper.
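The split Mizen describes, real-time intraday checks later reconciled by overnight batch processes, can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the class, its limit arithmetic, and the reconciliation hook are hypothetical, not a description of any actual sponsored-access system.

```python
class CreditMonitor:
    """Track a sponsored-access client's intraday exposure in real time.

    A simplified sketch: orders are accepted only while cumulative
    notional stays under the client's credit limit, and the overnight
    (historical) reconciliation re-baselines the running figure.
    """

    def __init__(self, credit_limit: float):
        self.credit_limit = credit_limit
        self.exposure = 0.0  # running intraday notional

    def check_and_book(self, notional: float) -> bool:
        """Accept the order only if it keeps exposure within the limit."""
        if self.exposure + abs(notional) > self.credit_limit:
            return False  # reject: would breach the client's credit limit
        self.exposure += abs(notional)
        return True

    def overnight_reset(self, verified_exposure: float) -> None:
        """Re-baseline from the batch (historical) reconciliation."""
        self.exposure = verified_exposure
```

The real-time check is deliberately conservative and fast; the slower overnight process supplies the verified global figure the text refers to, correcting any intraday drift.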

Real-time and historical pattern detection can also complement each other. Market activity several standard deviations from the norm within a two-minute window might raise an intra-day red flag at the regulator, yet ultimately be judged non-abusive. If, however, the same activity surfaces at the broker alongside correlated activity in another asset class during an overnight data crunch, it looks far more suspicious. Real-time or historical, surveillance of flow and markets needs to be tuned tightly enough that the patterns do not flood officials with false alerts.
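The two-minute, several-standard-deviations flag described above maps naturally onto a sliding-window z-score check. The sketch below is an assumption-laden toy, the window length, the 3-sigma threshold, and the class itself are hypothetical choices; a production surveillance engine would also correlate across venues and asset classes before alerting.

```python
from collections import deque
from typing import Optional
import statistics
import time

class WindowedAnomalyDetector:
    """Flag samples several standard deviations from the norm of the
    preceding sliding time window (two minutes by default)."""

    def __init__(self, window_seconds: float = 120.0,
                 threshold_sigma: float = 3.0):
        self.window_seconds = window_seconds
        self.threshold_sigma = threshold_sigma
        self.samples: deque = deque()  # (timestamp, value) pairs

    def observe(self, value: float,
                timestamp: Optional[float] = None) -> bool:
        """Record a sample; return True if it looks anomalous."""
        now = timestamp if timestamp is not None else time.time()
        # Evict samples that have aged out of the window.
        while self.samples and now - self.samples[0][0] > self.window_seconds:
            self.samples.popleft()
        history = [v for _, v in self.samples]
        self.samples.append((now, value))
        if len(history) < 2:
            return False  # too little history to judge deviation
        mean = statistics.mean(history)
        stdev = statistics.stdev(history)
        if stdev == 0:
            return value != mean  # flat history: any change stands out
        return abs(value - mean) / stdev > self.threshold_sigma
```

Keeping the threshold tight enough to matter but loose enough to avoid false alerts is exactly the tuning problem the paragraph above describes.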

Mizen said that of the three - detection, prevention and deterrence - prevention needs the most focus to protect markets from disruptive, destructive order flow. If a combination of sophisticated real-time and historical surveillance tools can prevent another flash crash, stop market abuse in its tracks, and help to catch and punish wrongdoers, it will go a long way toward restoring investor confidence.