Algorithmic Trading

Friday, November 11, 2011

Can market surveillance help to keep traders on track?

Posted by Richard Bentley

By Richard Bentley, Vice President, Capital Markets, Progress Software

There's no doubt that today's high-speed capital markets and cross-product, cross-market trade volumes mean regulation struggles to keep up with changes in the market. MiFID II is an example of a financial regulatory directive that is seen by many as lacking real detail and remaining open to interpretation - and misinterpretation. In a panel discussion at the TABB Group Trading Surveillance event in London last Wednesday evening, industry experts agreed that, in Europe at least, few financial services firms are afraid of regulators.

So, as many new regulations remain woolly, ignored or not yet implemented - or, in the case of ESMA (the European Securities and Markets Authority), amount to little more than statements of clarification - the panel was asked how surveillance and risk are going to be managed moving forward. Questions were also raised about the future regulatory burden and whether firms outside of the "Big Five" would be able to resource the demands of growing compliance departments. Will this lead to an uneven playing field?

According to TABB Group, new compliance costs are estimated at between €512 million and €732 million, with ongoing costs of between €312 million and €586 million. But while regulators are still determining what regulation will look like, the need for market surveillance is undiminished. Traders made about €13.3 billion ($18.2 billion) from market manipulation and insider dealing on EU equity markets in 2010, according to an EU Commission study. With some arguing that firms can only do so much to police markets themselves as trades cross multiple brokers and gateways, the panel discussed the need for fragmented market data to be brought together in a consolidated tape, with surveillance performed at an aggregate, market-wide level.

With respect to high-frequency trading, there was discussion and agreement that pre-trade checks should be built in and that regulators should be feared, as they are in some Asian markets, where market participants adopt a mindset that constantly asks "will I be allowed to trade today?" That "Fear Factor" is key, and there isn't fear of regulation in Europe yet.

The timeliness of market surveillance was discussed, with the panel suggesting that transactions should be monitored not only retrospectively but also in real time, as they happen. Clearly, there's still a role for historical analysis of the market, as some abuse takes place over an extended period of time and new abuse scenarios are discovered which can then be applied to historical data. It's a little like having your DNA stored on file for a time in the future when forensic techniques improve. But there is also no doubt that real-time surveillance, spotting manipulation as it happens, can be a significant factor for organisations looking to protect themselves and the market, which is one of the reasons it is mandated by Dodd-Frank and MiFID II.
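
To make the "DNA on file" idea concrete, here is a minimal sketch (Python, with invented field names and a deliberately simplified wash-trade rule) of how the same detection logic might run over a live trade stream and also be replayed across stored history once a new scenario is defined. It is purely illustrative, not a description of any particular surveillance product.

    from collections import defaultdict
    from datetime import datetime, timedelta

    # Hypothetical trade record: (timestamp, account, instrument, side, qty, price)

    def wash_trade_alerts(trades, window=timedelta(seconds=60)):
        """Flag cases where one account both buys and sells the same
        instrument within a short window - a simplified abuse scenario."""
        recent = defaultdict(list)  # (account, instrument) -> [(ts, side)]
        for ts, account, instrument, side, qty, price in trades:
            key = (account, instrument)
            # keep only events inside the look-back window
            recent[key] = [(t, s) for t, s in recent[key] if ts - t <= window]
            if any(s != side for _, s in recent[key]):
                yield (ts, account, instrument, "possible wash trade")
            recent[key].append((ts, side))

    # The same rule can run over a live feed as trades arrive...
    #   for alert in wash_trade_alerts(live_trade_stream()): handle(alert)
    # ...or be replayed over years of stored trades when the scenario is
    # first defined, much like re-testing old DNA with a new technique:
    historical = [
        (datetime(2010, 5, 6, 14, 45, 0), "ACC1", "XYZ", "BUY", 100, 10.0),
        (datetime(2010, 5, 6, 14, 45, 30), "ACC1", "XYZ", "SELL", 100, 10.0),
    ]
    print(list(wash_trade_alerts(historical)))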

Finally, the panel discussed how turbulent markets and highly publicised breaches of banking controls have demonstrated the importance of protecting market integrity. So while an increase in the complexity of market surveillance inevitably leads to an increase in cost, the panel felt that the punitive and reputational risks associated with surveillance failures justify the business case for improving compliance training, processes and technology.  After all, just as you wouldn’t expect the police to prevent all crime by themselves, it’s clear that investment is needed in surveillance technology to give the regulators a helping hand.

Wednesday, September 14, 2011

Is Revolution the Path to Transparency?

Posted by Dan Hubscher

Revolutions are proliferating. When you watch a revolution happening elsewhere, political or otherwise, it's a good time to contemplate the revolution in your own history, or in your future. There are few among us who can't point to one or the other. One of the common drivers is the fear that something is happening where we can't see it happen, and we want transparency - of process, of government, of whatever seems to be wrong.

The capital markets globally are experiencing a similar revolution now with regulatory change, and the current climate threatens to create a revolt as well. Market participants may push back on reforms to the point of creating a new state of stress. Either way, the future presents very real threats to companies that aren't prepared. We're observing a vast expansion of global rulemaking, and a coming deluge of data - especially in the derivatives markets. It's very expensive and distracting to fix problems after the fact, so we need to act now. "Hope is not a strategy" - a line often attributed to famed (American) football coach Vince Lombardi.

In an open letter to Barack Obama published on January 23, 2009, Benjamin Ola Akande advised, "Yet, the fact remains that hope will not reduce housing foreclosures. Hope does not stop a recession. Hope cannot create jobs. Hope will not prevent catastrophic failures of banks. Hope is not a strategy."

Now we have the Dodd-Frank Act in the U.S., MiFID II and EMIR in Europe, all preceded by the de Larosiere Report (EC, 2009), Turner Report (FSA, 2009), Volcker Report (G30, 2009), G20 – Feb 2009 Declarations, Financial Stability Forum Report (FSF, 2009), INF Report (IMF, 2009), Walker Review (UK, 2009), Basel / IOSCO Reviews… the list goes on.  And the rest of the world is watching, waiting, for another revolution.  The intended scope of the most recent reforms seems to be almost a panacea, and transparency is the first step.

The next Revolution is happening in Boston, fittingly.  Progress Revolution 2011, from September 19th through the 22nd, offers the chance to learn from industry innovators on how they have successfully tackled these challenges within the capital markets.  Customers including PLUS Markets and Morgan Stanley will be there to share success stories.  And Kevin McPartland, Principal at the TABB Group, will be there too.  I’ve included a sneak peek into Kevin’s “Path to Transparency” below.

According to the New York Times, at the Republican Convention in 2008 Rudy Giuliani said, while contemplating Barack Obama's candidacy, "… 'change' is not a destination ... just as 'hope' is not a strategy."  Rudy will be speaking at our Revolution too.  Will you be there?  It will be a lively conference - I hope that you can join us!

-Dan

The Path to Transparency

By Kevin McPartland, Principal, TABB Group

Managing the vast quantities of data born into existence by the Dodd-Frank Act and related regulation will present a challenge in the post-DFA environment; but collecting and producing the required data is just the tip of the iceberg. The ability to analyze and act on that data is what will separate the survivors from the winners. This is already true in many other parts of the global financial markets, but the complexities inherent in swaps trading, coupled with the speed at which these changes will take place, create unique challenges. Spread this across all five major asset classes and three major geographies, and the complexities become more pronounced.

Margin calculations are proving to be one of the biggest concerns for those revamping their OTC derivatives infrastructure. In a non-cleared world, dealers determine collateral requirements for each client and collect variation margin on a periodic schedule—in some cases once a month, and in other cases once a year. When those swaps are moved to a cleared environment, margin calculations will need to occur at least daily. The result is an upgrade from the current batch process with dozens of inputs to a near-real-time process with hundreds of inputs. Whereas before major dealers could perform margin analysis, client reporting and risk management in a single system, those systems now need to operate independently within an infrastructure that provides the necessary capacity and speed.
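
As a rough illustration of that shift, the sketch below (Python, with placeholder names and a toy margin formula, not any clearing house's actual methodology) recomputes a margin figure whenever any input changes, rather than on a monthly or yearly batch run.

    class MarginCalculator:
        """Toy near-real-time margin engine: recompute whenever an input
        changes, rather than on a monthly or yearly batch schedule."""

        def __init__(self):
            self.inputs = {}          # e.g. positions, curves, vols...
            self.margin = 0.0

        def on_input_update(self, name, value):
            self.inputs[name] = value
            self.margin = self._recompute()   # triggered per event, not per batch
            return self.margin

        def _recompute(self):
            # Placeholder formula: in reality this would be a full portfolio
            # revaluation against the clearing house's margin model.
            notional = self.inputs.get("notional", 0.0)
            vol = self.inputs.get("volatility", 0.0)
            return notional * vol * 0.05

    calc = MarginCalculator()
    calc.on_input_update("notional", 100_000_000)
    print(calc.on_input_update("volatility", 0.2))   # margin updates as inputs arrive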

The trading desk will require a similar seismic shift, as flow businesses will provide liquidity across multiple trading venues to an expanding client base. Most major dealers are at some stage of developing liquidity aggregation technology intended to provide a single view of liquidity across multiple swap execution venues. Creating this type of virtual order book requires receiving multiple real-time data feeds and aggregating the bids and offers in real time.
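
A minimal sketch of that idea, assuming invented venue names and message fields, might look like this: top-of-book quotes from several venues are folded into a single consolidated view as they arrive.

    class VirtualBook:
        """Toy aggregation of top-of-book quotes across several venues
        into one consolidated view of the best bid and offer."""

        def __init__(self):
            self.best = {}   # venue -> (bid, ask)

        def on_quote(self, venue, bid, ask):
            self.best[venue] = (bid, ask)
            return self.top_of_book()

        def top_of_book(self):
            best_bid = max((b for b, _ in self.best.values()), default=None)
            best_ask = min((a for _, a in self.best.values()), default=None)
            return best_bid, best_ask

    book = VirtualBook()
    book.on_quote("SEF_A", 99.50, 99.60)
    print(book.on_quote("SEF_B", 99.52, 99.58))   # -> (99.52, 99.58)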

Furthermore, rather than comparing model-derived prices to the last trade price to produce quotes, inputs from SEFs, CCPs, SDRs, internal models, third-party models and market data providers will feed real-time trading algorithms once reserved for exchange-traded derivatives.

Providing clients with execution services presents other challenges. Executing on multiple platforms also means tracking and applying commission rates per client per venue in real time. Trade allocations also complicate the execution process.  In the bilateral world a big asset manager can do a $100 million interest rate swap and spread that exposure across multiple funds as it sees fit. Under the DFA, the executing broker must know which funds are getting how much exposure. Account allocation in and of itself is not new, but cost averaging multiple swap trades and allocating the right exposure at the right price to the proper account presents complex challenges, especially in a near-real time environment.
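
To illustrate the allocation problem described above, here is a hypothetical sketch that averages the price of several partial fills and splits the exposure pro rata across an asset manager's funds; the figures and field names are made up.

    def allocate_fills(fills, allocations):
        """Average the price of several partial fills and split the total
        exposure across funds in the requested proportions.

        fills:        list of (quantity, price) tuples
        allocations:  dict of fund -> fraction of total exposure (sums to 1)
        """
        total_qty = sum(q for q, _ in fills)
        avg_price = sum(q * p for q, p in fills) / total_qty
        return {fund: (total_qty * frac, avg_price)
                for fund, frac in allocations.items()}

    # e.g. a $100m interest rate swap done in three clips, split across two funds
    fills = [(40_000_000, 2.015), (35_000_000, 2.020), (25_000_000, 2.018)]
    print(allocate_fills(fills, {"FUND_A": 0.6, "FUND_B": 0.4}))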

Risk management, compliance and back-testing data will also require huge increases in processing power, often at lower latencies. Risk models and stress tests, for example, are much more robust than they were before the financial crisis, requiring a considerably higher amount of historical data.

Compliance departments now must store the requisite seven years of data so they can reconstruct any trade at any moment in the past. This is complicated enough in listed markets, where every market data tick must be stored, but for fixed-income securities and other swaps, storing the needed curves means that billions of records must not only be filed away but also be retrievable on demand. Similar concerns exist for quants back-testing their latest trading strategies: it is not only the new data being generated that must be dealt with. Existing data, too, is about to see a huge uptick in requirements.
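
As a toy illustration of "retrievable on demand", the sketch below keeps ticks sorted per instrument so the market context around any historical trade can be pulled back quickly. It stands in for what would, in practice, be a large time-series database.

    import bisect
    from collections import defaultdict

    class TickStore:
        """Toy tick archive: ticks are kept sorted per instrument so the
        market context around any historical trade can be replayed on demand."""

        def __init__(self):
            self.ticks = defaultdict(list)   # instrument -> sorted [(ts, price)]

        def add(self, instrument, ts, price):
            bisect.insort(self.ticks[instrument], (ts, price))

        def context(self, instrument, trade_ts, window=5):
            """Return all ticks within +/- `window` time units of a trade."""
            series = self.ticks[instrument]
            lo = bisect.bisect_left(series, (trade_ts - window,))
            hi = bisect.bisect_right(series, (trade_ts + window,))
            return series[lo:hi]

    store = TickStore()
    for ts, px in [(1, 100.0), (3, 100.2), (7, 100.1), (15, 99.9)]:
        store.add("XYZ_5Y_SWAP", ts, px)
    print(store.context("XYZ_5Y_SWAP", trade_ts=5))   # ticks around the trade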

In the end these changes should achieve some of the goals set forth by Congress when it enacted Dodd-Frank – increased transparency and reduced systemic risk.  The road there will be bumpy and expensive, but the opportunities created by both the journey and the destination will outweigh any short-term pain.

This perspective was taken from the recent TABB Group study Technology and Financial Reform: Data, Derivatives and Decision Making.

Wednesday, August 17, 2011

DON'T PANIC! Maybe a Depressed Robot is to Blame

Posted by Dan Hubscher

The market frenzy that whiplashed global markets last week has been - unsurprisingly  - blamed on high frequency trading. Headlines such as these fuelled the fire:

  • "The Trading Game is Causing Market Panic" - The Atlantic.
  • "High speed traders are exacerbating volatility" -  the Financial Times.
  • "High Frequency Traders Win in Market Bloodbath" - the Wall Street Journal.
  • "High Frequency Trading May Be Making Things Worse On Stock Markets" - HuffPo.
  • "Black box trading is the real hazard to markets, says Lord Myners" - The Guardian

As you can see, the media, brokers, and investors were quick to point fingers at HFT and, in the meantime, the SEC has sent out subpoenas to high-frequency trading firms in relation to last year's flash crash probe, according to the Wall Street Journal.

But does HFT cause panic? Panic, which is an emotional reaction to fear, is an inflammatory word. Panic is contagious, especially in markets. Because panic is ultimately a human emotion, it may not be the most accurate word to describe the moves of cold-blooded trading algorithms. But when you see trading robots hit stop-loss after stop-loss, triggering buying or selling at light speed over and over again, the voluminous activity is bound to mimic actual panic.

The markets today make the struggle between human investors versus high frequency trading firms look like an intergalactic battle. But while the humans are sitting behind the wheel of a broken-down old car, HFTs are soaring on a spaceship equipped with an Infinite Improbability Drive. 

And the algorithms that dominate the markets - along with their HFT owners - are getting a bad reputation. They are being portrayed as the villains in this uber-sensitive economic environment.

The current dialog casts them as Vogons, the bad guys in the brilliant Hitchhiker's Guide to the Galaxy (BBC series and books by Douglas Adams). The Vogons are described as "one of the most unpleasant races in the galaxy. Not actually evil, but bad-tempered, bureaucratic, officious, and callous."

High-frequency trading is not actually evil either; it is the natural development of quantitative and algorithmic trading, which grew out of the new market structure and automation. Although quantitative trading takes the emotion OUT of trading, something has gone wrong. The effect on volatility is a matter of debate; perceptions around this question are creating panic and upsetting investors.

Investors effectively boycotted HFT in the week ended August 10th, dragging an estimated $50 billion to $60 billion out of the stock market and walking it over into safer money market accounts. So what can be done to restore investor confidence? One bemused market participant said last week that what was needed was an algorithm with a built-in panic button. Instead of tumbling after the market when a stop is hit, the algo could pause, take a breath and think about what it is doing. But would that lessen volatility? Because, again, thinking is a human activity.
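
Taken literally, that "panic button" could be as simple as the hypothetical sketch below: when a stop level is breached, the algorithm pauses, re-checks the market after a cooling-off period, and only then decides whether to sell. It is a thought experiment, not a recommendation of any real strategy.

    import time

    def stop_with_pause(get_price, stop_level, cooldown_seconds=5, tolerance=0.01):
        """Toy 'panic button': when the stop is breached, pause and re-check
        before selling, instead of chasing the market immediately."""
        price = get_price()
        if price > stop_level:
            return "HOLD"                      # stop not breached
        time.sleep(cooldown_seconds)           # take a breath...
        recheck = get_price()
        if recheck >= stop_level * (1 - tolerance):
            return "HOLD"                      # market snapped back - don't pile on
        return "SELL"                          # breach confirmed after the pause

    # e.g. with a stubbed price feed that dips and then recovers:
    prices = iter([99.0, 100.5])
    print(stop_with_pause(lambda: next(prices), stop_level=100.0, cooldown_seconds=0))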

Ford Prefect, one of the protagonists in Hitchhiker's, said about Vogons: "They don't think, they don't imagine, most of them can't even spell, they just run things." Like the Vogons, algorithms are not great thinkers. And, with upwards of 70% of equities trading being done by algorithms, they do pretty much run things.

But blaming HFT (or Vogons) for market swings is like blaming social media for the riots in England. Yes, it was easier to get everyone together for a looting-and-burning session by using Twitter and Facebook, which made the situation worse. And social media also helped people organize the clean-up. Would the same have happened without social media, just differently? Likely yes. HFT and social media have another thing in common: because they are automated they can easily be monitored and controlled, if regulations allow it. What if authorities could recognize the electronic early signs of a riot, send a warning to citizens, and arrive at the scene faster? In the UK, the government is considering blocking Twitter and Facebook during a major national emergency.

I am not suggesting that the government or the regulators should block HFT when markets run riot. And our markets are still tinkering with just the right recipe for actual panic buttons; witness the dislocations of the May 6th flash crash and the ensuing coordinated circuit breaker and other limit debates.

There are a number of government investigations - including in the US and the UK - into high-frequency trading practices. Some are trying to shine a light on algorithms and determine whether their behavior is perhaps predatory or disruptive. Lifting the veil of secrecy on algorithmic and high-frequency trading could be a way of regaining investor confidence. But that is a double-edged sword; if firms offer up too much about the methodologies and strategies behind their algos, they could also be giving away their secret formulae.

Giving regulators a balanced peek under the covers might help. Regulators already can and do deploy market surveillance and monitoring tools to help prevent market abuse at high speed. That may be a difficult pill to swallow, but the Hitchhiker’s Guide would tell us – DON’T PANIC.  As many damning studies as there are, others come to a different conclusion ("Study gives backing to high-speed trading" – Financial News).  If something isn't done proactively - and soon - then investors, politicians, regulators and other nay-sayers are going to be calling for an end to HFT completely.

As Marvin the depressed robot in Hitchhikers said: "I've calculated your chance of survival, but I don't think you'll like it." 

-Dan

 

Wednesday, July 20, 2011

HFT Volume: Cool Liquidity or Just Hot Air?

Posted by Dan Hubscher

There have been some heated public debates recently about the kind of liquidity that is being provided by high-frequency trading. In an earlier post on Tabb Forum, Candyce Edelen from Propel Growth asks whether HFTs provide liquidity or just volume.

Liquidity is bread and butter for firms associated with financial markets; it provides an easy way to get in and out of markets while - hopefully - making money. It is essential to investors, who want to know that they can get in or out of a market when they choose. Therefore liquidity is good. (The icing on the cake would be over-the-counter derivatives markets, where firms can use opacity to their profitable advantage.)

Liquidity, from the word liquid, depicts a flow of activity that is smooth and constant.  Investopedia says that liquidity is characterized by a high level of trading activity.

It is clear that HFT provides that high level of trading activity. What is less clear is whether it is really liquidity. High quote-to-cancel rates, with pre-programmed cancellations sometimes happening within 2 to 5 milliseconds, lead to perceived rather than actual volume. Liquidity only occurs when buyer and seller can get together and deal. If a sell quote disappears before the buyer is even aware of it, what advantage does that offer the buyer?
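
One way to make "perceived rather than actual volume" measurable is to look at how long quotes actually rest before being cancelled. The sketch below, over invented order events, counts quotes, trades and cancels and flags quotes pulled within a few milliseconds; it is illustrative only.

    def quote_quality(events, fleeting_ms=5):
        """Summarise a stream of order events: how many quotes trade versus
        cancel, and how many cancels happen within a few milliseconds.

        events: list of dicts with 'id', 'type' ('quote'|'cancel'|'trade')
                and 'ts_ms' (timestamp in milliseconds).
        """
        placed, stats = {}, {"quotes": 0, "trades": 0, "cancels": 0, "fleeting": 0}
        for e in events:
            if e["type"] == "quote":
                stats["quotes"] += 1
                placed[e["id"]] = e["ts_ms"]
            elif e["type"] == "trade":
                stats["trades"] += 1
            elif e["type"] == "cancel":
                stats["cancels"] += 1
                if e["id"] in placed and e["ts_ms"] - placed[e["id"]] <= fleeting_ms:
                    stats["fleeting"] += 1   # gone before most buyers could react
        return stats

    events = [
        {"id": 1, "type": "quote", "ts_ms": 0},
        {"id": 1, "type": "cancel", "ts_ms": 3},    # cancelled after 3ms
        {"id": 2, "type": "quote", "ts_ms": 10},
        {"id": 2, "type": "trade", "ts_ms": 500},
    ]
    print(quote_quality(events))   # {'quotes': 2, 'trades': 1, 'cancels': 1, 'fleeting': 1}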

I applaud Edelen for asking outright whether HFTs increase liquidity or just volume.  The “just volume” replies were no surprise, but some thoughtful counter-arguments appeared in the comments to support the liquidity argument as well.  Interesting, but no consensus.  So how we expect regulators to figure it out is anybody’s guess.

There is also a lack of clarity as to whether HFTs should be market makers, thereby ensuring liquidity even in volatile markets. Edelen asks if we need new obligations for the new breed of market makers. The May 6th flash crash demonstrated what happens when uncertainty and extreme price movements hit - the market makers fled. Computerized trading algorithms quit the marketplace even when their owners were designated as market makers, thus steepening the fall. But, since traditional market makers have all but disappeared, HFT market makers are what we have left. If you take them away, what remains?

In the flurry of comments on Edelen's article, it seems that on this at least the pundits are stumped. They remarked on what the obligations and incentives are or are not, but could not seem to find an answer to what they should be. Instead they moved away from market makers and towards trying to redefine what the market itself ought to be - with little success. This may give us a clue as to why regulators and legislators can't figure it out either.

And do we need a level playing field as some are calling for? If everyone gets the same data at the same time and can only trade at the same speed, will that solve the problems? Intuitively people seem to feel the answer is “yes” but the commentators could not even agree what the playing field or fields would look like.  Most seemed to shy away from trying to put the genie back in the bottle and going back to the way things were.  Dark pools are another sore subject, with some claiming that the growing level of volume traded in dark pools (possibly 30% of volume) is due to the fear that HFTs are creating in the lit markets.

High frequency trading is here to stay, and debate over its role and future is healthy. But are all HFTs the same, asks Edelen? Here finally we seem to have some consensus from the pundits, who appear to have argued themselves into this position:  There is Good HFT and there is Bad HFT.  We can’t define it, but we think we know it when we see it. This is the same as with any tool or technology or trading model – there are beneficial and irresponsible or harmful uses; the line between them is blurry, and the distinction springs from the user of the tool, not the tool itself.

So what are we to do with the “Bad HFT?”  This seems to be the real question.  So far we’ve heard:

  • The regulators haven’t figured it out and can’t – regulators worldwide have managed to define a small number of abusive trading scenarios, and demand compliance, but they can’t hope to cover it all
  • The markets themselves haven’t figured it out and don’t have much incentive to do so
  • The market participants can't come to a consensus - and probably won’t
  • Wishing HFT away won’t fix it – either by slowing it down with a Tobin Tax, or a liquidity fee, or by outlawing it altogether – that is throwing out the baby with the bath water and someone will always find a way to abuse what remains.

What we are left with is the capitalist mantra of “buyer beware.”  But there are ways that the buyer can equip himself to join the 21st Century and - at the same time - protect himself from high frequency errors, fraud or predators.  

As we've often said, high-speed trading needs high-speed controls.  Like a car racing down the highway, some controls will come from the lurking cop (the regulator). But the driver needs to make sure the brakes work as well as the gas pedal in order to avoid an accident.  Of course, the speed itself raises the question of whether HFT needs a speed limit in order to stay under control.  We don't think a speed limit is necessary.

Instead, the issue is detecting abuse and mistakes, regardless of speed or source.  If you have the market surveillance and monitoring tools that can alert you to problems occurring at high speed - if you can monitor your own car, and the cars on the road around you and see a problem coming - an accident can be avoided. And if there is no accident, despite how fast the car is going, then the cops won't have to be called out.

-Dan

 

Wednesday, June 22, 2011

A foray into Beijing

Posted by Giles Nelson

Beijing was the last stop on my three city Asian tour and, from a personal perspective, the most exciting one as I’d never visited mainland China before.

China’s seemingly inexorable economic rise has been well documented. In the last 20 years, China’s GDP growth has averaged over 9%. As I travelled from the airport into Beijing’s central business district I saw few older buildings. Virtually everything, including the roads, looked as if it had been built in the last 10 years.

The Chinese stock market is big. In terms of the total number of dollars traded, the combined size of the two stock exchanges, Shanghai and Shenzhen, is approximately double that traded in the next biggest Asian market, Japan. The increase in stock trading has been very rapid. Trading volumes on Shanghai and Shenzhen have risen by approximately 20 fold in the past 5 years, although there has been significant volatility in this rise. 

The domestic algorithmic trading market is nascent. Currently, intra-day trading in company shares is not allowed. It is the recently established futures markets therefore where algorithmic and high-frequency trading are taking place. No figures exist on the proportion of trading done algorithmically in China currently, but I’m going to estimate it at 5%.

I was in Beijing to participate in the first capital markets event Progress has held there. Although Shanghai is the finance capital of China, we chose to hold the event in Beijing to follow up on previous work we'd done there. In the end, we had about 60 people from domestic sell-side and buy-side firms attending, which was a great result considering the relatively low profile Progress has at present in this market. There was optimism and an expectation that algorithmic trading had a bright future in China.

I believe it's a practical certainty that the Chinese market will adopt algorithmic and high frequency trading. In every developed market a high, or very high, proportion of trading is done algorithmically and, although different regulations and dynamics make each market unique, nothing except an outright ban will prevent widespread adoption in every market in time. Liberalisation in China is occurring. For example, stock index futures are now traded, exchanges are supporting FIX, short-selling has been trialled and it is now easier for Chinese investors to access foreign markets. Also, earlier this year, the Brazilian exchange, BM&FBovespa, and the Shanghai exchange signed an agreement which may result in company cross listings. Only some of these changes support electronic trading growth directly but all are evidence that the liberalisation necessary to support such growth is happening. Inhibitors remain: no intra-day stock trading, restrictions on foreign firms trading on Chinese markets thus preventing competition and knowledge transfer from developed markets, and tight controls on trading in Renminbi. The Chinese regulators will continue to move cautiously. 

The question is not if, but when. We expect to sign our first Chinese customers soon. China is becoming a very important blip on the radar. 

 

Wednesday, June 15, 2011

The rise of algo trading in Asia - first stop Mumbai

Posted by Giles Nelson

I’ve just completed a three city Asian tour, in Mumbai, Tokyo and Beijing. The purpose was to promote Progress’s business in capital markets, in particular both Apama and our Responsive Process Management suite. I’m going to give my impressions on those markets – first up is Mumbai and the Indian market for electronic trading.

Before I start with India though, I'd like to share an interesting piece of academic work I came across. We do hear a lot about the rise of Asia economically, particularly the "big two" of India and China.  Many predict that China will become the world's biggest economy in the next 10 years and that India's population will exceed China's by mid-century. Both economies are predicted to grow by between 8% and 10% in 2011. The predicted long-term shift in relative economic fortunes is nicely illustrated by some work done at the London School of Economics in 2010, a graphic from which is shown below.

  [Chart: the shifting global economic centre of gravity, 1980-2040]
© Danny Quah, London School of Economics, October 2010

 

Professor Danny Quah calculated "the average location of economic activity across geographies on Earth". Put another way, this is the "economic centre of gravity", which sat in the mid-Atlantic in 1980 and is predicted to be in the middle of Asia by 2040. The economic power shift is on.
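
The calculation behind that phrase is, in essence, an output-weighted average of locations on the globe. The toy version below (with made-up weights for three regions) shows the idea; Quah's actual study uses output data for many hundreds of locations.

    import math

    def economic_centre_of_gravity(regions):
        """GDP-weighted average location on the globe, computed by weighting
        unit vectors in 3D and projecting the result back to lat/long.

        regions: list of (latitude_deg, longitude_deg, gdp_weight)
        """
        x = y = z = total = 0.0
        for lat, lon, w in regions:
            la, lo = math.radians(lat), math.radians(lon)
            x += w * math.cos(la) * math.cos(lo)
            y += w * math.cos(la) * math.sin(lo)
            z += w * math.sin(la)
            total += w
        x, y, z = x / total, y / total, z / total
        lat = math.degrees(math.atan2(z, math.hypot(x, y)))
        lon = math.degrees(math.atan2(y, x))
        return lat, lon

    # Illustrative weights only, not actual GDP figures:
    regions = [(40.0, -100.0, 15.0),   # North America
               (50.0, 10.0, 16.0),     # Europe
               (30.0, 105.0, 12.0)]    # East Asia
    print(economic_centre_of_gravity(regions))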

Back to my trip, and my first stop in Mumbai, the commercial and financial capital of India. I wrote on the growth of electronification and algorithmic trading in India last July. Since then the market has progressed. Progress has acquired its first clients in capital markets in India using Apama for trading, and the market itself has evolved and liberalised. Last year I criticised the biggest stock exchange, the National SE, for its "algo approval process". I thought this was a poorly structured attempt to protect the markets from "rogue algos" and would stymie development. I'm pleased to say that the NSE has now relaxed this policy (the latest version of which can be found here). A further significant development is that both equity exchanges, the NSE and the Bombay SE, now allow smart order routing from co-location facilities at each exchange to one another. This was previously not allowed by the NSE. There is also evidence that the adoption of algorithmic trading is changing the way the market works - manual arbitragers are heading towards extinction and arbitrage opportunities in general are becoming more difficult to find - evidence that information flow between markets is becoming more efficient.

Together with CNBC, Progress held a well-attended event for the industry in Mumbai. The highlight was a panel session with the deputy CEO of the BSE, a senior member of the Indian securities regulator and a practitioner from a buy-side firm participating. There was a consensus that the continued development of algo trading was welcome in India - bringing technological innovation, putting pressure on costs, bringing liquidity and more competition. There is some caution, particularly when it comes to the unfettered use of algos. Reference was made to recent talk in the US and Europe about introducing algo "inspection" as justification for that caution. As I've said previously, algo inspection is inherently flawed - it is far better for markets to be protected through good pre-trade risk checks and real-time market surveillance (as discussed at length recently on this blog). The panel acknowledged that real-time surveillance was key for markets to operate well.

Despite the Indian market progressing, there are still challenges. There is obvious enmity between the two stock exchanges, which should lessen if and when the organisation behind the Multi Commodity Exchange (MCX) receives an equity trading license, something which is expected soon. The granting of such a license and the introduction of more competition into the market can only be a positive move. Trading costs for equities in India are still high. There are two equity clearers, each owned by one of the two exchanges, which do not interoperate; and foreign financial institutions need both a license from the regulator and a local Indian legal entity to trade - requirements which inhibit foreign firms from entering the market and thus reduce competitive pressure on domestic firms.

In my view one change that India will see in the coming years is significant broker consolidation. Currently there are around 3000 brokers in the Indian market. Many of these are small and, in my opinion, will not survive in a market where access to technology will be needed to compete. Many will therefore go out of business or be forced to merge.

The market for algo trading in India is growing. Although it hasn't reached a tipping point yet, it shows every promise of becoming one of the most important markets in Asia.

Next stop was Tokyo for Tradetech Japan 2011. I’ll talk about that tomorrow.

 

 

Tuesday, June 14, 2011

Keeping the Train on Track

Posted by Dan Hubscher

It has been said that regulators are struggling to keep up with high frequency and algorithmic trading because they have outdated methodology and technology. This battle has been likened to trying to chase a Ferrari on a bicycle.

But what happens when the regulators are constantly changing the rules, and the Ferraris are turning in the wrong direction at high speed? I liken this to regulators trying to switch tracks while your HFT train is barrelling along at 100 mph. If you do not protect yourself from high speed changes, you might find yourself thrown off the tracks.  

The ability to respond to both regulatory change and split-second market anomalies can make the difference between emerging from the global financial crisis as a leader or as the firm that has to manage the aftermath of a messy derailment.

Imminent, sweeping regulatory reforms are not the only issues that firms have to grapple with.  Market structure changes underway in both the USA and Europe - from exchange mergers to regulatory "fine tuning" such as market maker quoting rules, circuit breakers, and limit up/down rules - are already completely changing the game, and the playing field, and firms have to adapt to them. Cross-asset, cross-border trading is proliferating and creating new opportunities for arbitrage trading strategies that can throw a cross-asset "splash crash" into the mix.

Or, as Tabb Group told Advanced Trading: "strategies that go beyond speed, and emphasize 'cross-asset, cross-regional multi-temporal, asymmetric versus symmetric trades, even enhanced front-to-back automation'" are on the way. This means that real-time visibility - for spotting trading abuses, market anomalies and operational errors - is necessary on a global, cross-asset, 24/7 basis to remain in regulatory compliance and mitigate reputational damage.  

Also, when regulations change you need the flexibility to change your systems to match and manage them; flexibility is key. Financial firms are constantly on the move, changing trading strategies and products to stay competitive. New regulations can throw new strategies out of compliance - the tracks keep changing.

And what about looming regulations beyond today’s mandates?   The rush to real-time trade reporting of swaps, for example, is causing some consternation among market participants because it may impede trade flow, according to Operational Risk and Regulation.  Real-time reporting is intended to help in the fight to avoid market abuse and as an early warning system to detect systemic risk. But T0 may be overkill, especially if monitoring and surveillance tools are in place.

In the Risk article, Frederic Ponzo, managing partner at technology consultants GreySpark Partners, said: "The real benefit of real-time surveillance is with identification of fraudulent activities, market manipulation and errors."

 And if you could put compliance in control by building surveillance detection and workflows on your own terms, so much the better. Because every firm is unique, brokers need to customize their case management systems for efficient incident investigations, rapidly and continuously, in addition to customizing real-time abuse detection and market monitoring scenarios. 
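
In its simplest possible form, "detection on your own terms" might mean keeping the thresholds of a scenario in plain configuration so compliance can retune them per desk without a development cycle. The parameters and the rule below are invented purely for illustration.

    # Hypothetical per-desk configuration: compliance can change these numbers
    # without touching the detection logic itself.
    SCENARIOS = {
        "rates_desk":    {"max_order_to_trade_ratio": 50, "min_orders": 200},
        "equities_desk": {"max_order_to_trade_ratio": 20, "min_orders": 500},
    }

    def excessive_messaging(desk, orders_sent, trades_done, config=SCENARIOS):
        """Flag a desk whose order-to-trade ratio exceeds its configured limit."""
        params = config[desk]
        if orders_sent < params["min_orders"]:
            return None                       # too little activity to judge
        ratio = orders_sent / max(trades_done, 1)
        if ratio > params["max_order_to_trade_ratio"]:
            return {"desk": desk, "ratio": round(ratio, 1), "alert": "excessive messaging"}
        return None

    print(excessive_messaging("equities_desk", orders_sent=12_000, trades_done=300))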

By gaining visibility to potentially abusive and erroneous trading activities, with the flexibility to adapt to new trading patterns and regulations, firms can protect themselves and their clients from market risks and not run afoul of shifting market regulations.  They can quickly pinpoint threats and tailor responses without disruption, maintaining regulatory compliance now and into the future - effectively changing tracks at 100mph or more. 

-Dan

 

Monday, June 13, 2011

Tickets Please: Technology to Keep You on the Train

Posted by Dan Hubscher

The ticket to preventing and deterring rogue trading could well be technology.  Although most financial services firms have some form of surveillance and monitoring technology in place, it isn't good enough to keep them from getting kicked off the regulation train.

Financial services firms risk running afoul of new regulations because their technology is not the "right" technology anymore. The burning question now is - what will the "right" technology be, in an unpredictable future?

Detecting, preventing and deterring market abuse can only be effective when it permeates financial services activities from pre-trade to settlement. The number of different places that trading activity occurs is constantly increasing. Trading can be done at the office, or via cell phone. Or a trader can begin to work a deal at the office, go for lunch and finish it via instant messaging with his broker.

Surveillance is necessary in order to provide transparency in trading activity, whether it is via formal trading platforms, using an instant messenger platform, e-mails, Twitter or other social media sites, or even old-fashioned phone conversations. Compliance officers need to have full visibility in order to spot and prevent abusive trading activity - and that vision has to encompass it all; every message, every trade, every conversation, every Tweet has to be recorded, taped and downloaded into a database for on-the-spot or future scrutiny.

The technology of yesterday will not be able to cope with the audit trail of today. Those audit trails also need to be produced in real time, not just by looking back over history. This means that current methods, employing historical analysis of already-old data, just won't do. Analysis has to be done both in real time and historically in order to make sense. It has to span asset classes including cash equities, interest rates, swaps, commodities and OTC derivatives - cleared or not. Silos can no longer exist in terms of monitoring; trading today is truly democratic, crossing borders, asset classes and currencies.
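
One way to avoid those silos is to normalize everything - orders, trades, instant messages, even phone-call records - into a single audit record before it is stored, so the same queries serve both real-time alerts and historical reconstruction. The schema below is purely illustrative.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class AuditEvent:
        """One normalized record for anything a compliance officer may need
        to replay: a trade, an order, an IM, an email or a phone call record."""
        channel: str              # e.g. "order", "trade", "im", "email", "voice"
        asset_class: str          # e.g. "equity", "rates", "fx", "commodity"
        trader: str
        detail: dict
        timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    audit_trail = [
        AuditEvent("im", "rates", "trader42", {"text": "can you work 100m at 2.02?"}),
        AuditEvent("order", "rates", "trader42", {"instrument": "IRS_10Y", "qty": 100_000_000}),
    ]
    # The same store then serves both real-time alerts and historical reconstruction:
    print([e.channel for e in audit_trail if e.trader == "trader42"])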

New market abuses seemingly proliferate by the day. Some are really the old ones - only done faster (like front running), but there are fresh ones too. Just last week the SEC suspended trading in 17 OTC microcap stocks because of doubts over the publicly available information on the companies.  Here, investigators from different offices and working groups used “a coordinated, proactive approach to detecting and deterring fraud.”

Packaged applications cannot handle new rules or monitor new types of market abuses. Add flash crashes, mini flash crashes, cross-asset crashes (we call these "splash crashes") to the mix and a picture starts to reveal itself. In this picture there are Chief Compliance and Technology Officers handing the regulatory conductors their tickets to prove that they have the right technology, and then getting kicked off the train because they have the wrong tickets.

Flexible, extensible surveillance and monitoring technology is the top-up fare needed to stay on the train. If you can see every move your traders make today, you can take control. If you can see every move your traders make down the line, you will stay in control.  A real-time platform that can handle the massive, increasing volumes of transactions and events in today's electronic marketplaces, and handle the rules of tomorrow’s, is imperative to staying on top of rapidly changing regulations. 

-Dan

 

Friday, June 10, 2011

PLUS Stock Exchange Making Progress Around Regulation

Posted by Richard Bentley

Changes in financial regulation will inevitably increase the importance of market surveillance for exchanges as well as the regulators. Having asked our experts their opinions on how the changes in approach will affect the market in general, we are now looking at the impact on exchanges themselves, and how they can best prepare to serve their customers and remain compliant through this time of change.

In the last of our short videos, James Godwin and Tony Harrop from PLUS Stock Exchange tell us about how the Progress Apama solution is supporting PLUS Stock Exchange. In particular, we ask, how important is it to have market surveillance technology that can be flexible as new regulations are implemented?

Here are the other three videos that were part of this four-part series:

 

Thursday, June 09, 2011

All Change: When to Prepare for New Regulations

Posted by Dan Hubscher

As financial institutions bemoan the uncertainty still hovering over Dodd-Frank implementation and possible delays, there are steps they can take to prepare for the new rules even before the ink dries. Otherwise, compliance can cause major disruptions to their business operations.

Brokers, in particular, cannot afford to wait to protect themselves and their clients from market risks, and from running afoul of shifting market regulations. Attracting new clients - and retaining existing clients - will depend increasingly upon whether a broker has measures in place to protect a client's interests.

Dipping an unprotected toe into a market where a flash crash can happen at any moment can be frightening. This is not scaremongering, it is the market today; insider trading and market manipulation can happen within the best of trading firms.  And algorithms allowed to run without proper controls can take a company's balance sheet from black to red, or worse. The debate around the notion of regulators reviewing algos before they go to market (see Larry Tabb’s article and resulting commentary here) is a clear indication of nervous market sentiment (no Twitter trading analytics required). 

Regulators are laying miles of new tracks (rules) for high-speed and heavy freight trains running through the electronic trading frontier, and they are preparing to make sure the trains stay on the tracks. Trade monitoring, auditing, and abuse prevention requirements abound throughout the proposed rules, including recent efforts to detect market manipulation. But there is little reason for financial firms to wait for strict definitions of what “sufficient” measures are in a market that continuously gives us examples of what to detect, prevent, and deter.  

By gaining real-time visibility to potentially abusive and erroneous trading activities, brokers can quickly pinpoint threats.   But brokers need to constantly adapt detection scenarios to new threats; and they also need to tailor their responses and modify them without disrupting their trading operations.  Responsiveness is more than lightning-quick reflexes.  Flexibility is also key; and is what will transform regulatory compliance into competitive advantage now and into the future.

Despite complaints, media interest and suggestions to the contrary, algorithms and high frequency trading are not going away. When problems such as crashes or abuse occur it is partly because regulators have not yet had the chance to get a uniform, industry-wide grip on how long-standing underlying market practices manifest in the new, high-speed environment, and how compliance departments should monitor them.

HFT is not necessarily the culprit; market structure must also be questioned.   High-frequency trading shops often act as de facto market makers. During the flash crash some HFTs' algorithms sensed a problem and pulled out.  If HFTs have replaced traditional market makers, why have the market-making obligations not carried over?  And should they?  Now we must ask who - if anyone - should have an obligation to stay in the game when things go wrong, and the eventual answer will change market participants' business models.

Human beings continue to possess attributes that computers cannot. But the human judgment to slow things down in times of stress, or to provide real liquidity in a two-sided market - not just volume - is a decision that can be automated. This is necessary, in fact, because humans can't step in fast enough in today's hyper-speed markets. Therefore, financial organizations have no choice but to use technology and applications that ensure compliance, though not at the expense of efficient trading operations. This kind of technology enables firms to be compliant in new ways, including transparency and reporting.

The ability to detect abuse and operational errors in real time, along with the flexibility to modify scenarios in response to new conditions, while staying ahead as regulations change, differentiates a broker competitively. The buy side wants safety in the marketplace, and it is up to the sell side to make sure their buy side customers feel secure.

New regulations bring new headaches to organisations as they have to add or change both applications and operations to comply. Having the ability to sense threats, and to respond in real-time, is just the compliance “price of entry” to the market today.  Being able to comply with new mandates quickly is the differentiator, and there will be no shortage of surprises in store as the regulatory trains rumble towards an unknown destination. 

-Dan