Market Surveillance

Monday, August 13, 2012

Bringing Order to Machine-led Chaos

Posted by The Progress Guys

Editor’s Note: the following post was written by Theo Hildyard, Solutions Architect, Market Surveillance at Progress Software, and was originally published on TABB Forum.


Since the May 6, 2010 flash crash, the issue of out-of-control machines trading on global stock markets has made headlines over and over. Just last week a US market maker, Knight Capital, nearly blew itself up with a rogue algorithm. The calls for regulation are growing louder, and regulators globally are struggling to bring order to an automated marketplace that is increasingly chaotic.

In India, the Securities and Exchange Board is considering imposing speed limits on high-frequency trading. The Hong Kong Securities and Futures Commission's CEO is very keen to regulate HFT and proposes that algorithms be tested annually. Australia's Securities and Investments Commission (ASIC) wants automated trading systems tested. In Europe, the European Securities and Markets Authority (ESMA) is preparing to crack down on every aspect of automated trading, from algorithms to CDS to short selling. And in the US, the Securities and Exchange Commission is tightening rules on automated trading systems and programs, with Knight Capital having added to the urgency.

Machines trade anywhere from 25% (Australia) to 70% (US) of the volume on stock exchanges. The opportunity to make money depends upon the speed of your trading systems along with the intelligence of your algos. Algorithmic innovation is critical if high-frequency trading firms are to find an edge. Research by the Aite Group suggests that the average lifespan of a trading algorithm can be as short as three months. With such a small window of opportunity, trading firms must design, test and deploy new algos on an almost continual basis. No wonder there are problems.

When we allow machines to make the decisions for us, it is imperative that we design them to be fail-proof. Testing in a non-production environment should be mandatory, and back-testing should be exhaustive. Poor-quality due diligence and quality assurance are producing catastrophic consequences. It is our responsibility to ensure that our machines, or robots if you will, do no harm.

I am reminded of author Isaac Asimov's first law of robotics: "A robot may not injure a human being or, through inaction, allow a human being to come to harm." The 'harm' in Rule #1 is happening to the marketplace. The flash crash wiped a trillion dollars off of US-listed firms' market capitalization. Knight Capital's rogue algo wiped $440m off its balance sheet and forced it to look for backers in order to survive.

If algorithms and trading systems were programmed with Asimov-style parameters, there would be far fewer glitches. But even if you are the most conscientious firm out there, you cannot ensure that your counterparties have also programmed and tested their systems and algos thoroughly. And it remains your responsibility, to your customers, staff and shareholders, to ensure that those counterparties do not do any harm to your bottom line or reputation.

Catastrophe can only be avoided by adding an extra layer of control in the trading process: a layer that monitors counterparties for rogue algos or fat-fingered trades. That way you have both belt and braces - control over internal trading systems and awareness of external ones. Yes, there will be a tiny bit more latency. But isn't a small latency hop better than bankruptcy?
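
As an illustration of what such a monitoring layer might look for, here is a minimal sketch in Python. It is not a description of any particular product; the thresholds, the one-second window and the order fields are assumptions made purely for the example.

```python
# A minimal sketch of a counterparty control layer: flag order flow that looks
# like a rogue algorithm or a fat-fingered trade. The thresholds, the one-second
# window and the order fields are assumptions made for this example only.
import time
from collections import defaultdict, deque

MAX_ORDERS_PER_SECOND = 500       # assumed per-counterparty message-rate limit
MAX_ORDER_NOTIONAL = 5_000_000    # assumed single-order notional cap

class CounterpartyMonitor:
    def __init__(self):
        self.recent_orders = defaultdict(deque)   # counterparty -> recent order timestamps

    def check_order(self, counterparty, price, quantity, now=None):
        """Return a list of alerts raised by this order (empty if it looks normal)."""
        now = time.time() if now is None else now
        alerts = []

        # Fat-finger check: a single order that is implausibly large.
        notional = price * quantity
        if notional > MAX_ORDER_NOTIONAL:
            alerts.append(f"{counterparty}: order notional {notional:,.0f} exceeds cap")

        # Rogue-algo check: the message rate over the last second is abnormal.
        timestamps = self.recent_orders[counterparty]
        timestamps.append(now)
        while timestamps and now - timestamps[0] > 1.0:
            timestamps.popleft()
        if len(timestamps) > MAX_ORDERS_PER_SECOND:
            alerts.append(f"{counterparty}: {len(timestamps)} orders in the last second")

        return alerts

monitor = CounterpartyMonitor()
print(monitor.check_order("BROKER_A", price=50.0, quantity=200_000))  # triggers the fat-finger check
```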

 

Thursday, July 19, 2012

Regulation: who’s in charge?

Posted by Richard Bentley

Exactly how can you enforce rules when there’s nobody in charge? This was a question that reared its head at the recent International Derivatives Expo (IDX) in London after a statement from David Lawton, Acting Director of Markets at the Financial Services Authority (FSA), who concluded his keynote address by imploring the industry to ‘step up’ and take responsibility.

The clear implication of Mr. Lawton’s speech was that it is the role of regulatory bodies such as the FSA to provide, in some cases, very detailed guidance, but the responsibility of those within the industry to implement it. To my mind this raises a number of questions – not least around how this should happen and, perhaps more importantly, whether these guidelines are actually enforceable without a regulatory body policing market activity.

If the onus is indeed placed on the industry to self-regulate, the first barrier that will need clearing is for all market participants to agree that it’s in their best interests to ensure the regulations are adhered to. Only then can we focus our attention on how guidelines such as those announced by The European Securities and Markets Authority (ESMA) regarding the systems and controls required in automated trading environments are followed.

That some form of self-regulation and enforcement is required is, however, not in doubt. Would you drive a car without brakes and then blame the traffic cops for not slowing you down when the inevitable crash occurs? Participants need to take responsibility for policing their own activities, by deploying the same kinds of real-time controls and surveillance capabilities as the venues and the regulators themselves. Brokers and banks often have more visibility of their clients’ trading activities and positions than any one venue or indeed national regulator. (The recently announced plans for a Consolidated Audit Trail in the US will do nothing to change this, given that the identity of the beneficiary at the end-point of each trade is not identifiable from the CAT.)

Increased use of pre- and post-trade surveillance tools will not only help the industry to, as Mr. Lawton suggests, ‘step up’, but could also restore some of the faith in the markets that has been lost in recent months and years. In the absence of any single authority capable of enforcing the rules, it is in the industry’s best interests that market participants fill the gap.

Wednesday, June 13, 2012

Therapy for Toxic FX Order Flow

Posted by Dan Hubscher

As high-frequency and algorithmic trading infiltrate foreign exchange markets, some of the problems that dog equities, such as high rates of order cancellation, are arising.

Equities markets, which have seen HFT and algo trading go through the roof, have recently started clamping down on excessive and cancelled orders. As my colleague recently explored, Deutsche Börse, Borsa Italiana, NASDAQ and Direct Edge have all announced intentions to reduce the number of cancelled orders they receive. They will encourage the "good" liquidity - those players with high fill ratios - and punish the "bad".

The IntercontinentalExchange has already seen good results from a policy it implemented last year aiming to discourage "inefficient and excessive messaging without compromising market liquidity." Regulators, too, are taking note; the SEC is considering charging HFT firms for cancelled trades.

A combination of economic incentives and real-time controls makes this happen. In addition to adjusting their rebate schemes, exchanges must monitor their market makers in real time to make sure that they are living up to their quoting obligations. This monitoring can also include spotting the "Stupid Algos" blamed for generating a burden the exchanges cannot bear.

It was only a matter of time before other asset classes started to see similar problems with excessive orders, and a similar response via a new generation of intelligent “sensing” algos – but with a twist.

FX is increasingly traded by computers. Consultancy Aite Group said in a report last year that FX algorithms will account for more than 25% of FX trade volume by the end of 2014. And as algorithms take control, the opportunity for a flood of quotes and cancellations increases. High order-to-trade ratios - the number of orders sent compared with the number filled - create a load on exchanges and electronic markets, and they can provide a smokescreen to hide potentially abusive behavior (so-called "quote stuffing").
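
As a rough illustration of the metric, the sketch below keeps a running order-to-trade ratio per participant; the threshold is an assumption chosen purely for the example, not a regulatory figure.

```python
# A minimal sketch: track order-to-trade ratios per participant so that
# quote-stuffing-like behaviour (many orders, very few fills) stands out.
# The threshold is an assumption chosen for the example.
from collections import defaultdict

class OrderToTradeTracker:
    def __init__(self, ratio_threshold=100.0):
        self.orders = defaultdict(int)   # orders sent per participant
        self.fills = defaultdict(int)    # orders filled per participant
        self.ratio_threshold = ratio_threshold

    def on_order(self, participant):
        self.orders[participant] += 1

    def on_fill(self, participant):
        self.fills[participant] += 1

    def flagged(self):
        """Participants whose order-to-trade ratio currently exceeds the threshold."""
        return {p: sent / max(self.fills[p], 1)
                for p, sent in self.orders.items()
                if sent / max(self.fills[p], 1) > self.ratio_threshold}

tracker = OrderToTradeTracker(ratio_threshold=50)
for _ in range(500):
    tracker.on_order("HFT_1")
tracker.on_fill("HFT_1")
print(tracker.flagged())   # {'HFT_1': 500.0}
```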

We see innovative FX brokers taking measures to rein in unproductive order flow.  Similar to equities marketplaces, FX dealers and brokers are increasingly utilizing tactics that discourage excessive orders, but in a very different way. Because FX is mainly traded via single dealer platforms, multi-dealer platforms such as FXall, and interdealer marketplaces, it is fragmented in a different way from equities. 

So it is the FX brokers that are acting like exchanges and taking the initiative to control toxic order flow with their pricing strategies. Brokers need to see every opportunity and threat hidden in their customers’ flow patterns, and automate their own real-time responses, to stay profitable as markets change. Brokers servicing HFT clients react to predatory algorithms and fluctuating fill ratios by manipulating the spreads they offer. Traditional customer profiling based on purely historical data is good for strategic decision-making. But for more tactical decisions with immediate impact, real-time analysis is also required.

A responsive broker can, for example:

  • Mitigate "toxic flow" by detecting predatory patterns in real-time, and automatically widening spreads to those clients
  • Increase business by detecting reduction in flow from “good” clients, and automatically reducing spreads to those clients
  • Preserve the relationship by detecting pending credit breaches, and immediately calling the client
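
Tying the first two of those rules together, a minimal sketch of the logic might look like the following; the fill-ratio thresholds and spread multipliers are illustrative assumptions, not a description of any particular broker's pricing.

```python
# A minimal sketch: set the spread offered to a client from its recent fill ratio,
# widening for flow that looks predatory and tightening for "good" clients whose
# flow is drying up. All thresholds and multipliers are illustrative assumptions.
BASE_SPREAD_PIPS = 1.0

def spread_for_client(fill_ratio, recent_flow, normal_flow):
    """fill_ratio: fills / orders over a recent window, between 0 and 1.
    recent_flow, normal_flow: notional traded recently vs. the client's usual level."""
    spread = BASE_SPREAD_PIPS
    if fill_ratio < 0.2:
        spread *= 3.0      # mostly cancels: treat as potentially toxic and widen
    elif fill_ratio > 0.8 and recent_flow < 0.5 * normal_flow:
        spread *= 0.8      # good client going quiet: tighten to win back the flow
    return spread

print(spread_for_client(0.1, recent_flow=10e6, normal_flow=10e6))   # 3.0 pips
print(spread_for_client(0.9, recent_flow=3e6, normal_flow=10e6))    # 0.8 pips
```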

Our customers use the Apama platform to perform their own customer flow analysis. Both global and regional FX brokers now optimize how they serve their customers based on detailed real-time diagnosis of their flow. Key parameters include P&L on individual trades, an aggregated view of individual trades over time, and the performance of tiered client groups. Using real-time customer flow analysis, brokers (and banks and trading platforms) can figure out which customers are providing the types of order flow that they need.

Customer flow sits alongside other real-time market trend analytics such as volatility, average daily volume, and depth of book. For example, if flow from a specific customer is high but liquidity is thin, then time of day affects spreads in addition to customer behavior. Our customers have also been generating pricing dynamically - adjusting spreads and skews - based on market conditions and customer trading patterns, including HFT patterns.

Dynamic pricing builds on an aggregated order book as source pricing.  A basic pricing service dynamically applies a set spread to the base price generated from the aggregated book.  A more advanced service changes the spread based on any data or rule, for example:

  • Current volatility
  • Depth of book (volume on bid/ask side)
  • Real-time risk parameters such as profit/loss levels
  • News
  • Current vs. target position (changes the spread or skew automatically, and updates the auto-hedger service)
  • Customer tier
  • Historical & real-time customer trading behaviour
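
As a rough sketch of how a rules-based pricing service might combine a few of those inputs, the function below starts from the mid of an aggregated book and adjusts the spread and skew; the rules, weights and tiers are assumptions invented for the example.

```python
# A rough sketch of rules-based dynamic pricing: start from the aggregated book,
# then adjust the spread for volatility, intraday P&L and client tier, and skew
# the quote toward the target position. All rules and weights are assumptions.
def price_quote(agg_book, volatility, pnl_today, position, target_position, client_tier):
    mid = (agg_book["bid"] + agg_book["ask"]) / 2.0
    spread = agg_book["ask"] - agg_book["bid"]

    spread *= 1.0 + volatility                        # wider in volatile markets
    if pnl_today < 0:
        spread *= 1.25                                # defensive when the desk is down
    spread *= {1: 0.8, 2: 1.0, 3: 1.3}[client_tier]  # tighter for top-tier clients

    if position > target_position:
        skew = -0.1 * spread   # too long: shade quotes down so clients are more likely to buy from us
    elif position < target_position:
        skew = 0.1 * spread    # too short: shade quotes up so clients are more likely to sell to us
    else:
        skew = 0.0

    return mid - spread / 2 + skew, mid + spread / 2 + skew

bid, ask = price_quote({"bid": 1.2500, "ask": 1.2502}, volatility=0.5,
                       pnl_today=-10_000, position=20e6, target_position=0,
                       client_tier=2)
print(f"quote: {bid:.5f} / {ask:.5f}")
```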

Brokers can take input including aggregated FX prices, customer trading patterns, market volatility and hedging activity - all in real time - into the platform. The analysis generates dynamic pricing (spreads/skews) and it can work to incentivize market participants to provide quality - not quantity - orders. 

Toxic order flow, like excessive orders-to-trade, can tax trading systems and create an environment where fraud and market abuse can flourish. Using real-time customer flow analysis to get a handle on your customers' order flow will help to prevent this. Customer flow analysis can be used not only for dynamic pricing, but also for customizing product offerings and enabling banks and brokers to create execution algorithms for their clients to use. By being proactive, FX brokers and banks can avoid the issues that plague equities. And make money along the way.

 

Friday, May 11, 2012

Automated trading restrictions: are they a presumption of guilt?

Posted by John Bates

Anyone who’s seen the news in recent months will know that high-frequency trading is facing a sharp increase in the number of regulatory challenges, with some tough measures suggesting that it has been presumed guilty until proven innocent by many. ESMA's guidelines on the systems and controls required in an automated trading environment, which came into force in Europe in May 2012, are the latest example. But are these regulations fair?

It seems clear that, with the ever-increasing volumes of data that firms need to manage and monitor in order to catch abuse, Europe has decided to take a firm stance on automated trading. But is all this a case of, as my colleague Richard Bentley suggests, using a sledgehammer to crack the nut?

Clearly, increasing red tape will place a significant burden on firms and may, if we’re not careful, lead to a situation of regulatory arbitrage, or lock those without deep pockets out of the market. Perhaps a better answer is a three-layered approach to surveillance, in which brokers, trading venues and regulators each have a different role to play - one that would help stamp out abuse without necessarily stifling innovation.

On a recent visit to London, I met with Phillip Stafford at the Financial Times Trading Room to discuss EU market abuse regulations as can be seen in the video here.

 

Tuesday, May 01, 2012

Today in Event Processing

Posted by The Progress Guys

In “Cracking the High Frequency Trading Nut”, Richard Bentley discusses the effectiveness of the new guidelines from the European Securities and Markets Authority. He compares the guidelines to a sledgehammer, questioning whether such extreme measures are necessary to regulate HFT. Would a more precise approach, which targets specific issues with HFT and offers real-time surveillance, better regulate automated trading? Find out in yesterday’s post.


Tuesday, April 24, 2012

Today in Event Processing

Posted by The Progress Guys

In his blog post “BRICS Win by Coming in Second”, Richard Bentley explains how emerging markets benefit by coming in second when it comes to high frequency trading. By looking at the first-comers, namely the U.S. and Europe, Bentley highlights how regulators were not prepared, resulting in unsafe practices. 


 

Thursday, April 05, 2012

Today in Event Processing

Posted by The Progress Guys

In “Solving the Cross-Market Surveillance Conundrum”, Theo Hildyard offers his thoughts and insights from the World Exchange Congress, where he discussed the MiFID directive. Although MiFID was created to protect customers of investment services, the market has been manipulated and abused since the directive was first announced in 2007. Based on his conversations with other attendees at the April event, Theo gives his thoughts on how to better regulate and protect the market.


Thursday, December 01, 2011

Today in Event Processing

Posted by The Progress Guys

Dr. John Bates lists his top 9 predictions for the financial markets in 2012. No. 1: a financial institution will take a billion-dollar hit. The list focuses in particular on the effect of regulations. To find out what else is in store for regulation, fraud and market manipulation, check out the full post.


Friday, November 11, 2011

Can market surveillance help to keep traders on track?

Posted by Richard Bentley

By Richard Bentley, Vice President, Capital Markets, Progress Software

There’s no doubt that today's high-speed capital markets and cross-product, cross-market trade volumes mean regulation struggles to keep up with changes in the market. MiFID II is an example of a financial regulatory directive that is seen by many as lacking real detail and remaining open to interpretation - and misinterpretation. In a panel discussion at the TABB Group Trading Surveillance event in London last Wednesday evening, industry experts agreed that, in Europe at least, few financial services firms are afraid of regulators.

So, as many new regulations remain woolly, ignored or yet to be implemented - or, in the case of ESMA (the European Securities and Markets Authority), the regulation amounts to statements of clarification - the panel was asked how surveillance and risk are going to be managed moving forward. Questions were also raised about the regulatory burden in the future and whether those outside the "Big Five" would be able to resource the demands of growing compliance departments. Will this lead to an uneven playing field?

According to TABB Group, new compliance costs are estimated at between 512 and 732 million euros, with ongoing costs of between 312 and 586 million euros. But while regulators are still determining what regulation will look like, the need for market surveillance is undiminished. Traders made about 13.3 billion euros ($18.2 billion) from market manipulation and insider dealing on EU equity markets in 2010, according to an EU Commission study. With some arguing that firms can only do so much to survey markets themselves as trades cross multiple brokers and gateways, the panel discussed the need for fragmented market data to be brought together in a consolidated tape, with surveillance performed at an aggregate, market-wide level.

With respect to high-frequency trading, there was discussion and agreement that pre-trade checks should be built in and that regulators should be feared, as they are in some Asian markets, where market participants adopt a mindset that constantly asks "will I be allowed to trade today?". That "fear factor" is key, and there isn't yet fear of regulation in Europe.

The timeliness of market surveillance was also discussed, with the panel suggesting that transactions should be monitored not only retrospectively but also in real time, as they happen. Clearly, there’s still a role for historical analysis of the market, as some abuse takes place over an extended period of time and new abuse scenarios are discovered which can then be applied to historical data. It’s a little like having your DNA stored on file for a time in the future when forensic techniques improve. But there is also no doubt that real-time surveillance to spot manipulation as it happens can be a significant factor for organisations looking to protect themselves and the market, which is one of the reasons it is mandated by Dodd-Frank and MiFID II.
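
As a small illustration of that point about re-use, the same detection rule can be fed a live stream or replayed over stored history; the "burst of cancels" rule and the event format below are simplified assumptions, not a real abuse model.

```python
# A minimal sketch: write the detection rule once, then run it over a live event
# stream or replay it over stored history when a new abuse scenario is defined.
# The rule (an implausible burst of cancels) and event format are simplified assumptions.
def detect_cancel_bursts(events, burst_size=100):
    """events: iterable of (trader, event_type) tuples in time order; yields alerts."""
    cancel_streak = {}
    for trader, event_type in events:
        if event_type == "cancel":
            cancel_streak[trader] = cancel_streak.get(trader, 0) + 1
            if cancel_streak[trader] == burst_size:
                yield f"alert: {trader} reached {burst_size} cancels without a fill"
        else:
            cancel_streak[trader] = 0

# The same rule applied retrospectively to archived events...
historical_events = [("T9", "cancel")] * 150
for alert in detect_cancel_bursts(historical_events):
    print(alert)

# ...and, unchanged, to whatever live feed the surveillance system provides.
```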

Finally, the panel discussed how turbulent markets and highly publicised breaches of banking controls have demonstrated the importance of protecting market integrity. So while an increase in the complexity of market surveillance inevitably leads to an increase in cost, the panel felt that the punitive and reputational risks associated with surveillance failures justify the business case for improving compliance training, processes and technology.  After all, just as you wouldn’t expect the police to prevent all crime by themselves, it’s clear that investment is needed in surveillance technology to give the regulators a helping hand.

Wednesday, September 14, 2011

Is Revolution the Path to Transparency?

Posted by Dan Hubscher

Revolutions are proliferating. When you watch a revolution happening elsewhere, political or otherwise, it’s a good time to contemplate the revolution in your own history, or in your future. There are few among us who can’t point to one or the other. One of the common drivers is the fear that something is happening where we can’t see it happen, and we want transparency – of process, of government – of whatever seems to be wrong.

The capital markets globally are experiencing a similar revolution now with regulatory change, and the current climate threatens to create a revolt as well.  Market participants may push back on reforms to the point of creating a new state of stress.  Either way, the future presents very real threats to companies that aren’t prepared.  We’re observing a vast expansion of global rulemaking, and a coming deluge of data - especially in the derivatives markets. It’s very expensive and distracting to fix problems after the fact, so we need to act now.  “Hope is not a strategy” – as is often said to have been uttered by famed (American) football coach Vince Lombardi.

In an open letter to Barack Obama published on January 23, 2009, Benjamin Ola Akande advised, "Yet, the fact remains that hope will not reduce housing foreclosures. Hope does not stop a recession. Hope cannot create jobs. Hope will not prevent catastrophic failures of banks. Hope is not a strategy."

Now we have the Dodd-Frank Act in the U.S., and MiFID II and EMIR in Europe, all preceded by the de Larosiere Report (EC, 2009), Turner Report (FSA, 2009), Volcker Report (G30, 2009), G20 Declarations of February 2009, Financial Stability Forum Report (FSF, 2009), IMF Report (IMF, 2009), Walker Review (UK, 2009), Basel / IOSCO Reviews… the list goes on. And the rest of the world is watching, waiting, for another revolution. The intended scope of the most recent reforms seems to be almost a panacea, and transparency is the first step.

The next Revolution is happening in Boston, fittingly.  Progress Revolution 2011, from September 19th through the 22nd, offers the chance to learn from industry innovators on how they have successfully tackled these challenges within the capital markets.  Customers including PLUS Markets and Morgan Stanley will be there to share success stories.  And Kevin McPartland, Principal at the TABB Group, will be there too.  I’ve included a sneak peek into Kevin’s “Path to Transparency” below.

According to the New York Times, at the Republican Convention in 2008, Rudy Giuliani said, while contemplating Barack Obama’s candidacy, “… ‘change’ is not a destination ... just as ‘hope’ is not a strategy.” Rudy will be speaking at our Revolution too. Will you be there? It will be a lively conference – I hope that you can join us!

-Dan

The Path to Transparency

By Kevin McPartland, Principal, TABB Group

Managing the vast quantities of data born into existence by the Dodd Frank Act and related regulation will present a challenge in the post-DFA environment; but collecting and producing the required data is just the tip of the iceberg. The ability to analyze and act on that data is what will separate the survivors from the winners. This is already true in many other parts of the global financial markets, but the complexities inherent in swaps trading coupled with the speed at which these changes will take place creates unique challenges. Spread this across all five major asset classes and three major geographies, and the complexities become more pronounced.

Margin calculations are proving to be one of the biggest concerns for those revamping their OTC derivatives infrastructure. In a non-cleared world, dealers determine collateral requirements for each client and collect variation margin on a periodic schedule - in some cases once a month, and in other cases once a year. When those swaps are moved to a cleared environment, margin calculations will need to occur at least daily. The result is an upgrade from the current batch process with dozens of inputs to a near-real-time process with hundreds of inputs. Whereas before, major dealers could perform margin analysis, client reporting and risk management in a single system, those systems now need to operate independently within an infrastructure that provides the necessary capacity and speed.
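
To make that shift concrete, a stylized variation-margin run looks something like the sketch below; the positions and marks are invented, and real cleared-margin methodology (initial margin, netting sets, haircuts) is considerably richer.

```python
# A stylized variation-margin sketch: in a cleared world this runs at least daily
# per account against fresh marks, rather than monthly as a batch. The positions
# and valuations are invented for the example.
positions = [
    # (trade_id, notional, value_per_unit_yesterday, value_per_unit_today)
    ("IRS-001", 100_000_000,  0.0120,  0.0135),
    ("IRS-002",  50_000_000, -0.0040, -0.0025),
]

def variation_margin(trades):
    """Sum of day-on-day mark-to-market changes; positive means margin is owed to us."""
    return sum(notional * (today - yesterday)
               for _, notional, yesterday, today in trades)

print(f"Variation margin call: {variation_margin(positions):,.0f}")   # 225,000
```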

The trading desk will require a similar seismic shift, as flow businesses will provide liquidity across multiple trading venues to an expanding client base. Most major dealers are at some stage of developing liquidity aggregation technology intended to provide a single view of liquidity across multiple swap execution venues. Creating this type of virtual order book requires receiving multiple real-time data feeds and aggregating the bids and offers in real time.
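
A minimal sketch of that aggregation step is below; the venue names and book contents are invented, and a production feed handler would also have to deal with updates, staleness and venue-specific conventions.

```python
# A minimal sketch of a virtual order book: merge resting bids and offers from
# several execution venues into a single view of liquidity. Venue names and
# book contents are invented for the example.
books = {
    "SEF_A": {"bids": [(99.50, 10), (99.40, 25)], "asks": [(99.70, 15)]},
    "SEF_B": {"bids": [(99.60, 5)],               "asks": [(99.65, 8), (99.80, 30)]},
}

def aggregate(books):
    bids = [(price, size, venue) for venue, book in books.items() for price, size in book["bids"]]
    asks = [(price, size, venue) for venue, book in books.items() for price, size in book["asks"]]
    bids.sort(key=lambda level: -level[0])   # best (highest) bid first
    asks.sort(key=lambda level: level[0])    # best (lowest) ask first
    return bids, asks

bids, asks = aggregate(books)
print("best bid:", bids[0], "best ask:", asks[0])
```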

Furthermore, rather than comparing model-derived prices to the last trade price to produce quotes, inputs from SEFs, CCPs, SDRs, internal models, third-party models and market data providers will be required inputs to real-time trading algorithms once reserved for exchange-traded derivatives.

Providing clients with execution services presents other challenges. Executing on multiple platforms also means tracking and applying commission rates per client per venue in real time. Trade allocations also complicate the execution process.  In the bilateral world a big asset manager can do a $100 million interest rate swap and spread that exposure across multiple funds as it sees fit. Under the DFA, the executing broker must know which funds are getting how much exposure. Account allocation in and of itself is not new, but cost averaging multiple swap trades and allocating the right exposure at the right price to the proper account presents complex challenges, especially in a near-real time environment.
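
A stylized version of that allocation step: average the price across the fills of a block swap, then hand each fund its share at the average; the fills and fund weights are invented for the example.

```python
# A stylized sketch of post-trade allocation: cost-average several fills of a block
# swap and allocate notional to the underlying funds pro rata at the average price.
# The fills and fund weights are invented for the example.
fills = [(40_000_000, 1.021), (35_000_000, 1.019), (25_000_000, 1.020)]   # (notional, price)
fund_weights = {"FUND_A": 0.5, "FUND_B": 0.3, "FUND_C": 0.2}              # shares of the block

def allocate(fills, fund_weights):
    total_notional = sum(notional for notional, _ in fills)
    avg_price = sum(notional * price for notional, price in fills) / total_notional
    return {fund: (weight * total_notional, avg_price)
            for fund, weight in fund_weights.items()}

for fund, (notional, price) in allocate(fills, fund_weights).items():
    print(f"{fund}: {notional:,.0f} @ {price:.5f}")
```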

Risk management, compliance and back-testing data will also require huge increases in processing power, often at lower latencies. Risk models and stress tests, for example, are much more robust than they were before the financial crisis, requiring a considerably higher amount of historical data.

Compliance departments now must store the requisite seven years of data so they can reconstruct any trade at any moment in the past. This is complicated enough in listed markets, where every market data tick must be stored, but for fixed-income securities and other swaps, storing the needed curves means that billions of records must not only be filed away but also be retrievable on demand. Similar concerns exist for quants back-testing their latest trading strategies: it is not only the new data being generated that must be dealt with. Existing data, too, is about to see a huge uptick in requirements.

In the end, these changes should achieve some of the goals set forth by Congress when it enacted Dodd-Frank – increased transparency and reduced systemic risk. The road there will be bumpy and expensive, but the opportunities created by both the journey and the destination will outweigh any short-term pain.

This perspective was taken from the recent TABB Group study Technology and Financial Reform: Data, Derivatives and Decision Making.