Compliance (MiFID, RegNMS)

Monday, December 13, 2010

Calming 'Regulation Anxiety'

Posted by Dan Hubscher

There is a new kind of emotional disorder going around the financial markets - the previously unnamed fear of something ominous now that new financial rules have been laid down. Let's call it regulation anxiety.

Regulation anxiety has led to all sorts of new behavior in banks, such as laying off proprietary trading staff, hiring ex-SEC lawyers, and laying on extra lobbyists to besiege Capitol Hill. The syndrome is so widespread that it has finally reached the foreign exchange market - the market that performed best during the financial crisis despite having almost no regulation. And although the FX market 'ain't broke', it will undoubtedly get 'fixed' under new rules. It is these fixes that are causing panic attacks in the FX industry.

A survey of FX professionals at the Bloomberg FX10 conference in October showed marked anxiety over the impact of regulation and possible changes to market structure. More than 80 percent of those polled said they were concerned about the impact of recent regulations on their businesses. They were also against structural reform and at odds as to which industry model is best for the future. According to Bloomberg, the majority of respondents were opposed to an exchange-traded model or a clearing house model, with only 19 percent believing the FX markets should have both clearing houses and exchange-trading requirements.

FX is a unique asset class in many respects: it is (to date) almost totally free from regulation and benefits from high liquidity on a global scale. Traders - wholesale, institutional and retail - are attracted by the ease and convenience of online currency buying and trading. The statistics bear this out, with an average turnover of around $1.5 trillion per day – a clear indication of the strength of the market.

FX liquidity and volatility are growing day by day, and trading foreign exchange in fast-moving, highly volatile times carries a high level of risk. As such it may not be suitable for all types of investors, institutions and buy-side firms. Sell-side organizations serving the fast-growing needs of hedge funds, proprietary traders, and other firms that take on these risks therefore assume additional risk of their own, and they need to manage it intelligently without eroding their competitive advantages.

At the same time increased automated order volumes from the buy-side represent revenue opportunities for sell-side firms. But attracting that order flow away from competitors requires unique services, aggressive pricing and the ability to find the best prices in a highly fragmented market - not to mention the speed and scale needed to keep up in a high-risk environment.

There are solutions available that enable sell-side institutions worldwide to rebuild their FX eCommerce platforms in line with the requirements of the most challenging customers and prospects, with a view to automating and customizing their trading operations to become more competitive. There are now technologies that combine FX trading venue connectivity with a bird’s eye view of the market in real time; they aggregate fragmented liquidity and include smart order routing algorithms, enabling every element of an eCommerce platform to automatically find and leverage the best prices.
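To make that concrete, here is a minimal sketch in Python of the aggregation and routing idea. The venue names, prices and simplified top-of-book quote model are all invented for illustration - this is not the Apama API, just the underlying principle of keeping the freshest quote per venue and routing to the best price:

```python
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str      # liquidity source, e.g. an ECN or a bank feed
    bid: float      # best bid on that venue
    ask: float      # best offer on that venue
    size: int       # size available at the top of book

class Aggregator:
    """Consolidates top-of-book quotes from several FX venues."""

    def __init__(self):
        self.books = {}  # venue -> latest Quote

    def on_quote(self, q: Quote):
        self.books[q.venue] = q  # keep only the freshest quote per venue

    def best_ask(self) -> Quote:
        """Venue showing the cheapest offer - where a buy order should route."""
        return min(self.books.values(), key=lambda q: q.ask)

# Hypothetical EUR/USD quotes from two made-up venues:
agg = Aggregator()
agg.on_quote(Quote("VenueA", bid=1.30010, ask=1.30030, size=5_000_000))
agg.on_quote(Quote("VenueB", bid=1.30015, ask=1.30025, size=2_000_000))
best = agg.best_ask()
print(f"route buy to {best.venue} at {best.ask}")  # VenueB at 1.30025
```

A real aggregator would track full depth of book, handle quote expiry and cope with venue-specific conventions, but the consolidated-view-plus-routing principle is the same.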

And a very few include a rich development framework for both business users and IT. This flexibility allows traders to create and rapidly deploy proprietary FX and cross-asset trading strategies that help them competitively engage with clients.

There have been numerous recent examples of banks taking advantage of these solutions. For example, Royal Bank of Canada (RBC) recently deployed a new FX aggregation solution to support its foreign exchange dealing operations. The Progress Apama FX Aggregation Solution Accelerator is completely customizable and has been modified for RBC to meet its specific requirements. RBC's new system has significantly increased the efficiency with which its traders obtain the best FX prices for their clients.

RBC is the latest in a growing list of global and regional banks that have deployed this type of platform as a foundation for eCommerce. Other organizations that have recently deployed FX solutions driven by technologies from Progress Software (namely its Apama product) include BBVA, UniCredit and ANZ, which can now access multiple sources of liquidity and dramatically improve their ability to handle increased trade volume.

The best way to deal with anxiety is to address the root cause - in this case, regulation. Regulation is coming; change is coming. Since the FX world is now facing looming regulations with dramatic impact, you’re going to need to adapt your business models and supporting applications quickly in order to survive – for instance by building flexible rules within your FX trading systems to identify and manage risks, whatever they may turn out to be. If you do, you’ll be ahead of the pack and able to create competitive advantage.

-Dan

Thursday, November 04, 2010

A postcard to Jeremy Grant

Posted by Giles Nelson

Jeremy Grant, editor of FT Trading Room at the Financial Times, recently asked for explanations "on a postcard" about why speed is a force for good in financial markets, or put another way, to explain what the benefits are of high frequency trading. I've just come back from Mexico where I was addressing the Association of Mexican Brokers and during my visit I thought I'd write that postcard. So here it is:

 

Dear Jeremy

I saw your request for postcards recently, and as I'm travelling I thought I'd drop you one. There's not a lot I like doing more than explaining the benefits of so-called "high frequency trading".

I would suggest that you think of high frequency trading, or HFT, as just the latest stage in the evolution of electronic trading. And this, as you know, has evolved very rapidly over the last decade because of cheaper and faster computers and networks. It has led to many innovations and benefits: electronic crossing networks, algorithmic trading, online retail trading, smaller order sizes, an overall increase in trading volume, more price transparency, greater trader productivity, more accessible liquidity, tighter spreads between buy and sell prices, lower broker commissions, and competition between exchanges and thus smaller exchange fees - none of these things would have happened without electronic trading. MiFID couldn't have happened; it simply wouldn't have been financially viable for the many alternative European equity-trading venues to launch without cheap access to networks and computers. Without these we would still have greedy, monopolistic exchanges with high transaction prices.

HFT is just the latest step in a technology driven evolution. You can't just look at it in isolation.

"Ah", you exclaim, "but high frequency trading is a step too far. Trades happening far faster than the blink of an eye. Surely that can't be right?"

So what if trades happen quickly? Things "going too fast" is a common concern. In 19th-century Britain, people were worried about trains going faster than 30mph. They thought that passengers would suffocate, or that as the train reached a corner it would simply come off the rails! And to those who say trading happens too quickly: at what speed should it occur? If not micro- or milliseconds, should it be a second, a minute, an hour? Who's going to decide? Any choice is entirely arbitrary anyway; time is infinitely divisible.

There are plenty of things that happen too fast for humans to comprehend - human nerve impulses travel at more than 100m per second, yet we function successfully. Why? Because we have the monitoring systems in place that ensure the information from the nerves is processed correctly. Put a finger on a hot coal and it will be retracted immediately - quicker than we can consciously think. And if a 200mph train goes through a red light then warning bells will ring and the train will be automatically stopped.

And so to the main point. Trading speed, per se, is not the problem. But, yes, problems there are. Markets, particularly in Europe and the US, are now very complex: fast moving, multi-exchange, with different but closely interlinked asset classes. It is this complexity we find difficult to understand, and speed is only one facet of it. We imagine that an armageddon incident could occur because we know that the markets are not being monitored properly. Regulators freely admit this - Mary Schapiro recently said that the SEC was up to two decades behind in its use of technology to monitor markets. And because we know that the people in charge don't know what's going on, we get scared.

It doesn't have to be like this. The same technological advances that led to the evolution of HFT can be used to ensure that the markets work safely, by ensuring that limits are not exceeded, that an algorithm "going crazy" can't bring down an exchange, that a drunken trader can't move the oil price and that traders are dissuaded from intentionally trying to abuse the markets.

Doing things faster is a human instinct. Faster, higher, stronger. The jet engine, the TGV, the motorway. Would we really go back to a world without these?

Monday, October 04, 2010

No evil algo-trader behind the flash crash

Posted by Giles Nelson

The long-anticipated joint SEC and CFTC report on the 6 May 2010 flash-crash came out last Friday.

After reading much of the report and commentary around it, I'm feeling rather underwhelmed.

The root cause of the flash-crash, the most talked about event in the markets this year, was a boring old "percentage-by-volume" execution algorithm used by a mutual fund to sell stock market index futures. How banal.

The algorithm itself was simple. It took into account volume only, not price, and it didn't time orders into the market. Many commentators have pejoratively described this algorithm as "dumb". It may be simple, but it's one of the most common ways that orders are worked: buy or sell a certain amount of an instrument as quickly as possible, but take only a certain percentage of the available volume so the market isn't impacted too much. The problem was the scale. Worth $4.1Bn, it was the third-largest intra-day order in the E-mini future in the previous 12 months. The two previous big orders were worked taking into account price and time, and were executed over five hours. The flash-crash order took only 20 minutes to execute 75,000 lots.
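For readers who haven't met one, here is a stripped-down percentage-of-volume sketch in Python. It is a generic illustration, not the mutual fund's actual algorithm; the interval volumes are invented, and the 9% participation rate is the figure reported for the May 6 order:

```python
def pov_sell(total_qty: int, participation: float, market_volumes) -> int:
    """Work a sell order as a fixed fraction of observed market volume.

    total_qty      -- lots still to sell
    participation  -- e.g. 0.09 to target 9% of volume; note that price
                      and time are ignored, exactly as described above
    market_volumes -- iterable of traded volume per interval
    """
    remaining = total_qty
    for volume in market_volumes:
        if remaining <= 0:
            break
        child = min(remaining, int(volume * participation))
        remaining -= child
        print(f"sell {child} lots ({remaining} remaining)")
    return remaining

# Invented per-interval volumes. The faster the market trades, the
# faster the algo sells - the feedback loop that mattered on May 6.
pov_sell(75_000, 0.09, [120_000, 300_000, 500_000])
```

The comment at the end is the crux: because the algorithm keys off volume alone, a burst of frantic trading licenses it to sell even faster into a falling market.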

It wasn't this order on its own, of course. Fear in the markets created by the risk of Greece defaulting was already causing volatility. Stub quotes (extreme-value quotes put in by market makers to fulfill their market-making responsibilities) appear to have contributed. There was the inter-linking between the futures and equity markets. There was the very rapid throwing around of orders - described as the "hot potato" effect - certainly exacerbated by the many high-frequency traders in the market. There was the lack of coordinated circuit breakers across the many US equity markets. And there was the lack of any real-time monitoring of markets to help regulators identify issues quickly.

High-frequency and algorithmic trading have been vilified in many quarters over recent months. I think many were expecting the flash-crash cause to be a malignant algo, designed by geeks at a predatory and irresponsible hedge fund out to speculate and profit at the expense of "mom and pop" pension funds. It was nothing of the kind.

The flash crash has raised important issues about the structure of multi-exchange markets, the role of market makers, the lack of real-time surveillance and how a simple execution strategy could precipitate such events. I do hope that the findings in the flash-crash report will ensure a more balanced view on the role of high-frequency and algo trading in the future.

Tuesday, August 31, 2010

Taming the Wild Algos

Posted by John Bates

"And now," cried Max, "let the wild rumpus start!"

— Maurice Sendak: Where the Wild Things Are

 

It’s not just equities and futures markets where strange stuff happens! An “algo gone wild” was spotted in the oil market (it actually happened earlier this year) and intrepid Reuters journalists got to the bottom of it.

 

High frequency trading firm Infinium Capital Management is at the center of a six-month probe by CME Group (and reportedly the CFTC) into why its brand-new trading program malfunctioned and racked up a million-dollar loss in about a second, just before markets closed on Feb. 3. The algorithm went live four minutes before the end of trading and fired 2,000-3,000 orders per second before being shut off. The oil price surged $1, then slid $5 over the course of the next two days. Read the full story here:

http://www.reuters.com/article/idUSTRE67O2QQ20100825

 

I know Infinium's CEO, Chuck Whitman, from the CFTC Technology Advisory Committee – he’s a good guy and very knowledgeable. I believe him when he says his wild algos had no malicious intent – the algos were just broken and shouldn’t have been put live.

 

With algorithms and HFT comes the possibility of mistakes. Many more firms outside the equities world are embracing HFT, and their inexperience can cause market disruptions such as the Feb 3rd CME issue. A flash crash in oil or other commodities - or even foreign exchange - is not to be scoffed at. In fact, many commodities markets are much less liquid and homogeneous than equities, and can be even more vulnerable to mistakes or manipulation. In the case of Infinium, the algo caused trading volume to spike nearly eight-fold in less than a minute. It was a classic case of an algo running wild until it faltered and 'choked'. This is not how HFT strategies are supposed to work.

 

There are a number of best practices that can be used to mitigate the risk of algos going wild:

 

The first best practice is diligent backtesting – using historic data and realistic simulation to ensure many possible scenarios have been accounted for. What does the algo do in a bull market, in a bear market, at the open, at the close, when unexpected spikes occur, during a flash crash, or when non-farm payrolls or other economic news is released? Of course there’s always the possibility of a “black swan” scenario – but then there’s always the possibility of an earthquake in London, and I bet the buildings aren’t built to withstand one. It’s a matter of covering likely possibilities as best you can. A backtesting process needs to be streamlined, of course, as short time to market for new algos is key.
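A minimal sketch of that idea in Python - the scenario price paths and the toy strategy are invented, and a real backtest would replay full tick and order-book history rather than a handful of prices:

```python
# Drive the same algo through several canned market scenarios and check
# that it never exceeds a position bound. All data here is invented.
scenarios = {
    "bull_market":  [100, 101, 102, 104, 107],
    "bear_market":  [100, 98, 95, 91, 88],
    "flash_crash":  [100, 99, 60, 85, 98],    # sudden plunge and recovery
    "news_release": [100, 100, 106, 103, 104],
}

def toy_momentum_algo(prices):
    """Toy strategy: long one unit after an up-tick, flat otherwise."""
    positions = []
    for prev, curr in zip(prices, prices[1:]):
        positions.append(1 if curr > prev else 0)
    return positions

MAX_POSITION = 1
for name, prices in scenarios.items():
    worst = max(toy_momentum_algo(prices))
    status = "ok" if worst <= MAX_POSITION else "LIMIT BREACH"
    print(f"{name:13s} max position {worst} -> {status}")
```

The point is the harness, not the strategy: every new algo runs through the same battery of scenarios, so adding one more ("what about a flash crash?") is cheap, and time to market stays short.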

 

A second best practice is building a real-time risk firewall into your algo environment. Just as a network firewall stops anomalous network packets from reaching your computer, so the risk firewall should stop anomalous trades from getting to trading venues. These anomalous trades might be human- or computer-generated – such as “fat finger” errors, breaches of risk exposures (for a trader, a desk or an institution), or even algos gone wild (e.g. entering spurious loops and spitting out anomalous orders). Real-time risk monitoring is a second level of protection for those problems you don’t catch in backtesting.
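Here is a minimal sketch in Python of the kinds of pre-trade checks such a firewall might apply - the thresholds and order fields are invented, and a production firewall would sit inside a low-latency CEP engine rather than in application code:

```python
class RiskFirewall:
    """Pre-trade checks sitting between trading strategies and venues."""

    def __init__(self, max_order_size, max_exposure, price_band_pct):
        self.max_order_size = max_order_size  # fat-finger size cap
        self.max_exposure = max_exposure      # desk-level exposure limit
        self.price_band_pct = price_band_pct  # allowed deviation from last trade
        self.exposure = 0.0                   # running signed notional

    def check(self, side, qty, price, last_trade_price):
        """Return (accepted, reason); only accepted orders reach a venue."""
        if qty > self.max_order_size:
            return False, "size cap breached (possible fat finger)"
        if abs(price - last_trade_price) > self.price_band_pct * last_trade_price:
            return False, "price outside collar (anomalous order)"
        notional = qty * price * (1 if side == "BUY" else -1)
        if abs(self.exposure + notional) > self.max_exposure:
            return False, "exposure limit would be breached"
        self.exposure += notional
        return True, "accepted"

fw = RiskFirewall(max_order_size=10_000, max_exposure=5_000_000, price_band_pct=0.02)
print(fw.check("BUY", 500, 101.0, last_trade_price=100.5))     # accepted
print(fw.check("BUY", 50_000, 101.0, last_trade_price=100.5))  # size cap blocks it
```

Note that an algo stuck in a spurious loop would trip the exposure check after a burst of individually sane-looking orders - exactly the second-level protection described above.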

 

A third best practice is to use real-time market surveillance in your algo environment. Even if trades do not breach risk parameters, they may breach compliance rules or regulations, or may be perceived by a regulator as market manipulation (by accident if not by design). Detecting these patterns as they happen enables good internal policing by trading firms, rather than investigation or prosecution by regulators.
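As one illustrative pattern, here is a Python sketch that flags an account cancelling an unusually large number of orders in a short window - a crude proxy for layering-style behaviour. The thresholds are invented, and real surveillance would also examine price levels, sides and subsequent executions:

```python
from collections import defaultdict, deque

class CancelBurstMonitor:
    """Alerts when an account cancels too many orders within a time window."""

    def __init__(self, window_s=10.0, cancel_threshold=50):
        self.window_s = window_s
        self.cancel_threshold = cancel_threshold
        self.cancels = defaultdict(deque)  # account -> recent cancel timestamps

    def on_cancel(self, account, ts):
        q = self.cancels[account]
        q.append(ts)
        while q and ts - q[0] > self.window_s:  # drop events outside the window
            q.popleft()
        if len(q) >= self.cancel_threshold:
            return f"ALERT: {account} cancelled {len(q)} orders in {self.window_s}s"
        return None

# Demo with deliberately tiny thresholds: the fifth cancel inside the
# window raises an alert as it happens, not in a post-trade report.
mon = CancelBurstMonitor(window_s=10.0, cancel_threshold=5)
alert = None
for i in range(5):
    alert = mon.on_cancel("trader42", ts=100.0 + i) or alert
print(alert)
```

Because the check runs as each event arrives, the firm's compliance desk sees the pattern while it is forming - the 'internal policing' point above.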

 

An algorithm is a tool in a trader's toolkit, and it needs to be taken care of as such. If it is well-oiled and the trader, quant or risk manager monitors its progress, the algo will do its job quickly and cleanly. If they don’t properly prepare the algo, or ignore it and let it get rusty, so to speak, it could lose its edge and run amok. Algorithms must be monitored constantly for performance and for errors, and sometimes tweaked on the fly to ensure best results. A good algorithmic trading platform will enable trading firms to do just that.

 

Trading firms are not the only ones who need to be on guard for possible algos gone wild. In the case of Infinium, the regulators and the exchange were also slow on the uptake. This shows that everyone needs to be proactive in using the correct tools to monitor algorithmic trading. Sensing and responding to market patterns before the aberrations or errors have a chance to move prices is the right thing to do - in all asset classes. Be like Max and tame the wild things!

Wednesday, August 04, 2010

Algorithmic Terrorism

Posted by John Bates

At the CFTC's first Technology Advisory Committee meeting on July 14, concern was expressed about the practice of quote-stuffing, and some evidence was presented that the May 6th flash crash may have been caused or exacerbated by it. While other market experts I’ve spoken to now dispute that this was the cause, quote-stuffing is a topic worthy of discussion.

 

At the CFTC meeting, where I was an invited participant, data was presented by trade database development firm Nanex suggesting that quote stuffing contributed to the destabilization on May 6th. The data suggests huge numbers of quotes were fired into the market on particular symbols (as many as 5,000 per second) and that many of these were outside the national best bid/offer (NBBO). So what’s the point of this? Well, with latency as a key weapon, one possibility is that the generating traders can ignore these quotes while the rest of the market has to process and respond to them – giving an advantage to the initiator. More cynically, one could view these quotes as misleading or even destabilizing the market. In fact, Nanex states in its paper: "What we discovered was a manipulative device with destabilizing effect". Quote stuffing may be innocent or an honest mistake, but Nanex's graphs tell a very interesting tale (http://www.nanex.net/FlashCrash/CCircleDay.html). There are patterns detected - on a regular basis - that one could conclude are quote stuffing for the purpose of market manipulation. There's a very good article by Alexis Madrigal that discusses the research and issues in more detail (http://www.theatlantic.com/science/archive/2010/08/market-data-firm-spots-the-tracks-of-bizarre-robot-traders/60829/).
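A sketch of how such bursts might be flagged as they happen (Python; the 5,000-quotes-per-second threshold echoes the Nanex figures above, everything else is invented for illustration):

```python
from collections import deque

class QuoteStuffingDetector:
    """Flags a symbol whose quote rate spikes with most quotes off the NBBO."""

    def __init__(self, rate_threshold=5000, off_nbbo_ratio=0.8):
        self.rate_threshold = rate_threshold  # quotes per second
        self.off_nbbo_ratio = off_nbbo_ratio  # fraction outside best bid/offer
        self.window = deque()                 # (timestamp, outside_nbbo) pairs

    def on_quote(self, ts, bid, ask, nbb, nbo):
        outside = bid < nbb or ask > nbo      # priced worse than the NBBO
        self.window.append((ts, outside))
        while self.window and ts - self.window[0][0] > 1.0:
            self.window.popleft()             # keep a one-second window
        n = len(self.window)
        n_out = sum(1 for _, o in self.window if o)
        if n > self.rate_threshold and n_out / n > self.off_nbbo_ratio:
            return f"ALERT: {n} quotes in 1s, {n_out} outside NBBO"
        return None

# Demo with tiny thresholds: five off-NBBO quotes in half a second.
det = QuoteStuffingDetector(rate_threshold=3, off_nbbo_ratio=0.5)
alert = None
for i in range(5):
    alert = det.on_quote(ts=0.1 * i, bid=9.00, ask=11.00, nbb=10.00, nbo=10.05) or alert
print(alert)
```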

 

At the extreme, quote-stuffing could operate like a “denial of service” attack – firing so many orders that the market can’t cope – crippling the trading of certain symbols, certain exchanges or the whole market. An influx of orders in sudden bursts to one exchange on one stock can slow down that system as it tries to process them. Nanex notes that there are 4,000 stocks listed on the NYSE and nine reporting exchanges in the U.S.; if each reporting exchange quoted each stock at 5,000 quotes per second, that would be 180 million quotes per second (4,000 × 9 × 5,000). A daunting load no matter how advanced the processing technology.

 

Without trying to overstate the issue, in the most extreme circumstances these practices could be considered algorithmic terrorism. One can imagine how, at the extreme, it is potentially catastrophic. The concern is that a well-funded terrorist organization might use such tactics in the future to manipulate or cripple the market. So much of our economy is underpinned by electronic trading that protecting the market is more important than guarding Fort Knox! Regulators such as the CFTC and SEC are taking this seriously - and need to respond.

Tuesday, July 27, 2010

Smart - but is it Smart Enough?

Posted by John Bates

Nasdaq liked the idea of real-time market surveillance so much that it bought one of the companies that specialize in it.

 

Nasdaq OMX announced this week that it will buy Smarts Group, an Australia-based market surveillance business that helps exchanges monitor compliance with trading rules.  You can read the full story here: http://online.wsj.com/article/BT-CO-20100727-712964.html.

 

The market moves a lot faster than it used to thanks to algorithmic trading. What has not kept pace is the monitoring of high-speed trading. Smarts is one commercial approach that aims to enable such monitoring. However, there is a big problem with Smarts: the time it takes to develop a new surveillance scenario. I have spoken to a number of venues around the world, including the Australian Stock Exchange, which have told me they are totally dependent on Smarts to add new rules when they need one – and it takes six months to a year, if they’re lucky. In fast-moving markets we need to evolve in hours, not years!

 

Despite these shortcomings, the Nasdaq acquisition is an indicator of the importance of real-time surveillance in a post-flash-crash world. Maybe the flash crash of May 6th has a silver lining, if the lessons learned lead exchanges to make better use of surveillance and monitoring. In the aftermath of the crash, exchanges scrambled to recover trading data and do some forensic investigation into the causes. This proved extremely difficult, probably because of inadequate analysis capabilities to pinpoint what had happened.

 

Exchanges, ECNs, brokers, traders and regulators all must take an intelligent approach to monitoring and surveillance in order to prevent rogue trades and fat fingers. Transparency is the key. Regulators in the US and Europe are concerned about the lack of transparency in markets where high frequency algorithmic trading takes place, as well as in dark pools.

 

We ran a survey at SIFMA this year where we asked 125 attendees about high frequency trading and market surveillance. A staggering 83 percent said that increased transparency is needed to effectively deal with market abuse and irregular market activity, such as the flash crash. However, only 53 percent of firms surveyed currently have real-time monitoring systems in place.

 

Nasdaq says that Smarts will be used to expand broker surveillance solutions, which I take to mean monitoring scenarios such as sponsored access. This would be a smart move (forgive the pun). With naked access, high frequency traders can plug straight into an exchange through their broker – and it’s critical that pre-trade risk checks and surveillance are in place to prevent a crisis in which wild algos could cause havoc.

 

The detection of abusive patterns or fat-fingered mistakes must happen in real time, ideally before they have a chance to move the market. This approach should be taken on board not just by the regulators, but by the industry as a whole. Only then can the industry stay one step ahead of the market abuse and trading errors that cause a meltdown (or melt-up).

 

As many market participants have pointed out, technology can't solve all of the problems, but it can help to give much more market transparency. To restore confidence in capital markets, organizations involved in trading need to have a much more accurate, real-time view on what's going on. In this way, issues can be prevented or at least identified much more quickly.

 

While I applaud Nasdaq's initiative and dedication to improving market surveillance by buying Smarts, I must point out that you don't have to go quite that far to get the same results. Progress provides market-leading real-time monitoring, surveillance and pre-trade risk – powered by Complex Event Processing – enabling complex real-time monitoring of the fastest-moving markets. Unlike Smarts, Progress includes the ability for business users to customize and create new scenarios rapidly (in hours rather than Smarts’ months). And you don’t have to buy and integrate our company to get access to it!

Thursday, July 22, 2010

Beware the weight-challenged digits

Posted by John Bates

Fat fingers (or weight-challenged digits to my politically correct friends) have had a good run lately. First we heard that Deutsche Bank had to close its quantitative trading desk in Japan after an automated trading system misread equities market data. The system generated a massive sell order that caused the Nikkei 225 Stock Average to dive (full story here: http://tinyurl.com/23rnn5v). Then an unknown trader spiked the Swedish krona and a computer glitch at Rabobank smacked sterling by about 1%, according to the Wall Street Journal (http://tinyurl.com/2el9kgw).

Although the press was surprised that the efficient foreign exchange market was susceptible to trading errors, it is just as vulnerable as equities or futures. In FX, orders are often sent directly to a trading destination such as EBS, Reuters or Currenex, and in many institutions they are sent without adequate pre-trade checks or risk management applied.

As my colleague, Deputy CTO Dr. Giles Nelson, told the Wall Street Journal: “The consensus in the market is that this was a computer-based trading error, but ultimately there would have been a human involved somewhere.”

Human error is part of being human. The reality of highly automated trading is that the programs are built by humans and run by super-fast machines. And unless there are robust computerized checking mechanisms that vet trades before they hit the markets, errors can wreak havoc in the blink of an eye.

Deutsche Bank's algorithms generated around 180 automated sell orders worth up to 16 trillion yen ($183 billion) and about 50 billion yen's worth were executed before the problem was addressed. The Rabobank mistake could have dumped £3 billion worth of sterling into the market in one lump, rather than splitting it up to lower market impact - but luckily the bank spotted the error and stopped the trade before it was fully completed. The Swedish krona mistake sank the krona against the euro by 14% before it was spotted. 

Pre-trade risk checks would help to prevent errors, trade limit breaches, and even fraudulent trading. And pre-trade risk controls need not be disruptive. Ultra-low-latency pre-trade risk management can be achieved by trading institutions without compromising speed of access. One option is a low-latency "risk firewall" with complex event processing at its core, which can be benchmarked in microseconds.

With a real-time risk solution in place, a message can enter through an order management system, be run through the risk hurdles and checks, and leave for its destination a few microseconds later. The benefits of being able to proactively monitor trades before they hit an exchange, ECN or FX platform far outweigh any microscopic latency hops. They include catching fat-fingered errors, preventing trading limits from being breached, and even warning brokers and regulators of potential fraud - all of which cost brokers, traders and regulators money.
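To put a rough number on those "microscopic latency hops", here is a timing sketch in Python with an invented limit check - not a benchmark of any real product, and a compiled CEP engine would be faster still:

```python
import time

LIMITS = {"max_qty": 10_000, "max_notional": 1_000_000.0}

def pre_trade_check(qty: int, price: float) -> bool:
    """The kind of in-memory limit check a risk firewall applies per order."""
    return qty <= LIMITS["max_qty"] and qty * price <= LIMITS["max_notional"]

# Time a large batch of checks. Even in an interpreted language this
# comes out at well under a microsecond per order on commodity hardware.
N = 100_000
start = time.perf_counter()
for _ in range(N):
    pre_trade_check(500, 99.5)
elapsed = time.perf_counter() - start
print(f"{elapsed / N * 1e6:.3f} microseconds per check")
```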

Wednesday, July 21, 2010

Defending Against the Algo Pirates

Posted by John Bates

It was an honor to sit on the CFTC Technology Advisory Committee (TAC) last week. I was very impressed with the presentations and discussion, chaired ably by Commissioner Scott O’Malia, and with the other Commissioners and my fellow committee members. This week the CFTC has been discussing new rules to handle disaster recovery, and one topic discussed at the TAC has received further coverage – that of pirate algos attacking algos going about their normal trading business, aiming to manipulate the market.

 

Further coverage can be seen in this article, “CFTC posits new disaster recovery rules as regulators probe 'algo price pirates'”:

 

http://www.finextra.com/news/fullstory.aspx?newsitemid=21610

 

The CFTC has a sensible proposal on the table to require exchanges and clearing houses to have effective disaster recovery plans in order to quickly recover from any market-wide disruption. After 9/11 it became clear that many NYC-based financial services firms were not prepared for a disaster of that magnitude, and subsequently took disaster recovery (or business continuity as it came to be known) very seriously. Now it is time for those virtual businesses - exchanges and ECNs - to do the same.

 

Operational risk is a very real issue in today's fast-moving markets, where anything can go wrong. Being able to recover and quickly start trading again - across all exchanges and destinations - is paramount. The May 6th 'flash crash' gave us a glimpse of what can happen when something goes wrong at one exchange and the rules across other exchanges are not harmonized.

 

The flash crash was a man-made event exacerbated by machines. Algorithms are programmed to do as they are told, and if one destination is not responding they will hunt down and ping, scrape and trade on whatever others they can find. Sometimes this can have unfortunate consequences for the market as a whole. This is why there must be consistency across trading venues in how they respond to crises.

 

At the CFTC's Technology Advisory Committee meeting last week, there were several interesting observations about high frequency trading and algos. We heard new analysis of the flash crash from trade database developer Nanex LLC. The Nanex report suggested that predatory practices such as "quote stuffing", where algos try to prevent other high-frequency traders from executing their strategies, may have contributed to the crash. Commissioner Chilton of the CFTC (whom I had the pleasure of sitting next to at the TAC last week), the TAC and the SEC are taking these claims very seriously. Commissioner Chilton expressed his concern that there are algorithms out there hunting down and interfering with other algorithms, calling them 'algo price pirates' and suggesting they may trigger a new enforcement regime. Now, I believe firms and their algos are always going to monitor the market with the goal of figuring out how your algos work and devising a strategy to capitalize – that’s just the natural order of capitalism. But that’s different from using algo terrorism to bully the market into behaving a particular way. That’s something we need to watch for and prevent, because it causes damage.

 

If such 'pirates' are to be policed and caught, the regulators will have to sail with them in shark-infested high frequency waters. Surveillance and monitoring are critical, as is the need for speed. The rate at which algorithms can pump quotes into a destination is daunting, so the policemen will also need to work at ultra-high velocity. I was a little concerned when Commissioner Chilton said at the TAC meeting: "Just because you can go fast it doesn't mean you should." I know where he’s coming from, but I would modify the statement: in HFT it is critical to go fast to be competitive – but you need the proper best practices, training and safety precautions. High frequency trading, if properly monitored, need not be scary or evil. It can contribute to liquidity and market efficiency, and provide alpha generation. To truly address the HFT issue, real-time market surveillance technology must be adopted to monitor and detect patterns that indicate potential market abuse, such as insider trading or market manipulation - or pirate algorithms trying to board your strategy ship and make off with the gold doubloons.

Friday, June 25, 2010

Banks & Bullets: Maybe They Dodged One - But They Still Need Some Silver Ones!

Posted by John Bates

By now you’ve probably seen that a deal was reached this morning by the House and Senate on regulation. Some would say it waters down provisions from the tougher Senate bill, limiting rather than prohibiting banks' ability to trade derivatives and invest in hedge funds. This article describes it as banks “dodging a bullet”: http://www.businessweek.com/news/2010-06-25/banks-dodged-a-bullet-as-u-s-congress-dilutes-trading-rules.html

 

We applaud the superhuman efforts put into the new financial regulation bill by the U.S. Senate and House of Representatives. However, as I’ve said many times, transparency and consistency are critical to successful regulation in the capital markets. One could be forgiven for fearing that the watering down of the regulations, including the Volcker Rule, may create more havoc rather than increase transparency and consistency.

 

One thing is for sure – handling the complexity of real-time risk and surveillance within institutions is about to become a much higher priority. Risk managers and C-level executives concerned about minimizing risk and maximizing capital will need to view trading positions and limits across the firm, including, if permitted, derivatives that are 'spun out'. Ideally the risks should be aggregated and analyzed in real time, giving the ability to detect and prevent “accidents”. Pre-trade risk management will be increasingly important as firms seek to maintain capital requirements at all times.

 

A top-down approach to risk, where managers can see in a single dashboard view the risks across all asset-class silos, has gone from a “nice to have” to high on the wishlist – though many still wonder if it is actually possible. Continual monitoring of trades in real time can help to prevent trading limits being exceeded, prevent mistakes and catch market abuse.

p.s. Many thanks for all the comments from market practitioners pointing out that technology can't solve all the problems for regulators, banks and trading venues. I completely agree! But we can go a lot further than we do right now. Technology is, of course, only one of the approaches: changes in regulation are another, as are increased transparency and improved reporting (e.g. from fax to real-time data!).

Thursday, June 03, 2010

Optimism in the world of financial services regulation

Posted by Giles Nelson

It seems that we’re finally making some progress on making the financial markets function more safely. 

After the “flash-crash” of 6 May, US equity market operators have agreed to bring in coordinated circuit-breakers to avoid a repeat of this extreme event. There is widespread agreement on this. Industry leaders from brokers and exchanges yesterday made supportive statements as part of submissions to the SEC.

Regulators are going public with their use of real-time monitoring technology. Alexander Justham, director of markets at the Financial Services Authority (FSA), the UK regulator, told the Financial Times that the use of complex event processing technology will give the FSA “a more proactive machine-on-machine approach” to market surveillance (the FSA is a Progress customer). Other regulators are at least admitting they have a lot of work to do. Mary Schapiro, the SEC chair, believes that the technology used for monitoring markets is “as much as two decades behind the technology currently used by those we regulate”. Scott O’Malia, a commissioner at the Commodity Futures Trading Commission, admitted that the CFTC continues to receive account data by fax, which then has to be entered manually.

The use of real-time pre-trade risk technology is likely to become much more widespread. “Naked” access, where customers of brokers submit orders directly to the market without any pre-trade checks, is likely to be banned. This is an important change: late last year Aite Group, an analyst firm, estimated that naked access accounted for 38% of the average daily volume in US stocks. The SEC is also proposing that regulation of sponsored access be shored up – currently it has evidence that brokers rely upon oral assurances that the customer itself has pre-trade risk technology deployed. The mandated use of pre-trade risk technology will level the playing field and prevent a race to the bottom. Personally, I’ve heard of several instances of buy-side customers insisting that brokers turn pre-trade risk controls off, as they perceive that such controls add latency and therefore adversely affect the success of their trading.

The idea of real-time market surveillance, particularly in complex, fragmented markets such as those in the US and Europe, is gaining credence. The SEC has proposed bringing in a “consolidated audit trail” which would enable all orders in US equity markets to be tracked in real time. As John Bates said in his previous blog post, it’s likely that the US taxpayer will not be happy paying the $4B that the publicly funded SEC estimates such a system would need to get up and running. Perhaps the US could look at the way the UK’s FSA is funded: the FSA reports to government but is paid for by the firms it regulates.

As I mentioned in my last blog post, our polling in April at TradeTech, a European equities trading event, suggests that market participants are ready for better market monitoring. 75% of respondents to our survey believed that creating more transparency with real-time market monitoring was preferable to the introduction of restrictive new rules.

CESR, the Committee of European Securities Regulators, is currently consulting on issues such as algorithmic trading and high frequency trading. It will be interesting to see the results of their deliberations in the coming months.

I’m so pleased the argument has moved on. This time last year saw a protracted period of vilifying “high frequency trading” and “algo trading”. Now there is recognition of the benefits, as well as the challenges, that high frequency trading has brought to equity markets, and regulators seem to understand that to prevent both disastrous errors and deliberate market manipulation it is better for them to get on board with new technology than to try to turn the clock back to mediaeval times.

New approaches are sorely needed. Yesterday saw the conclusion of another investigation into market manipulation, when the FSA handed out a $150,000 fine and a five-year ban to a commodity futures broker.