Fraud

Wednesday, August 04, 2010

Algorithmic Terrorism

Posted by John Bates

At the CFTC's first Technology Advisory Committee meeting on July 14, concern was expressed about the practice of quote stuffing. Some evidence was presented that the May 6th flash crash may have been caused, or at least exacerbated, by this activity. Other market experts I've spoken to now dispute that quote stuffing caused the flash crash, but it is a topic worthy of discussion.


At the CFTC meeting, where I was an invited participant, data was presented from trade database development firm Nanex suggesting that quote stuffing contributed to the destabilization on May 6th. The data shows huge numbers of quotes being fired into the market on particular symbols (as many as 5,000 per second), many of them priced outside the national best bid/offer (NBBO). So what's the point of this? With latency as a key weapon, one possibility is that the generating traders can ignore these quotes while the rest of the market has to process and respond to them – giving an advantage to the initiator. More cynically, one could view these quotes as misleading, or even as destabilizing, the market. Indeed, Nanex state in their paper: "What we discovered was a manipulative device with destabilizing effect". Quote stuffing may be innocent or an honest mistake, but Nanex's graphs tell a very interesting tale (http://www.nanex.net/FlashCrash/CCircleDay.html). There are patterns, detected on a regular basis, that one could conclude are quote stuffing for the purpose of market manipulation. There's a very good article by Alexis Madrigal that discusses the research and issues in more detail (http://www.theatlantic.com/science/archive/2010/08/market-data-firm-spots-the-tracks-of-bizarre-robot-traders/60829/).


At the extreme, quote stuffing could operate like a "denial of service" attack – firing so many orders that the market can't cope – crippling the trading of certain symbols, certain exchanges or the whole market. An influx of orders in sudden bursts to one exchange on one stock can slow down that system as it tries to process them. Nanex notes that there are 4,000 stocks listed on the NYSE and nine reporting exchanges in the U.S. If each reporting exchange quoted each of those stocks at 5,000 quotes per second, that would amount to 180 million quotes per second (4,000 × 9 × 5,000) – a daunting processing task no matter how advanced the technology.
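
To make the detection of such bursts concrete, here is a rough sketch in Python (illustrative only, not the monitoring platform itself) of how a surveillance system might flag quote-stuffing-like behaviour: count quotes per symbol over a one-second sliding window and alert when both the quote rate and the share of quotes priced away from the NBBO exceed thresholds. The thresholds and event fields are assumptions made for the example, not figures from Nanex's study.

```python
from collections import defaultdict, deque

# Illustrative thresholds - a real surveillance system would calibrate these
MAX_QUOTES_PER_SECOND = 1000     # burst threshold per symbol
MAX_OUTSIDE_NBBO_RATIO = 0.5     # fraction of quotes priced away from the NBBO

class QuoteStuffingMonitor:
    """Counts quotes per symbol in a one-second sliding window and raises alerts."""

    def __init__(self):
        self.windows = defaultdict(deque)   # symbol -> deque of (timestamp, outside_nbbo)

    def on_quote(self, symbol, timestamp, bid, ask, nbbo_bid, nbbo_ask):
        # Simplified: treat any quote not at or inside the NBBO as "outside"
        outside = bid < nbbo_bid or ask > nbbo_ask
        window = self.windows[symbol]
        window.append((timestamp, outside))

        # Drop quotes older than one second
        while window and timestamp - window[0][0] > 1.0:
            window.popleft()

        rate = len(window)
        outside_ratio = sum(1 for _, o in window if o) / rate
        if rate > MAX_QUOTES_PER_SECOND and outside_ratio > MAX_OUTSIDE_NBBO_RATIO:
            print(f"ALERT {symbol}: {rate} quotes/sec, "
                  f"{outside_ratio:.0%} priced outside the NBBO - possible quote stuffing")
```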


Without wishing to overstate the issue, in the most extreme circumstances these practices could be considered algorithmic terrorism – and the consequences could be catastrophic. The concern is that a well-funded terrorist organization might use such tactics in the future to manipulate or cripple the market. So much of our economy is underpinned by electronic trading that protecting the market is more important than guarding Fort Knox! Regulators such as the CFTC and SEC are taking this seriously – and need to respond.

Thursday, July 22, 2010

Beware the weight-challenged digits

Posted by John Bates

Fat fingers (or weight-challenged digits to my politically correct friends) have had a good run lately. First we heard that Deutsche Bank had to close its quantitative trading desk in Japan after an automated trading system misread equities market data. The system generated a massive sell order that caused the Nikkei 225 Stock Average to dive (full story here: http://tinyurl.com/23rnn5v). Then an unknown trader spiked the Swedish krona and a computer glitch at Rabobank smacked sterling by about 1%, according to the Wall Street Journal (http://tinyurl.com/2el9kgw).

Although the press was surprised that the efficient foreign exchange market was susceptible to trading errors, it is just as vulnerable as equities or futures. In FX, trades are often routed directly to a trading venue such as EBS, Reuters or Currenex, and in many institutions they go out without adequate pre-trade checks or risk management applied.

As my colleague, Deputy CTO Dr. Giles Nelson, told the Wall Street Journal: "The consensus in the market is that this was a computer-based trading error, but ultimately there would have been a human involved somewhere."

Human error is part of being human. The reality of highly automated trading is that the programs are built by humans and run by super-fast machines. And unless there are robust computerized checking mechanisms that vet trades before they hit the markets, errors can wreak havoc in the blink of an eye.

Deutsche Bank's algorithms generated around 180 automated sell orders worth up to 16 trillion yen ($183 billion) and about 50 billion yen's worth were executed before the problem was addressed. The Rabobank mistake could have dumped £3 billion worth of sterling into the market in one lump, rather than splitting it up to lower market impact - but luckily the bank spotted the error and stopped the trade before it was fully completed. The Swedish krona mistake sank the krona against the euro by 14% before it was spotted. 

Pre-trade risk checks would help to prevent errors, trade limit breaches and even fraudulent trading. And pre-trade risk controls need not be disruptive: ultra-low-latency pre-trade risk management can be achieved by trading institutions without compromising speed of access. One option is a low-latency "risk firewall" with complex event processing at its core, which can be benchmarked in microseconds.

With a real-time risk solution in place, an order can enter through an order management system, be run through the risk hurdles and checks, and leave for its destination a few microseconds later. The benefits of being able to proactively monitor trades before they hit an exchange, ECN or FX platform far outweigh the microscopic added latency. They include catching fat-fingered errors, preventing trading limits from being breached, and even warning brokers and regulators of potential fraud – all of which cost brokers, traders and regulators money.
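
To make that order flow concrete, here is a rough sketch in Python of the kind of checks such a "risk firewall" might apply between the order management system and the venue. The limits, field names and the Order class are assumptions made up for the example; a production system built on a CEP engine would express the same rules as event patterns evaluated in microseconds.

```python
from dataclasses import dataclass

@dataclass
class Order:
    trader: str
    symbol: str
    qty: int
    price: float

class RiskFirewall:
    """Pre-trade checks applied to every order before it leaves for the venue."""

    def __init__(self, max_order_qty, price_collar_pct, max_trader_notional):
        self.max_order_qty = max_order_qty          # e.g. 100_000 shares
        self.price_collar_pct = price_collar_pct    # e.g. 0.05 = 5% away from last price
        self.max_trader_notional = max_trader_notional
        self.trader_notional = {}                   # running notional exposure per trader

    def check(self, order, last_price):
        """Return (accepted, reason); only accepted orders update exposure."""
        if order.qty > self.max_order_qty:
            return False, "order size exceeds limit (possible fat finger)"

        if abs(order.price - last_price) > self.price_collar_pct * last_price:
            return False, "price outside collar around last traded price"

        exposure = self.trader_notional.get(order.trader, 0.0) + order.qty * order.price
        if exposure > self.max_trader_notional:
            return False, "trader notional limit would be breached"

        self.trader_notional[order.trader] = exposure
        return True, "accepted"

# Example: a fat-fingered order is rejected before it reaches the venue
firewall = RiskFirewall(max_order_qty=100_000, price_collar_pct=0.05,
                        max_trader_notional=50_000_000)
print(firewall.check(Order("trader_a", "XYZ", 5_000_000, 10.0), last_price=10.0))
```

An accepted order is forwarded on to the exchange, ECN or FX venue; a rejected one is bounced back to the order management system with the reason, and an alert can be raised for compliance.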

Tuesday, July 13, 2010

CFTC Launches Technology Advisory Committee

Posted by John Bates

Yesterday the CFTC, the regulator in charge of Futures and Options markets, announced details of a new Technology Advisory Committee (TAC), chaired by the very capable Commissioner Scott O’Malia. See article here:


http://www.reuters.com/article/idUSN126207520100712


I am absolutely delighted to be included in the group of experts that the CFTC has called together to form the TAC. I am joined by an extraordinary group of some of the industry's top executives from banks, brokers, trading firms, exchanges and clearing firms as well as some very impressive academics. On Wednesday, July 14th (tomorrow as I write) we will meet to discuss the impact of high frequency and algorithmic trading on the markets, including whether algorithms may be implicated in the May 6th 'flash crash'. From this, we’ll discuss what recommendations we have for regulation of and/or best practices for algorithmic and high frequency trading.


High frequency and algorithmic trading are essential for efficient execution and alpha generation in a complex, multi-asset, fast-moving world. However, there are a number of accusations that have been made against these forms of trading, including that they may aggravate volatility and may even have caused the ‘flash crash’. I believe evidence from the TAC participants will exonerate the accused.


I am hoping that our meetings will result in solutions that not only head off future 'flash crashes', but also help exchanges, banks and brokers to better monitor and police trades. The proactive use of real-time monitoring systems can alert regulators to problems before they become a crisis. Monitoring technology can 'see' major price and volume spikes in particular instruments, how often they happen, perhaps even why, and whether a pattern in market behavior caused them. It can also indicate how much trading is potentially market abuse; insider trading, for example, might be detected by correlating unusual trading activity with news releases and market movements. (The FSA, for example, thinks that around 30% of trading around acquisitions involves insider dealing.)
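
As an illustration of that kind of correlation, the sketch below flags instruments that moved sharply in the run-up to a news release – the classic insider-dealing footprint. The window length, move threshold and data structures are assumptions made for the example; a real surveillance system would also weigh volumes, order flow and the accounts doing the trading.

```python
from datetime import timedelta

SUSPICIOUS_MOVE = 0.03          # 3% pre-announcement price move (illustrative)
LOOKBACK = timedelta(hours=24)  # window inspected before each announcement (illustrative)

def flag_pre_news_moves(news_events, price_history):
    """news_events: list of (symbol, announce_time);
    price_history: dict of symbol -> time-sorted list of (time, price)."""
    alerts = []
    for symbol, announce_time in news_events:
        window = [price for t, price in price_history.get(symbol, [])
                  if announce_time - LOOKBACK <= t < announce_time]
        if len(window) < 2:
            continue
        move = (window[-1] - window[0]) / window[0]
        if abs(move) >= SUSPICIOUS_MOVE:
            alerts.append((symbol, announce_time, move))   # candidate for closer inspection
    return alerts
```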


It is now possible to apply high-frequency techniques not just to trading, but also to market monitoring, surveillance and pre-trade risk checks – for regulators, exchanges and brokers. The technology is out there (with proven approaches built on next-generation platforms such as CEP) and it needn't be expensive. The CFTC's TAC is a positive step in the right direction. I look forward to the meeting and will let you know how it goes! Follow me on Twitter @drjohnbates where I'll Tweet when possible.

Wednesday, June 30, 2010

What do you do with the drunken trader?

Posted by John Bates

The news that Steven Perkins, a (former) oil futures broker in the London office of PVM Oil Futures, has been fined 72,000 pounds ($108,400) by the FSA and banned from working in the industry is no surprise. See the article here:


http://www.telegraph.co.uk/finance/newsbysector/energy/oilandgas/7862246/How-a-broker-spent-520m-in-a-drunken-stupor-and-moved-the-global-oil-price.html.


It could have been worse, given that the broker, after a few days of heavy drinking, took on a 7 million barrel long position in crude oil in the middle of the night. The fine seems minuscule given that the incident cost PVM somewhere in the vicinity of $10 million after the $500+ million position was unwound.


The surprising thing about this incident is that it happened at all. Perkins was a broker, not a trader. He acted on behalf of traders, placing orders on the Intercontinental Exchange among other places. That he could go into the trading system and sneak through 7 million barrels without a customer on the other side is unbelievable.


Heavy drinking is practically a job requirement in the oil industry, my sources tell me, so this kind of thing could be a real issue going forward. As algorithmic trading takes hold in the energy markets, trading may approach the ultra high speeds seen in equities markets.  This is a recipe for super high speed disaster, unless there are proper controls in place - especially if there were a way for the broker or trader in question to enrich himself in the process.


One powerful way to prevent this kind of accident or fraud is the use of stringent pre-trade risk controls. The benefits of being able to proactively monitor trades include catching "fat-fingered" errors, preventing trading limits from being breached, and even warning brokers and regulators of potential fraud – all of which cost brokers, traders and regulators money. The PVM incident is a good example of that cost.


Ultra-low-latency pre-trade risk management can be achieved by brokers without compromising speed of access. One solution is a low-latency "risk firewall" with complex event processing at its core, which can be benchmarked in the low microseconds. Errors can be caught in real time, before they reach the exchange – heaving that drunken trader right overboard, and his trades into the bin.
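
As a crude illustration, here is the kind of rule set that might have caught the PVM trades: flag any broker order placed outside desk hours, larger than a sensible clip size, or lacking a matching client instruction. The checks and thresholds are illustrative assumptions of my own, not a description of PVM's (or anyone else's) actual controls.

```python
from datetime import datetime, time

DESK_HOURS = (time(7, 0), time(18, 0))   # illustrative trading-hours window
MAX_CLIP_BARRELS = 250_000               # illustrative per-order size limit

def vet_broker_order(order_time: datetime, barrels: int, client_order_id):
    """Return the reasons to block a broker's crude-oil order (an empty list means it passes)."""
    reasons = []
    if not (DESK_HOURS[0] <= order_time.time() <= DESK_HOURS[1]):
        reasons.append("order placed outside desk hours")
    if barrels > MAX_CLIP_BARRELS:
        reasons.append("order size exceeds desk clip limit")
    if client_order_id is None:
        reasons.append("no matching client instruction - possible unauthorised proprietary position")
    return reasons

# A 7-million-barrel order at 2am with no client behind it would trip all three checks
print(vet_broker_order(datetime(2009, 6, 30, 2, 0), 7_000_000, None))
```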


Monday, June 14, 2010

Rogue Trading Below the Radar

Posted by John Bates

Jerome Kerviel, the trader who allegedly lost Societe Generale nearly 5 billion euros, went on trial in Paris on Tuesday, June 8th. The bank alleges that Kerviel took "massive fraudulent directional positions" in 2007 and 2008, far beyond his trading limits.

It is interesting to note that Kerviel was not only experienced on the trading floor, but he also had a background in middle office risk management technology. It may have been this knowledge that enabled him to manipulate the bank's risk controls and thus escape notice for so long.

Still, it is perplexing that fraud on such a scale can go on without detection for so long, even if Kerviel did have an insider's knowledge of the firm's risk management systems. Internal risk controls are not something that a financial firm can take for granted, left to run unchecked or unchanged for months or years.

The detection of criminal fraud or market abuse is something that must happen in real time, before any suspicious behaviour has a chance to lose a firm money or to move the market. Pre-trade risk management is paramount, with trading limits specified and checked in real time. Internal controls should be monitored for possible manipulation, again in real time. The good news is that the technology does exist, in the form of real-time surveillance software that can analyse transactions millisecond by millisecond.
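
To illustrate the real-time limit-checking point, here is a minimal sketch that keeps a running net exposure per trader as fills stream in and raises an alert the moment a limit is breached, rather than waiting for an end-of-day batch report. The limit values and event fields are assumptions made for the example, not details of any bank's actual controls.

```python
from collections import defaultdict

class PositionMonitor:
    """Streams fills and alerts the moment a trader's net exposure breaches a limit."""

    def __init__(self, limits):
        self.limits = limits              # trader -> maximum absolute net notional
        self.net = defaultdict(float)     # trader -> signed net notional, updated per fill

    def on_fill(self, trader, side, qty, price):
        self.net[trader] += qty * price if side == "BUY" else -qty * price
        limit = self.limits.get(trader, 0.0)
        if abs(self.net[trader]) > limit:
            print(f"ALERT: {trader} net exposure {self.net[trader]:,.0f} "
                  f"exceeds limit {limit:,.0f}")
```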

Financial institutions need to start looking inward to improve standards, regardless of current regulation. Otherwise the culture of greed and financial gain at all costs will encourage more and more Kerviels.

Tuesday, April 27, 2010

Monitoring and surveillance: the route to market transparency

Posted by Giles Nelson

Again this week, the capital markets are under the spotlight, with the SEC and Goldman standoff. Just a few weeks ago, the FSA and the UK Serious Organised Crime Agency were making multiple arrests for insider trading. Earlier this year Credit Suisse was fined by the New York Stock Exchange for one of its algorithmic trading strategies damaging the market. Still, electronic trading topics such as dark pools and high-frequency trading are being widely debated. The whole capital markets industry is under scrutiny like never before.

Technology can't solve all these problems, but one thing it can do is help provide much more market transparency. We're of the view that to restore confidence in capital markets, organisations involved in trading need a much more accurate, real-time view of what's going on. In this way, issues can be prevented, or at least identified much more quickly. I talked about this recently to the Financial Times, here.

Last week at the Tradetech conference in London, Progress announced the release of a second-generation Market Monitoring and Surveillance Solution Accelerator. This is aimed at trading organisations that want to monitor trading behaviour, whether to ensure compliance with risk limits, for example, or to spot abusive patterns of trading. Brokers, exchanges and regulators are particularly relevant, but buy-side organisations can also benefit from it. Previously this solution accelerator used Apama alone. Now it has been extended to use our Responsive Process Management (RPM) suite, which includes not only Apama but also Savvion Business Process Management, giving the accelerator powerful alerting and case management capabilities. We know that monitoring and surveillance in capital markets is important now, and believe it will become more so, which is exactly why we've invested in building out the product. You can read the take on this from the financial services analyst Adam Honore here, and more from Progress about the accelerator and RPM. A video on the surveillance accelerator is here.

As all this is so relevant at the moment and Tradetech is the largest trading event of its kind in Europe (although very equity focused), we thought we'd conduct some research with the participants. We got exactly 100 responses in one day (which made calculating the percentages rather a breeze) to a survey that asked about attitudes to European regulation, high-frequency and algorithmic trading, and dark pools. Some of the responses relating to market monitoring and surveillance are worth stating here. 75% of respondents agreed with the premise that creating more transparency with real-time trading monitoring systems was preferable to the introduction of new rules and regulations. 65% believe that European regulators should be sharing equity trading information in real time. And more than half believe that their own organisation would support regulators having open, real-time access to information about the firm's trading activity. To me, that's a pretty strong sign that the industry wants to open up, rather than be subjected to draconian new rules.

There will be substantial changes to the European equity trading landscape in the coming year. There will be post-MiFID regulatory change by the European Commission, acting on recommendations from the Committee of European Securities Regulators, who are taking industry evidence at the moment. Their mantra, as chanted last week, is "transparency, transparency, transparency". Let's hope that this transparency argument is expressed in opening up markets to more monitoring, rather than taking the perhaps politically expedient route of outlawing certain practices and restricting others.

Wednesday, April 21, 2010

Observations from Tradetech 2010

Posted by Giles Nelson

Day one of Tradetech Europe 2010 has nearly finished. I won't be here tomorrow, so here are some thoughts and take-aways from today's event.

It's fair to say that Tradetech is the premier European equities trading and technology event, and thus very relevant for Progress' business in capital markets, particularly customers using Apama. Progress has a substantial presence as always. It's a good event to meet brokers, hedge funds, exchanges and pretty much every one within the industry. Lots of old friends are here every year. Regarding the event itself, it's pretty well attended considering the recent issues with volcanic ash. It usually takes place in Paris, but I'm sure the organisers were pleased that they chose London this year as the London contingent was able to attend without disruption.

This year's big theme really seems to be market structure and regulation. In the third year after MiFID, which brought competition into European equity markets, and after the credit crunch, questions about how the market is working, the influence of alternative venues such as dark pools, and how high-frequency trading is affecting the market are front of mind.

What's interesting is how some things stay the same. Richard Balarkas, old Tradetech hand and CEO of Instinet Europe, talked about trading liberalisation in the late 19th and early 20th century. Then, vested interests were complaining about the rise of "bucket shops", which gave access to trading on the Chicago Board of Trade, via telegraph, to people who wouldn't previously have traded. In the view of some at the time, this led to speculation and "gambling". Regulators were wrestling with the fact that only 1% of CBOT trades resulted in actual delivery of goods – the rest were purely financial transactions and therefore arguably speculative. This reminds me of the current debate around the "social usefulness" of high-frequency trading.

European equities trading has changed a lot. Vodafone, a UK-listed stock, now has only about 30% of its average European daily volume traded on the London Stock Exchange (LSE). The rest is traded on alternative trading venues across Europe. However, Xavier Rolet, CEO of the LSE, believes that there's a long way to go. He stated that "the European equities market remains anaemic when compared to the US". Volumes, adjusted for relative market capitalisation, are about 15% of those in the US.

Regulation of European markets is a thorny issue. Regulation is fragmented, as is the market itself. CESR – the Committee of European Securities Regulators, the nearest Europe has to a single regulator – is taking evidence on a whole range of issues and will recommend a set of reforms to the European Commission in July this year. These recommendations will relate to post-trade transparency, information quality, and enhanced information about systematic internalisers and broker crossing systems. CESR is also looking at other issues such as algorithmic trading and co-location. Legislation will follow towards the end of 2010.

Equity markets are in a sensitive place. There's still more deregulation to do, more competition to be encouraged and yet, with sentiment as it is, regulators may decide to introduce more rules and regulations to prevent this taking place. The CESR proposals will be about "transparency, transparency, transparency" - as part of this we believe that more real-time market monitoring and surveillance by all participants is key to bringing back confidence in the markets and ensuring that draconian rules don't have to be introduced.

Emerging markets were discussed in one session, with Cathryn Lyall from BM&FBovespa in the UK talking about Brazil in particular. We've seen Brazil become a pretty significant market recently: not only has demand for all Progress products grown substantially, but Apama is now being used by 18 clients for algorithmic trading of both equities and derivatives. Brazil is the gorilla of the Latin American region, accounting for 90% of cash equities and 95% of derivatives business in Latin America, and 90% of Brazilian trading is on exchange. Brazil emerged largely unscathed from the credit crunch, and it has taken only 2-3 years to achieve the level of trading infrastructure that took perhaps 10-15 years to evolve in the US and Europe. More still needs to happen. Although the regulatory regime has an enviable reputation, it is moving slowly. Concerns regarding naked and sponsored access are holding up liberalisation that would lead to DMA and co-located access to the equities market, something which is already in place for derivatives.

So, that's what I saw as highlights from the day. Tradetech seems, still, to be the place the whole industry gathers.

Tuesday, April 20, 2010

Predictions for increased transparency in Capital Markets

Posted by Giles Nelson

  It is my view that one of the most significant causes of the global financial crisis was a lack of transparency in financial markets.  Put simply, that means no one, not regulators or market participants, knew what the size of certain derivatives markets (like credit default swaps) was, who held what positions, or what the consequences of holding positions could be.  If financial reform brings nothing else, it should at least hold banks accountable for the business they conduct, and that means full disclosure and constant monitoring by responsible regulators.  

This action would help provide the basis for preventing future crises. No matter how inventive financial products may become, if regulators have complete and detailed information about financial markets and banks' activities there, better assessments of risk can be made. This means that, if necessary, banks' activities can be reined in through higher capital requirements or similar measures. Simply limiting banks' ability to conduct certain business is a blunt instrument that does not resolve the lack of transparency and will likely hamper economic growth.

Market transparency exhibits itself in many forms. Particularly relevant is the transparency related to electronic trading. I therefore predict that regulators will require banks to implement stronger pre-trade risk mechanisms. Regulators such as the FSA and SEC will ultimately bring in new rules to mitigate, for example, the risk of algorithms 'going mad'. This is exemplified by Credit Suisse, which was fined $150,000 by the NYSE earlier this year for "failing to adequately supervise development, deployment and operation of proprietary algorithms."
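
One simple form such a mechanism could take – this is a sketch with made-up thresholds, not a description of any regulator's requirement or of Credit Suisse's systems – is a per-algorithm message-rate monitor that suspends a strategy whose order and cancel traffic runs away:

```python
import time
from collections import defaultdict, deque

MAX_MESSAGES_PER_MINUTE = 10_000   # illustrative ceiling per algorithm

class AlgoThrottle:
    """Suspends any algorithm whose order/cancel message rate runs away."""

    def __init__(self):
        self.history = defaultdict(deque)   # algo_id -> timestamps of recent messages
        self.suspended = set()

    def allow(self, algo_id, now=None):
        now = time.time() if now is None else now
        if algo_id in self.suspended:
            return False                     # traffic from a suspended algorithm is blocked
        window = self.history[algo_id]
        window.append(now)
        while window and now - window[0] > 60.0:
            window.popleft()
        if len(window) > MAX_MESSAGES_PER_MINUTE:
            self.suspended.add(algo_id)
            print(f"SUSPENDED {algo_id}: {len(window)} messages in the last minute")
            return False
        return True
```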

Furthermore, volumes traded via high frequency trading will increase, although at a much slower pace than last year, and at the same time the emotive debates about high frequency trading creating a two-tier system and an unfair market will die down.

In addition, with regard to mid-market MiFID monitoring, greater responsibility for compliance will be extended from exchanges to the banks themselves. Banks and brokers will soon be mandated to implement more trade monitoring and surveillance technology. There will also be no leeway on dark pools; they will simply have to change, and be mandated to show they have adequate surveillance processes and technology in place. They will also have to expose more pricing information to the market and to regulators.

This year will see a definite shift to an increasingly transparent – and therefore improved – working environment within capital markets. The ongoing development of market surveillance technologies and changes in attitudes to compliance will drive this forward, creating a more open and fairer marketplace for all.

Tuesday, December 22, 2009

My Baby Has Grown Up

Posted by John Bates

I was proud to be recently appointed CTO and head of Corporate Development here at Progress Software http://web.progress.com/en/inthenews/progress-software-ap-12102009.html. But I don't want anyone to take that as an indication that I won't still be involved with event processing – au contraire. Event processing (whether you call it CEP or BEP) is now a critical part of enterprise software systems – I couldn't avoid it if I tried!

But taking a broader role does give me cause to reflect upon the last few years and look back at the growth of event processing and the Progress Apama business. Here are some observations:

  • It’s incredibly rare to have the pioneer in a space also be the leader when the space matures. I’m really proud that Progress Apama achieved that. Our former CEO Joe Alsop has a saying that “you don’t want to be a pioneer; they’re the ones with the arrows in their backs!” Usually he’s right on that one – but in the case of Progress Apama, the first is still the best! Independent analysts, including Forrester and IDC, all agree on it. Our customers agree on it too.
  • It's tough at the top! I had no idea that when you are the leader in a space, many other firms' technology and marketing strategies are based completely around you. I have met ex-employees of major software companies who have told me that there are Apama screenshots posted on the walls of their former firms' development centers – the goal being to try to replicate them or even improve on them. Other firms' marketing has often been based on trying to criticize Apama and say why they are better – so that their company name gets picked up by search engines when people search for Apama.
  • Event processing has matured and evolved. Yes it is certainly used to power the world’s trading systems. But it’s also used to intelligently track and respond to millions of moving objects, like trucks, ships, planes, packages and people. It’s used to detect fraud in casinos and insider trading. It’s used to detect revenue leakage in telecommunications and continually respond to opportunities and threats in supply chain, logistics, power generation and manufacturing. It enables firms to optimize their businesses for what’s happening now and is about to happen – instead of running solely in the rear view mirror.
  • Despite all the new application areas, Capital Markets remains a very important area for event processing. Critical trading operations in London, New York and around the world are architected on event processing platforms. The world's economy is continually becoming more real-time, needs to support rapid change, and now needs to support real-time views of risk and compliance. We recognize the importance of Capital Markets. My congratulations to Richard Bentley, who takes on the mantle of General Manager of Capital Markets to carry on Progress Apama's industry-leading work in this space. With his deep knowledge and experience of both Apama and Capital Markets, Richard is uniquely placed to carry on the solutions-oriented focus that has been the foundation of Progress Apama's success.
  • Even in a terrible economy, the value of event processing has been proven – to manage costs, prevent revenue leakage and increase revenue.  Progress announced our fourth quarter results today http://web.progress.com/en/inthenews/progress-software-an-12222009.html which saw a double digit increase for Apama and triple digit for Actional. Apama and Actional are used, increasingly together, to gain visibility of business processes without modifying applications, to turn business process activity into events and to respond to opportunities and threats represented by event patterns – enabling the dynamic optimization of business performance.
  • But one thing I do believe: that soon there will be no such thing as a pure-play CEP vendor. CEP is part of something bigger. We’ve achieved the first mission, which is to raise the profile of event processing as a new technique that can solve hitherto unsolvable problems. Now the follow on mission is to ensure event processing finds its way into every solution and business empowerment platform. It is one of a set of key technologies that together will change the world.

I wish everyone Happy Holidays and a successful and profitable 2010 !!!

Monday, March 23, 2009

We're going on Twitter

Posted by Giles Nelson

Louis Lovas and I, Giles Nelson, have started using Twitter to comment on and respond to exciting things happening in the world of CEP (and perhaps beyond, occasionally!).

The intent is to complement this blog. We'll be using Twitter to, perhaps, more impulsively report our thinking. We see Twitter as another good way to communicate thoughts and ideas.

We would be delighted if you chose to follow our "twitterings" (to use the lingo), and we'll be happy to follow you too.

Click here to follow Louis and here to follow Giles (you'll need to sign up for a Twitter account).