« May 2010 | Main | July 2010 »

June 2010

Wednesday, June 30, 2010

What do you do with the drunken trader?

Posted by John Bates

The news that Steven Perkins, a (former) oil futures broker in the London office of PVM Oil Futures, has been fined 72,000 pounds ($108,400) by the FSA and banned from working in the industry is no surprise.
It could have been worse, given that the broker, after a few days of heavy drinking, took on a 7.0 million barrel long position in crude oil in the middle of the night. The fine seems minuscule given that unwinding the $500+ million position cost PVM somewhere in the vicinity of $10 million.


The surprising thing about this incident is that it happened at all. Perkins was a broker, not a trader. He acted on behalf of traders, placing orders on the Intercontinental Exchange among other places. That he could go into the trading system and sneak through 7.0 million barrels without a customer on the other side is unbelievable.


Heavy drinking is practically a job requirement in the oil industry, my sources tell me, so this kind of thing could be a real issue going forward. As algorithmic trading takes hold in the energy markets, trading may approach the ultra-high speeds seen in equities markets. That is a recipe for super-high-speed disaster unless proper controls are in place - especially if the broker or trader in question had a way to enrich himself in the process.


One powerful way to prevent this kind of accident or fraud is stringent pre-trade risk control. Proactively monitoring trades makes it possible to catch "fat fingered" errors, prevent trading limits from being breached, and even warn brokers and regulators of potential fraud - all problems that cost brokers, traders and regulators money when missed. PVM is a case in point.


Ultra-low-latency pre-trade risk management can be achieved by brokers without compromising speed of access. One solution is a low-latency "risk firewall" with complex event processing at its core, which can be benchmarked in the low microseconds. Errors can be caught in real time, before they reach the exchange - heaving that drunken trader right overboard, and his trades into the bin.
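To make the idea concrete, here is a minimal sketch of what such a pre-trade "risk firewall" might look like. The class names, limits and order fields are all hypothetical illustrations, not any vendor's actual API; a production system would sit in the order path and run in microseconds, but the checking logic is the same:

```python
from dataclasses import dataclass

# Hypothetical pre-trade "risk firewall": every order is checked against
# per-trader position and notional limits before it may reach the exchange.

@dataclass
class Order:
    trader: str
    instrument: str
    quantity: int      # barrels; positive = long
    price: float       # dollars per barrel

class RiskFirewall:
    def __init__(self, position_limits, notional_limits):
        self.position_limits = position_limits   # trader -> max barrels
        self.notional_limits = notional_limits   # trader -> max dollar exposure
        self.positions = {}                      # trader -> current net barrels

    def check(self, order: Order) -> bool:
        """Return True if the order may pass through to the exchange."""
        current = self.positions.get(order.trader, 0)
        proposed = current + order.quantity
        if abs(proposed) > self.position_limits.get(order.trader, 0):
            return False  # would breach the position limit: block it
        if abs(proposed) * order.price > self.notional_limits.get(order.trader, 0):
            return False  # would breach the notional (dollar) limit: block it
        self.positions[order.trader] = proposed
        return True

firewall = RiskFirewall(position_limits={"broker1": 100_000},
                        notional_limits={"broker1": 10_000_000})
print(firewall.check(Order("broker1", "WTI", 50_000, 73.50)))     # True: within limits
print(firewall.check(Order("broker1", "WTI", 7_000_000, 73.50)))  # False: the 7m-barrel order is blocked
```

A 7.0 million barrel order submitted in the middle of the night simply never leaves the building.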


Friday, June 25, 2010

Banks & Bullets: Maybe They Dodged One - But They Still Need Some Silver Ones!

Posted by John Bates

By now you’ve probably seen that a deal was reached this morning by the House and Senate on regulation. Some would say it waters down provisions from the tougher Senate bill, limiting rather than prohibiting banks’ ability to trade derivatives and invest in hedge funds. This article describes it as banks “dodging a bullet”: http://www.businessweek.com/news/2010-06-25/banks-dodged-a-bullet-as-u-s-congress-dilutes-trading-rules.html


We applaud the superhuman efforts that the House and Senate put into the new financial regulation bill. However, as I’ve said many times, transparency and consistency are critical to successful regulation in the capital markets. One could be forgiven for fearing that the watering down of the regulations, including the Volcker Rule, may create more havoc rather than increase transparency and consistency.


One thing is for sure – the priority on handling the complexity of real-time risk and surveillance within institutions is going to rise. Risk managers and C-level executives concerned about minimizing risk and maximizing capital will need to view trading positions and limits across the firm, including, where permitted, derivatives that are 'spun out'. Ideally the risks should be aggregated and analyzed in real time, giving the ability to detect and prevent “accidents”. Pre-trade risk management will be increasingly important as firms seek to maintain capital requirements at all times.


A top-down approach to risk, where managers can see the risks across all asset-class silos in a single dashboard view, has gone from a “nice to have” to high on the wish list – but many still wonder whether it is actually possible. Continual monitoring of trades in real time can help prevent trading limits from being exceeded, prevent mistakes and catch market abuse.
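The aggregation step behind such a dashboard is conceptually simple, which is part of why the "single view" is achievable. The sketch below is a hypothetical illustration - the desk names, asset classes and limits are invented - showing per-asset-class roll-up of notional exposure with limit flags, the kind of summary a risk dashboard would refresh in real time:

```python
# Hypothetical firm-wide risk aggregation: positions from per-asset-class
# silos rolled up into a single view, with limit breaches flagged.

positions = [  # (desk, asset_class, instrument, notional_usd)
    ("london-oil",  "commodities", "WTI",    4_200_000),
    ("ny-equities", "equities",    "AAPL",   1_500_000),
    ("ny-equities", "equities",    "SPY",    2_800_000),
    ("london-fx",   "fx",          "EURUSD",   900_000),
]

limits = {"commodities": 5_000_000, "equities": 4_000_000, "fx": 2_000_000}

def firm_wide_view(positions, limits):
    """Aggregate notional exposure by asset class and flag limit breaches."""
    totals = {}
    for _desk, asset_class, _instrument, notional in positions:
        totals[asset_class] = totals.get(asset_class, 0) + notional
    return {ac: (total, total > limits.get(ac, 0)) for ac, total in totals.items()}

for asset_class, (total, breached) in firm_wide_view(positions, limits).items():
    print(f"{asset_class:12s} ${total:>12,}  {'BREACH' if breached else 'ok'}")
```

In this invented example the two equities positions together exceed the $4M equities limit, so the combined line is flagged even though neither desk breached it alone - exactly the cross-silo blind spot a top-down view removes.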

p.s. Many thanks to all the market practitioners who commented that technology can't solve all the problems for regulators, banks and trading venues. I completely agree! But we can go a lot further than what happens right now. Of course, technology is only one of the approaches. Changes in regulation are another, as is increased transparency, improved reporting (e.g. from fax to real-time data!), etc.

Monday, June 14, 2010

Rogue Trading Below the Radar

Posted by John Bates

Jerome Kerviel, the trader who allegedly lost Societe Generale nearly 5.0 billion euros, went on trial in Paris on Tuesday, June 8th. The bank alleges that Kerviel took "massive fraudulent directional positions" in 2007 and 2008, which were far beyond his trading limits.

It is interesting to note that Kerviel was not only experienced on the trading floor, but he also had a background in middle office risk management technology. It may have been this knowledge that enabled him to manipulate the bank's risk controls and thus escape notice for so long.

Still, it is perplexing that fraud on such a scale can go on without detection for so long, even if Kerviel did have an insider's knowledge of the firm's risk management systems. Internal risk controls are not something that a financial firm can take for granted, left to run unchecked or unchanged for months or years.

The detection of criminal fraud or market abuse must happen in real time, before any suspicious behaviour has a chance to lose a firm money or to move the market. Pre-trade risk management is paramount, with trading limits specified and checked in real time. Internal controls should themselves be monitored for possible manipulation, again in real time. The good news is that the technology exists: real-time surveillance software that can analyse transactions by the millisecond.
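As a rough sketch of what such a surveillance rule looks like, consider the toy engine below. The rules, thresholds and desk hours are all hypothetical, and a real system would evaluate thousands of such patterns per millisecond over streaming events, but the shape is the same: each trade event is checked as it arrives, and alerts fire immediately rather than in an overnight batch:

```python
import datetime as dt

# Hypothetical real-time surveillance rule: flag a trader whose net position
# drifts past its limit, or who trades outside normal desk hours, as each
# trade event streams in - rather than discovering it months later.

class SurveillanceEngine:
    def __init__(self, position_limit, desk_hours=(7, 19)):
        self.position_limit = position_limit
        self.desk_hours = desk_hours   # (open_hour, close_hour)
        self.net_position = 0
        self.alerts = []

    def on_trade(self, timestamp: dt.datetime, quantity: int):
        """Process one trade event and raise alerts immediately."""
        self.net_position += quantity
        if abs(self.net_position) > self.position_limit:
            self.alerts.append((timestamp, "LIMIT_BREACH", self.net_position))
        if not (self.desk_hours[0] <= timestamp.hour < self.desk_hours[1]):
            self.alerts.append((timestamp, "OFF_HOURS_TRADE", quantity))

engine = SurveillanceEngine(position_limit=1_000)
engine.on_trade(dt.datetime(2010, 6, 14, 10, 30), 600)  # normal: no alert
engine.on_trade(dt.datetime(2010, 6, 14, 2, 15), 700)   # 2:15am trade that breaches the limit
for alert in engine.alerts:
    print(alert)
```

The second trade raises two alerts at once - a limit breach and an off-hours trade - the moment it happens, which is the difference between a contained incident and a multi-year fraud.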

Financial institutions need to start looking inward to improve standards, regardless of current regulation. Otherwise the culture of greed and financial gain at all costs will encourage more and more Kerviels.

Thursday, June 03, 2010

FSA Loses Insider Trading Case - but more to come...

Posted by John Bates

Today’s acquittals in the London insider trading cases are a big blow to the FSA, but its ability to detect and prosecute market abusers cannot be overlooked. Without the technology to detect trading anomalies, alleged white-collar criminals cannot be prosecuted in the first place.

It’s also clear that the FSA is sending a message to the investment community: shape up or be prepared to pay. The £33.32 million ($48.8 million) fine for JPMorgan is the largest in the FSA’s history. 

As the SEC and CFTC in the US look to adopt market surveillance technology, it will be interesting to see the potential rise in insider trading court cases and in the size of fines in the US.

I think we're going to see a lot more of this type of prosecution around the world. The FSA is currently prosecuting 11 people for alleged market abuse.

As you may have read this week (link here -> http://bit.ly/aslFmV), the FSA is using new technology to crack down on potential market abusers. In the UK, the FSA receives 6m-8m transaction reports daily. The FSA will soon even have a system in place that will automatically alert staff to potential abuse in “real time”. Alexander Justham, the FSA’s director of markets, says the use of such “complex event processing” technology will give the FSA “a more proactive, machine-on-machine approach” to surveillance.

Optimism in the world of financial services regulation

Posted by Giles Nelson

It seems that we’re finally making some progress on making the financial markets function more safely. 

After the “flash-crash” of 6 May, US equity market operators have agreed to bring in coordinated circuit-breakers to avoid a repeat of this extreme event. There is widespread agreement on this. Industry leaders from brokers and exchanges yesterday made supportive statements as part of submissions to the SEC.

Regulators are going public with their use of real-time monitoring technology. Alexander Justham, director of markets at the Financial Services Authority, the UK regulator, told the Financial Times that the use of complex event processing technology will give the FSA “a more proactive machine-on–machine approach” to market surveillance (the FSA is a Progress customer). Other regulators are at least admitting they have a lot of work to do. Mary Schapiro, the SEC chair, believes that the technology used for monitoring markets is “as much as two decades behind the technology currently used by those we regulate”. Scott O’Malia, a commissioner at the Commodity Futures Trading Commission, admitted that the CFTC continues to receive account data by fax, which then has to be manually entered.

The use of real-time pre-trade risk technology is likely to become much more widespread. “Naked” access, where customers of brokers submit orders directly to the market without any pre-trade checks, is likely to be banned. This is an important change: late last year Aite Group, an analyst firm, estimated that naked access accounted for 38% of the average daily volume in US stocks. The SEC is also proposing that regulation of sponsored access is shored up – currently it has evidence that brokers rely upon oral assurances that the customer itself has pre-trade risk technology deployed. The mandated use of pre-trade risk technology will level the playing field and prevent a race to the bottom. Personally I’ve heard of several instances of buy-side customers insisting that brokers turn pre-trade risk controls off, as they perceive that such controls add latency and will therefore adversely affect the success of their trading.
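The latency objection is worth testing rather than asserting. The toy benchmark below - purely illustrative, with an invented limit and check, and far simpler than a production system written in C++ or hardware - times a simple in-process pre-trade limit check, showing the kind of measurement one would make to argue that a well-engineered check adds microseconds, not milliseconds:

```python
import time

# Hypothetical micro-benchmark of a simple in-process pre-trade limit check,
# to illustrate measuring the latency such a check adds to the order path.

POSITION_LIMIT = 100_000
net_position = 0

def pre_trade_check(quantity):
    """Would this order keep the net position within its limit?"""
    return abs(net_position + quantity) <= POSITION_LIMIT

n = 100_000
start = time.perf_counter()
for _ in range(n):
    pre_trade_check(500)
elapsed = time.perf_counter() - start
print(f"average check latency: {elapsed / n * 1e6:.2f} microseconds")
```

Even in interpreted Python the per-check cost is sub-microsecond to low-microsecond on commodity hardware; the point is that "risk checks are too slow" is a measurable claim, not a given.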

The idea of real-time market surveillance, particularly in complex, fragmented markets such as those in the US and Europe, is gaining credence. The SEC has proposed bringing in a “consolidated audit trail” which would enable all orders in US equity markets to be tracked in real time. As John Bates said in his previous blog post, it’s likely that the US taxpayer will not be happy paying the $4B that the publicly funded SEC estimates such a system would need to get up and running. Perhaps the US could look at the way the UK’s FSA is funded. The FSA reports to government but is paid for by the firms it regulates.

As I mentioned in my last blog, our polling in April at Tradetech, a European equities trading event, suggests that market participants are ready for better market monitoring. 75% of respondents to our survey believed that creating more transparency with real-time market monitoring was preferable to the introduction of restrictive new rules.

CESR, the Committee of European Securities Regulators, is currently consulting on issues such as algorithmic trading and high frequency trading. It will be interesting to see the results of their deliberations in the coming months.

I’m so pleased the argument has moved on. This time last year saw a protracted period of vilifying “high frequency trading” and “algo trading”. Now there is recognition of the benefits, as well as the challenges, that high frequency trading has brought to equity markets. Regulators seem to understand that, to prevent both disastrous errors and deliberate market manipulation, it is better to get on board with new technology than to try to turn the clock back to mediaeval times.

New approaches are sorely needed. Yesterday saw the conclusion of another investigation into market manipulation when the FSA handed out a $150,000 fine and a five-year ban to a commodity futures broker.