Giles Nelson

Monday, December 14, 2009

Predictions for 2010

Posted by Giles Nelson

Last week we published some predictions for how capital markets will evolve in 2010. I’d like to say a bit more about them.

Firstly, we predict there will be a big uptake of technology for regulatory compliance and enforcement. Whilst the turmoil of the last 18 months was primarily caused by the over-the-counter credit derivatives market, the effects of increased regulatory scrutiny are being felt throughout the trading industry. The power and authority of regulators has been bolstered, exchanges and alternative trading venues understand there is a greater need to monitor trading activity, and brokers and buy-side firms want to monitor and control their own and their clients’ trading activities better.

There has been a significant debate in the media in the last few months on the merits of high frequency trading and its variants. This started in the specialist trade media, then made it to mainstream news outlets such as the BBC, and it has been deemed sufficiently important to be discussed by members of the US Congress and the British government. The result is pressure on market participants to really up their game as far as trade monitoring is concerned.

So how will technology be used to better enforce regulation and control over trading activity? Let’s start with the liquidity venues – the exchanges, the MTFs, the ECNs and the dark pools. Regulated exchanges in advanced markets generally already do real-time monitoring of trading activity to spot patterns of market abuse. They will need to continue to invest to ensure that they have technology that can scale and is flexible enough to evolve with changing patterns of trading behaviour. In contrast, exchanges in emerging markets often do not have adequate monitoring systems. This will change – we’re seeing substantial interest in Apama for exchange surveillance in less developed markets.

At the other end of the liquidity spectrum, the regulation around dark pools will change. It is likely that limits will be imposed on the proportion of stock that can be traded through dark pools, and operators will need to disclose more information. Furthermore, regulators will insist that dark pool operators have adequate monitoring systems in place – and it won’t be just a paper exercise, they’ll have to prove it.

Brokers will be in a similar position. Each participant in the trading cycle in the UK, for example, has a responsibility to ensure that the market is working fairly. The UK regulator, the FSA, is putting pressure on brokers to show that they have proper trade monitoring technology in place so that customer and internal flow can be understood better.
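To make the trade monitoring point concrete, here is a minimal sketch, in Python rather than in Apama's own event processing language, of one kind of check a surveillance system might run in real time: flagging participants who place a burst of orders and then cancel nearly all of them. The event fields, window and thresholds are illustrative assumptions, not a description of any particular venue's rules.

```python
from collections import defaultdict, deque

# Sketch of one surveillance check: flag participants who place a burst of orders
# and then cancel nearly all of them within a sliding time window (a crude
# "fleeting order" test). A real surveillance system - Apama included - would
# express this declaratively over live order feeds; the event fields, window and
# thresholds here are illustrative assumptions only.

WINDOW_SECS = 60           # length of the sliding window in seconds
MIN_ORDERS = 100           # ignore participants with little recent activity
CANCEL_RATIO_LIMIT = 0.95  # flag if more than 95% of recent orders were cancelled

history = defaultdict(deque)  # trader_id -> deque of (timestamp, action)
flagged = set()               # traders already alerted on

def on_order_event(trader_id, timestamp, action):
    """Feed each order-book event through the check; action is 'new' or 'cancel'."""
    window = history[trader_id]
    window.append((timestamp, action))
    # Evict events that have fallen out of the sliding window.
    while window and timestamp - window[0][0] > WINDOW_SECS:
        window.popleft()
    orders = sum(1 for _, a in window if a == "new")
    cancels = sum(1 for _, a in window if a == "cancel")
    if orders >= MIN_ORDERS and cancels / orders > CANCEL_RATIO_LIMIT and trader_id not in flagged:
        flagged.add(trader_id)
        print(f"ALERT: {trader_id} cancelled {cancels} of {orders} recent orders")

# Example feed: 100 orders placed within a second, 99 of them cancelled shortly after.
for i in range(100):
    on_order_event("trader-9", i * 0.01, "new")
for i in range(99):
    on_order_event("trader-9", 2.0, "cancel")
```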

Let’s move on to hosted services. “Cloud computing” is certainly a term du jour. What does it mean for capital markets? The first thing to say is that cloud computing, as in the provision of hosted software and computing resources, is a very familiar concept in capital markets, even though until recently people may not have used the term “cloud computing”. Anyone using a Reuters market data feed, passing orders over FIX to a broker, or accessing a single-bank foreign exchange portal is accessing services in the cloud. In fact, electronic trading relies to a great extent upon cloud services. In 2010, however, more hosted services of a richer functional nature are going to become available. Instead of just building blocks – the market data, the DMA access and so on – more services will become available for algorithmic trading and risk management. Brokers do offer hosted algo services already, but they are broker-specific. An example of a hosted algo service is the one we launched with CQG recently. These services will mature and broaden in scope. They are invaluable to mid-sized trading organisations which can’t, or don’t want to, build a whole range of systems themselves.

Lastly, our prediction about emerging markets. We’ve seen significant growth this year in demand for Apama in Brazil, India and China. Brazil particularly, because of continued economic growth and market liberalisation, has led the way (for example, Progress now has 15 customers using Apama in Brazil). India and China are getting there. They have further to go in market liberalisation to fuel the demand for algorithmic trading, but to attract extra investment and liquidity to their domestic markets they’ll be left with little choice. Hong Kong is an exception – algorithmic trading is used extensively there by both global and regional players, and it provides a window onto developed markets that mainland China can learn from.

Capital markets will evolve quickly in 2010, as in every year. That's what makes it such an interesting area to work in.

Thursday, November 05, 2009

In defence of high frequency trading

Posted by Giles Nelson

The high frequency trading (HFT) debate seems to have entered a new and worrying phase in the UK. On Tuesday this week in an interview with the BBC, Lord Myners, the UK’s financial services minister, warned that high frequency trading had “gone too far” and that share ownership had “now lost its supporting function for the provision of capital to business”. (You can find the original interview here and reports of it in the Financial Times and The Independent yesterday).

 Mary Schapiro, head of the SEC, signalled at the end of October that a number of electronic trading areas were going to be looked into – naked access (where a broker sponsors a firm to have direct electronic access to an exchange), dark pools and high frequency trading.

It does seem that, on both sides of the Atlantic, governments and regulators are steeling themselves to act and softening the markets up to accept that electronic trading might have some limits.

The concern is that governments and regulators will come down too hard on electronic trading, and that the benefits it gives investors will be damaged.

It all started with the flash order issue in the US a few months ago. Commentators were linking various different, although related, issues together in an inappropriate way. Flash orders were sometimes viewed as synonymous with HFT, and both were sometimes reported as forms of market abuse. All three topics are quite different. In my opinion, there are legitimate questions over the use of flash orders, and a proposal to ban them is now being considered.

Dark pools, where large blocks of stock are traded off exchange to minimise market impact, have been the next target. There are, again, legitimate issues. Dark pools, by their very nature, do not have good price transparency. Regulators have become concerned because more and more trading is going through dark pools – some estimates put it at between 10% and 30% in Europe and the US. This uncertainty is part of the problem itself: no one really knows what proportion of trading dark pools are taking. If a significant proportion of the market has no price transparency then the notion of a fair market for all is undermined. Regulators are looking at this, and it’s likely that they will force dark pool operators to disclose far more information about what is being traded than they do currently. The SEC is considering limiting the proportion of a stock that can be traded through dark pools to a small percentage.

These legitimate issues, however, risk skewing the whole HFT debate towards the conclusion that “HFT is bad”.

What people are now describing as HFT – the very fast and frequent, computer-assisted trading of, usually, equities – is an evolution of something that has been happening in the market place for at least the last 10 years. In this time electronic trading has proliferated, not just in equities but also in other asset classes such as derivatives, bonds and foreign exchange. Far more venues for trading have been created. There are now many places where a company’s stock can be traded, both in the US and Europe. This has brought competition and choice. Prices have been lowered, improving access for retail investors. Spreads have narrowed. Arbitrage opportunities are harder to find, which means that market information is disseminated faster, which, in turn, means that price transparency has improved. Because there is more trading going on, there is more liquidity available, which also means keener prices.

A key part of the HFT trend has been the use of algorithmic trading (the most prevalent use of complex event processing technology). Algo trading models fall broadly into one of two camps: alpha seeking, where market prices are examined to find a trading opportunity that will make money, and execution, where orders are usually split up into smaller parts and then traded automatically in the market in an intelligent way to find good prices and to ensure those prices are not overly influenced by the trades being made themselves. For either type of model it can be very useful to react very quickly to market information, whether to take advantage of a price discrepancy or to quickly pick up liquidity at a good price. Algorithmic trading is enormously beneficial for those who use it, and its use is not limited to specialist hedge funds. Most algorithmic trading uses execution models that find liquidity and good prices, help minimise market impact and, lastly, significantly increase a trader’s productivity. Instead of wasting time executing several simple orders in the market over the course of many minutes or hours, the trader can simply ask a machine to do it. The trader can then spend time either covering more of the market (useful in straitened economic times) or actually delivering real value to a client.
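To illustrate the execution side, here is a minimal sketch of time-slicing a parent order, the simplest form of the order splitting described above. It is a toy, TWAP-style schedule built on assumed quantities and timings, not a description of how any particular broker's algos work.

```python
# Minimal sketch of the "execution" style of algo described above: split a parent
# order into equal child slices released at regular intervals (a simple TWAP-like
# schedule). Real execution algos adapt slice sizes to live market conditions;
# the quantities and timings here are illustrative assumptions.

def twap_schedule(total_qty, duration_secs, num_slices):
    """Return (offset_seconds, quantity) pairs for each child order."""
    interval = duration_secs / num_slices
    base = total_qty // num_slices
    remainder = total_qty - base * num_slices
    schedule = []
    for i in range(num_slices):
        qty = base + (1 if i < remainder else 0)  # spread the rounding remainder
        schedule.append((round(i * interval), qty))
    return schedule

# Example: work 10,000 shares over 30 minutes in 12 slices.
for offset, qty in twap_schedule(10_000, 30 * 60, 12):
    print(f"t+{offset:>4}s  send child order for {qty} shares")
```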

Algorithmic trading and HFT have brought very significant benefits. It is these benefits that must not be threatened.

Trading has always involved cunning and guile, whether human or computer based. Competition has always existed over who has the best traders and trading systems. Organisations investing in ultra low-latency infrastructure to ensure orders arrive at an exchange in microseconds (not nanoseconds, as sometimes claimed, by the way – light travels 30cm in 1 nanosecond, which isn’t far enough to be very useful) are part of this competitive world. Competition leads to innovation and it is this innovation that has brought so many of the benefits described above. Computer-based models can sometimes be used abusively. There are many forms of market abuse that regulators and exchange operators look for. Some exchanges and regulators have been investing in real-time surveillance technology (Progress counts Turquoise and the UK Financial Services Authority as customers using Apama) to ensure that they can spot abusive patterns of behaviour quickly.

We can’t start slowing trading down. We can’t go backwards and put the electronic trading genie back in the bottle. We don’t want to lose all the benefits that have come. Rather, regulators and exchanges should concentrate on ensuring maximum transparency in how markets operate and ensure that those attempting to maliciously abuse the markets are dissuaded or caught.

Monday, October 19, 2009

Progress Apama Announcing Latest Release 4.2

Posted by Apama Audio

As a follow-up to the Louie Lovas blog posting on October 16th, this podcast captures a discussion between David Olson and Giles Nelson on Apama 4.2 features.


Friday, October 09, 2009

Developing Event Processing Applications

Posted by Apama Audio

Listen to this podcast to hear Chris Martins and Giles Nelson discuss development of event processing applications.


Wednesday, October 07, 2009

Business Events and Operational Responsiveness - our research

Posted by Giles Nelson

Yesterday, we published a press release on some research that we commissioned from an independent research firm. I wanted to give a bit more background to the research and how we intend to use it.

Our intent in doing this research was twofold:

(a) To discover something new about the markets that Progress operates in and validate some of our own beliefs about the market (or dispel them).

(b) To gather some interesting and relevant information to act as talking points around the things we think are important for our customers and prospective customers, as well, of course, as being commercially relevant to us.

We commissioned the research company Vanson Bourne to do this research and whilst we worked with them on the scoping of it, it was left entirely to them to execute on that scope.

We wanted to hear from end-users so a range of questions were posed to 400 organisations in Europe and the US in three industries - telecommunications, energy generation and logistics. No vendors, analysts or systems integrators were approached.

The questions were all around the theme of "operational responsiveness" - how good are firms at monitoring their operations, identifying issues with process execution, interacting with their customers, extracting and integrating information etc. In particular how good are firms at dealing with the business events which are flowing around, both internally and externally, and how good are they at acting on them in a timely fashion?

Why did we pick these three verticals? Firstly, we couldn't cover everybody and we wanted to go to more companies in a few verticals rather than go very broad. Secondly, we believe that these three verticals are the most interesting when it comes to the demands being placed upon them to cope with business events (Financial services is another obvious one but we know quite a lot about the demands in that industry already). Telecommunications firms are very dependent upon IT to differentiate their services; logistics companies are using more and more technology to track goods, trucks, ships etc. to streamline and automate their operations; energy producers are having to rapidly plan for the introduction of smart metering.

We're still digesting the results. But a few are worth highlighting here. Social networking is creating a significant challenge for all organisations in dealing with customer feedback - consumers expect instant feedback to their interactions. Organisations aspire to more dynamic and real-time pricing of goods and services to increase their competitiveness and maintain margins. And companies struggle with achieving a holistic view of how their processes are operating, both to operationally identify issues before they become expensive to fix or affect customer services, and to identify ways in which processes can be shortened.

We'll be talking more about the research results soon, both qualitatively and quantitatively.


Monday, September 28, 2009

Good application development practices are critical for EP project success

Posted by Giles Nelson

Mike Gualtieri, an analyst at Forrester Research, takes application developers to task for not knowing what the business wants.

This has never been more true than in event processing. Without understanding the business events that are available to them and the needs of end-users, developers are not going to get anywhere. What do I mean by a business event? It's simply an event containing business-orientated, rather than technically-orientated, information - a stock tick, a weather report, a GPS coordinate etc. It can be pretty much anything, but it must have some value and meaning to the business.
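For illustration, here are a couple of business events expressed as simple data structures (a Python sketch; in Apama they would typically be declared as event types). The fields are assumptions, chosen only to show that each event carries business-level meaning.

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative only: two business events of the kind described above. Each carries
# business-level meaning (an instrument and its price, a vehicle and its position)
# rather than low-level technical detail.

@dataclass
class StockTick:
    symbol: str
    price: float
    timestamp: datetime

@dataclass
class VehiclePosition:
    vehicle_id: str
    latitude: float
    longitude: float
    timestamp: datetime

print(StockTick("XYZ", 101.25, datetime.now()))
```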

In addition, developers need to understand what the business wants to achieve - is it about simply maintaining an up-to-the-moment view on a goods ordering process, using appropriate charts and graphs to visualise the information, or is it about automating decisions - pricing a product dynamically or re-routing a vehicle based upon traffic conditions? Often, an end-user won't care about what technology is being used to deliver the information or make the decisions they want - and nor should they. As Mike Gualtieri correctly points out, a good user experience is everything; the technology itself is at best secondary, and often irrelevant.

My advice to developers is to start small and focussed, get something done and be prepared to iterate quickly. Try to define business requirements to begin with, but be flexible about changing them (within reason). If developers don't do this, and instead work in a vacuum, success won't come.

Friday, August 21, 2009

State of the event processing market, August 2009

Posted by Giles Nelson

The year so far has seen interest in event processing gather momentum. It is moving beyond niche areas – electronic trading, for example – and is being recognised as a general method of building more responsive systems and applications.

There's evidence for this. Two influential analyst firms have published event processing reports this year: IDC in February and, most recently, Forrester, who published their Wave in July. This means the software market more generally recognises event processing as a distinct discipline. If you follow John Rymer or Mike Gualtieri on Twitter, the two Forrester analysts responsible for the Wave, you'll now see them reporting frequently on customer enquiry calls regarding event processing. Joe McKendrick, an influential IT commentator, yesterday described event processing as a bona fide market. In addition, we've just got back the results from a global study into the uptake of real-time technology. We'll be talking more about this soon, but here's an interesting snippet – 86% of companies reported that they had critical business events they wanted to monitor in real time. If that's not evidence that the market for event processing will accelerate in future, I don't know what is. And finally, and perhaps most concretely, the number of vertical industries deploying event processing has increased. In addition to a whole range of use cases in financial services, Progress now has customers using Apama in telecommunications, transport and logistics, manufacturing, retail and travel.

Why is this happening? Firstly, the benefits of event processing are becoming better understood by a wider group of people, in both the business and technology domains. Secondly, organisations are having to act because of the ever-increasing amounts of data being thrown at them. An example of this latter point is Royal Dirkzwager, a maritime information supplier, which has taken an event processing approach in part because its data rates are increasing from tens of events per second to tens of thousands a second. Finally, organisations generally are having to become more responsive – whether to detect errors or risks in existing business processes or to interact with customers more intelligently. 3 Italia, the mobile telco, provides an example here. The end-of-day report, describing what happened rather than what is happening, is simply not good enough.

It is this issue of responsiveness that is really the key driver – the reason why an organisation can commit dollars to a project. Becoming more proactive enables you to spot problems more quickly (and therefore repair issues more cheaply), improve customer service (by, for example, reacting more quickly to customer interactions with a web site or wireless network) and be more competitive (for example, by pricing products more dynamically, taking customer propensity and changing market information into account). In every customer I can think of, it is one or more of these issues that has driven them to buy: an understanding of the impact of processing crucial events in the business – of doing business event processing.

The market has yet to become mainstream. It is still one that is about innovation rather than commoditization. It still takes visionaries and light bulb moments for the applications of event processing to be identified. But it is rapidly changing. It won't be too long before the full force of events becomes widely recognised.

Sunday, May 10, 2009

CEP deployed in 3 Italia, wireless telco

Posted by Giles Nelson

3 Italia is a third-generation mobile telco in Italy, a division of Hutchison Whampoa, which holds a number of 3G licences worldwide. 3 is in the process of deploying Apama to achieve better real-time analysis of its network and business systems.

The first area where 3 Italia has deployed Apama is in the monitoring of core network elements and business support systems, such as customer relationship and billing systems. By correlating events from both network and business systems, 3 can better understand issues that are occurring which might affect customer service or 3’s ability to charge correctly for services used.

So why do they need Complex Event Processing to do this?

Firstly, telcos are highly sophisticated technical environments. IT is, of course, entirely at the centre of how a telco operates. Even 3, a “modern”, wireless-only telco with none of the complexity demanded of legacy, wired environments, has many different operational and business systems. Many of these are monitored on a continual basis to ensure that the network and services are running correctly. However, the operational monitoring solutions used to do this operate in a restrictive way. Typically they only monitor a specific network or business system (billing or CRM, for example). Some offerings are able to monitor multiple systems, but the analysis and presentation of information from those systems is not correlated. So a holistic view of what is happening as network and business services execute is simply not possible.
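As a minimal sketch of what correlating across those silos can look like, consider matching a network fault and a billing error for the same customer within a short time window. The event names, fields and five-minute window below are illustrative assumptions, and the Python stands in for what would, in practice, be an Apama event pattern over live feeds:

```python
from collections import defaultdict

# Minimal sketch of cross-system correlation: if a network fault and a billing
# error are seen for the same customer within a short interval, raise one
# customer-centric alert rather than two unrelated ones. The event names, fields
# and five-minute window are illustrative assumptions.

WINDOW_SECS = 300
recent = defaultdict(dict)  # customer_id -> {event source: timestamp}

def on_event(source, customer_id, timestamp):
    """source is, for example, 'network_fault' or 'billing_error'."""
    recent[customer_id][source] = timestamp
    seen = recent[customer_id]
    if "network_fault" in seen and "billing_error" in seen:
        if abs(seen["network_fault"] - seen["billing_error"]) <= WINDOW_SECS:
            print(f"ALERT: correlated network and billing issue for customer {customer_id}")
            recent.pop(customer_id)

# Example feed (timestamps in seconds):
on_event("network_fault", "cust-42", 1000)
on_event("billing_error", "cust-42", 1120)
```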

Secondly, there’s the issue of scale. The amount of traffic in a telco is enormous. A typical telco with a few million subscribers will generate billions of events per day. Yes, some of these events can be interpreted, analysed and aggregated, but only at a low level within a telco’s architecture, where limited value is gained. Analysing this information in a flexible, general-purpose way in anything approaching real time is impossible using conventional technologies such as databases or business intelligence systems.

So this is where CEP comes in at 3. Only by bringing information from multiple parts of the organisation together and correlating and analysing it in real time can 3 obtain a more complete, customer-centric view. 3 has built a large number of end-user dashboards in Apama to monitor various aspects of its billing and service delivery. In total, several hundred key performance indicators are being tracked, allowing deviations from norms to be identified and predictions made about when service level agreements would be breached. It is the operational business users who monitor these dashboards, giving them the personalised, real-time view of 3’s business that they need. Apama gives 3 the flexibility to introduce new monitoring scenarios and enhance existing ones straightforwardly, ensuring that the operations department stays on top of, and influences, changes and enhancements to the services that 3 offers.
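As a sketch of how one such KPI check might behave, here is a rolling-baseline deviation monitor. The KPI name, window length and 20% threshold are illustrative assumptions, not 3's actual configuration, and in the real deployment this logic is expressed in Apama rather than hand-rolled code:

```python
from collections import deque

# Minimal sketch of one KPI check: keep a rolling baseline for a metric and flag
# readings that deviate from it by more than a set percentage. The KPI name,
# window size and threshold are illustrative assumptions only.

class KpiMonitor:
    def __init__(self, name, window=100, deviation_pct=20.0):
        self.name = name
        self.window = deque(maxlen=window)
        self.deviation_pct = deviation_pct

    def update(self, value):
        if len(self.window) == self.window.maxlen:
            baseline = sum(self.window) / len(self.window)
            change = abs(value - baseline) / baseline * 100 if baseline else 0.0
            if change > self.deviation_pct:
                print(f"ALERT: {self.name} at {value:.1f}, {change:.0f}% off baseline {baseline:.1f}")
        self.window.append(value)

monitor = KpiMonitor("billing_success_rate")
for reading in [99.1] * 100 + [70.0]:   # a sudden drop after a steady run
    monitor.update(reading)
```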

Looking beyond the monitoring of operational business systems, CEP can provide other benefits. Telcos operate within a very competitive, innovative business environment which is changing rapidly because of trends such as ubiquitous wireless access, smartphones and Web 2.0. Telcos do not want to be simple utility providers – giving network access and nothing more. Delivering value-added services and improving customer loyalty are key aims. Understanding how customers are using network services right now provides real benefits, because customers change the way they use services over time. Consider a user who starts sending significant numbers of international SMS messages. By detecting this trend as it happens, the telco can recognise that the user might benefit from a different tariff and interact with them as soon as the behaviour pattern is detected – perhaps offering a bundle of extra international SMS messaging for a small additional fee. By responding to their usage now, rather than at the end of the month or after some other arbitrary period, the telco is seen to be responsive to the customer’s needs. The customer receives an immediate benefit, and so customer loyalty is enhanced. This understanding of customers’ usage patterns is invaluable for a telco’s marketing department and makes it far more customer focussed.
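A minimal sketch of that international SMS example, assuming a threshold of 50 messages within a rolling 30-day window triggers a single offer. It is illustrative only; a real deployment would express this as an event pattern and hand the offer off to a campaign or CRM system:

```python
from collections import defaultdict, deque

# Minimal sketch of the international-SMS example: count each customer's
# international messages over a rolling 30-day window and, once a threshold is
# crossed, trigger a tariff offer. The threshold and window are illustrative
# assumptions about how such a rule might be configured.

WINDOW_DAYS = 30
THRESHOLD = 50
sms_log = defaultdict(deque)   # customer_id -> deque of day numbers
offered = set()                # customers already made an offer

def on_international_sms(customer_id, day):
    log = sms_log[customer_id]
    log.append(day)
    while log and day - log[0] > WINDOW_DAYS:
        log.popleft()
    if len(log) >= THRESHOLD and customer_id not in offered:
        offered.add(customer_id)
        print(f"Offer international SMS bundle to customer {customer_id}")

# Example: a customer sending several international messages a day.
for day in range(30):
    for _ in range(3):
        on_international_sms("cust-7", day)
```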

3 provides an excellent example of how a telco can deploy CEP to gain insights it has never previously had. Not only are operations at 3 being optimised, but new models of real-time interaction with customers are in sight. CEP promises to be an invaluable aid in the telcos’ fight to deliver higher-value, personalised services and to be better placed to retain customers.

Our press release on 3 can be found here.

Monday, March 23, 2009

We're going on Twitter

Posted by Giles Nelson

Louis Lovas and I, Giles Nelson, have started using Twitter to comment on and respond to exciting things happening in the world of CEP (and occasionally perhaps beyond!).

The intent is to complement this blog. We'll be using Twitter to, perhaps, more impulsively report our thinking. We see Twitter as another good way to communicate thoughts and ideas.

We would be delighted if you chose to follow our "twitterings" (to use the lingo), and we'll be happy to follow you too.

Click here to follow Louis and here to follow Giles (you'll need to sign up for a Twitter account).

Friday, March 13, 2009

An(other) industry award for Apama

Posted by Giles Nelson

I’m very pleased to announce that Apama has won the 2009 Technical Analyst award for “Best Automated Trading product”. The awards ceremony took place last night at the Sheraton Park Lane Hotel in London.

The judges considered not only the product itself and the way it was being used in electronic trading, but also the customers who were using it, so this is a really fantastic validation of Apama.

Here are some of the key attributes of Apama that were considered by the judges:

  • Apama is (of course) a CEP platform – one thing it provides is flexibility. Once deployed to run an application such as FX aggregation, Apama can be applied equally well to algorithmically trade equity futures or any other type of asset.
  • Connectivity to a vast range of general software and trading specific systems.
  • Availability of Solution Accelerators for FX aggregation, order routing, algorithmic trading and others providing 90% of an end-user application.
  • Backtesting of trading strategies and ongoing strategy evolution.

For the record, the other finalists were Alphacet, Berkeley Futures (IQ Trader), Patsystems (Pro-Mark), QuantHouse (QuantFACTORY) and TradeStation.

Now, from a CEP technology perspective, this is the significant thing for me. Apama was up against, in quite a focussed industry category, a number of other vendors who do nothing except build their products for use in capital markets. And yet Apama won, demonstrating that a general-purpose CEP platform is a first-class choice for organisations that want to flexibly deploy a whole range of event-driven applications. That’s great news for all CEP aficionados.

Now, where’s the rest of that bottle of bubbly?