Monday, June 16, 2008

SIFMA Retrospective

Posted by John Bates

As a follow-up to my colleague Louis' report on last week's SIFMA show, I thought I'd add some thoughts of my own. My conclusion is that it was the most exciting SIFMA show I have experienced. While I think attendance was down from previous years, I also think the quality of attendees was up. And for me personally, the excitement of being involved in some industry-moving announcements, as well as meeting up with many of my colleagues from capital markets firms, vendors, press and analysts, was highly invigorating.

So what were the highlights and take-aways for me?

1. CEP is clearly a theme that is getting a lot of mindshare. So many people said that CEP was a key theme of the show – which is great to hear after many years of working to help define the market. It’s also great to see this add to the momentum so soon after the Event Processing Technical Society was launched. The use cases of CEP are many and varied – and there was a lot of interest and questions around this at SIFMA. On our SIFMA booth we demonstrated five different CEP use cases on five different pods: algorithmic trading, smart order routing, managing FX market fragmentation, market surveillance and real-time bond pricing. CEP applications also continue to place new demands on the technology, and we were thrilled to demonstrate Apama 4.0 – which takes the performance and user experience of CEP to new levels. Another sign of the maturing of CEP is that very senior people in capital markets firms are starting to focus on CEP as an enabling technology. Marc Adler from Citigroup is a key example. He’s active in the community and on the STAC CEP committee, helping to define benchmarks. It was great to meet Marc at SIFMA and also to catch up with many other esteemed colleagues from the CEP space.

2. The liquidity wars are hotting up. It was our pleasure to be involved in a press release with NYSE Euronext, which was certainly one of the big releases of the conference. Progress Apama will be hosted within NYSE Euronext as part of the exchange's Advanced Trading Solutions offering. Traders will be able to deploy custom logic for algorithmic trading, risk management and smart order routing into the exchange itself – with low-latency connectivity to other trading venues via Wombat and TransactTools. This arrangement turns NYSE into a technology provider as well as a one-stop-shop liquidity provider. The announcement was picked up by major press, including the Financial Times – in Europe, America and Asia – see the article here.

3. Hardware is important – and so is “green”. The increase in capital markets data volumes requires completely new software architectures – like CEP. But software alone is not always enough to meet the low-latency transport, processing and storage requirements. Many firms are turning to specialized hardware, combined with software, to create high-performance solutions. Vhayu, for example, launched Squeezer – which combines hardware and software to supercharge their tick data offering. Progress Apama were also pleased to put out a joint announcement with Sun on a collaboration for end-to-end CEP solutions – combining Sun hardware and operating systems with Apama’s CEP platform and solutions. We demonstrated an end-to-end bond pricing application using the whole stack. Sun is one of the vendors with a “green” aspect to their hardware – for example, on a major CEP deployment the hardware can be scoped for peak throughput, but can selectively shut down capacity to save power when event throughput is reduced. In this era of high energy costs and global warming there seems to be a lot of interest in this approach.

4. I love partying on the trading floor. Progress Apama were honored to be invited to a party at the NYSE to celebrate the latest developments at NYSE-Euronext (see picture at the top). It was a great pleasure to speak with our friends at NYSE-Euronext and to meet many of our old friends from the capital markets industry there – while sipping some delicious wine in that amazing place. In a way it is a shame that electronic trading is making the traditional trading floor a thing of the past – but there is something amazing about that place and I hope it stays just the way it is – even if it becomes a cool venue for other purposes. Thanks to NYSE-Euronext for inviting us – we had a great time.

I’m sure I’ve neglected a load of other trends and themes – but there’s my brain dump for the day. I’m interested to hear if you all agree.


Friday, February 29, 2008

Taking CEP to The Next Step

Posted by Chris Martins

Standards are an area of growing interest in the complex event processing market. What will be the role of standards? In which aspects of CEP will they emerge? When?

With the premise that “in the future, it will be necessary for heterogeneous CEP products to interoperate across organizational boundaries”, the Object Management Group is hosting a Technical Meeting in Washington DC to address some of the issues. As part of the agenda, Apama colleague and blog contributor Matt Rothera will be speaking on How CEP Takes the Next Step. So if you are going to be in the DC area on March 13 and are interested in this topic, you should definitely check it out.

Wednesday, July 04, 2007

Why use SQL?

Posted by Giles Nelson

SQL is certainly one of the successes of the computing industry. It all started with Codd's much-cited and seminal 1970 paper, which first described the relational model. Over the next few years, after efforts by both IBM and Relational Software (which later became Oracle Corp), SQL was launched into the commercial domain in the late 1970s. Standardisation followed in the mid-1980s, with support for more modern trends, such as XML, added more recently. Database management systems have now matured into highly sophisticated environments for the storage, manipulation and retrieval of enterprise information. SQL is the standard language of the database world. Attempts to move this on and break with the Codd paradigm, such as the move towards object-oriented databases in the 80s and 90s, have, apart from in niche areas, failed.

We now see a trend by a number of event processing vendors to present SQL as the language of choice for building CEP applications – for example, Streambase, Coral8 and now BEA. Why is this? Well, Streambase is simple to explain. Michael Stonebraker is one of the key forces behind Streambase, and his background in the database industry is second to none. He was involved in Ingres in the 1970s and was also behind some of the work integrating object-oriented principles with relational databases with Illustra in the 1990s. Databases, and therefore SQL, are part of Streambase’s DNA. In comparison, BEA’s use of SQL is harder to understand. BEA’s business is built (still) upon their application server technology and they are strongly going to market with an enterprise integration offering – AquaLogic. Databases haven’t really formed part of BEA’s background; the use of XQuery would have been more obvious.

Perhaps these vendors have rightly concluded that SQL is actually the right way of doing things in an event processing world. I don’t believe it is. It’s certainly a way, but it’s not the best. I believe it can confuse people as to what event processing is all about and can serve to inhibit adoption. SQL is certainly well understood, but when vendors provide an SQL interface to event processing products, practitioners assume that an SQL way of thinking will be appropriate. It isn’t. By thinking of event processing as a real-time database, you get stuck in a database-centric design pattern.

When John Bates and I were doing some of the academic research which resulted in the formation of Apama in 1999, we were looking at how to effectively support applications powered by high-volume streams of data – stock tick analysis, location-aware telco applications and others. We looked at using database technologies to support this, but had to rip the work up. Not only did these technologies not perform, but we also realised we were force-fitting one paradigm into another. Taking a clean sheet of paper, we came up with the beginnings of a much more elegant, performant architecture. It was data-driven, not query-driven. The use of SQL as a language interface to this just didn’t seem appropriate.
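To make the data-driven versus query-driven distinction concrete, here is a minimal sketch in Python. It is purely illustrative – it is not Apama's EPL, and the names and threshold rule are invented for the example – but it shows the difference in shape: the query-driven style repeatedly re-scans stored data, while the data-driven style registers interest once and evaluates each event as it arrives.

```python
# Query-driven (database-centric) style: data is stored first,
# then a query polls it. Latency is bounded by the poll interval,
# and every poll re-scans the accumulated data.
ticks = []  # stands in for a table other code inserts into

def poll_for_breakout(threshold):
    return [t for t in ticks if t["price"] > threshold]

# Data-driven (event-centric) style: register a rule once; each
# arriving event is pushed through it immediately, no re-scanning.
listeners = []

def on_tick(symbol, price):
    for predicate, action in listeners:
        if predicate(symbol, price):
            action(symbol, price)

alerts = []
listeners.append((
    lambda sym, price: sym == "ACME" and price > 100.0,
    lambda sym, price: alerts.append((sym, price)),
))

on_tick("ACME", 99.5)   # below threshold: nothing fires
on_tick("ACME", 101.2)  # rule fires as the event arrives
```

The second style is the one the paragraph above describes: the event drives the computation, rather than a query driving a scan over stored events.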

It seems that others agree. I was at a conference at CITT in Germany recently where CEP formed a major topic of discussion. In particular, we talked about some of the challenges of using SQL to build event processing applications, underpinned by practical implementations of use cases using a variety of event processing technologies. What became apparent was that the SQL approaches appeared to hinder, not help, the developer. The baggage that SQL brought made it difficult for people to get their heads around the thinking required to implement event processing use cases. The resulting SQL was clunky and difficult to follow.

So, am I going to conclude by saying that SQL should be shunned? Well no, I’m not. As a vendor, Progress Software is all too well aware that its products exist only to help organisations solve their business problems. Partly this means allowing problems to be solved that could not be solved previously; partly it means enabling productive development. Giving technologists a choice of development environment with which many are familiar is important, and SQL can provide some of this familiarity. We are therefore observing and listening closely to the opinion of the wider market and to our prospects and customers. SQL may be one of the ways in which organisations should be able to interact with an event processing system.

However, I maintain that it is certainly not the best choice, nor should it be the only one.

Sunday, April 01, 2007

Sentient CEP and the Rights of Algorithms

Posted by John Bates

I have just returned from the UK where, as part of my duties for the week, I spoke at a conference on Fixed Income and another on Foreign Exchange. At both these events, the delegates were interested to hear of the latest trends in the industry – which include aggregating multiple market data streams from disparate trading venues into a single view, and using rules-based trading techniques to rapidly detect complex patterns and to make and execute trading decisions. In these areas and beyond, Complex Event Processing is being successfully used by investment banks and hedge funds to meet such complex and low-latency requirements.

While I was in the UK, one of the items in the news was Britain's marking of the 200th anniversary of the abolition of the slave trade, with a service in Westminster Abbey attended by the Queen. I hope none of my comments seem in any way to belittle this significant occasion, but it did rekindle some thoughts about “will algorithms ever have rights?”. We look back on the past and find it inconceivable that any class of people could be considered less human than others. Society has adopted that principle, quite rightly, as part of our programming. However, we’re quite happy to turn a blind eye to the suffering of animals in unnecessary cosmetic testing and in horrific factory farm conditions. In the same way that most people in the 18th Century were quite happy to receive cheap cotton and sugar, we are now quite happy to receive cheap cosmetics and food. History suggests, however, that this will eventually change.

So what of the rights of the algorithm? Now you may think this is totally mad – and you’d probably be right. However, consider for a moment the field of algorithmic trading. While speaking at both of the conferences this week, I illustrated the changing role of the trader. Rather than the trader watching the markets for key trading opportunities and then manually executing, he/she can now deploy an army of algorithms to look for and respond to the key opportunities. The algorithms are the trader’s slaves. As Lou Morgan of HG Trading, an Apama customer, put it recently: “…. They don’t need a lunch-break and they don’t need a bonus…!”. Of course these algorithms are not sentient, and therefore they don’t have any rights – but what if they were!?

Together with my colleague and mentor at Cambridge, Professor Andy Hopper, I ran a research group that looked into what we termed “Sentient Computing”. This is a very grandiose title for what Gartner might describe in commercial terms as the “Enterprise Nervous System”. It was all about how complex asynchronous stimuli in a wide-area computing environment could be detected and used to automate responses. There was lots of fun hardware in this environment, like “Active Badges” – which could detect your location to within 2 centimeters in 3 dimensions, “Active Surfaces” – which provided a high-bandwidth network to your wearable computers when in contact with the surface, and a variety of other ubiquitously deployed sensors, actuators, compute terminals and devices.

But the glue that made Sentient Computing possible was Complex Event Processing and Event-Driven Architectures – as they are called today. Sensors generated events that needed to be analyzed; actuators could be triggered, but needed something to do the triggering. Event-based rules provided the glue to enable complex circumstances to be modeled. For example: “When John Trigg and Chris Martins are within 10 feet of each other and there is a video terminal in their view, then play message ‘good morning’ on the nearest video terminal”. Some people described this kind of environment as an “Exploded Robot” – because rather than a single object having various sensors and actuators attached to it, the network cloud is the medium to which they are attached – and CEP and EDA are the mechanism through which the “neuron firing” is wired together. Nowadays, we are familiar with how CEP and EDA are enabling all sorts of “exploded robot” applications – such as logistics companies that monitor the wide-area movements of their fleets of trucks and optimize based on the perishability of their goods, weather, traffic conditions and so on.
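The proximity rule above can be sketched in a few lines of Python. This is an illustrative toy, not the actual research system: the function names, coordinates and terminal names are all invented for the example. The point is the shape of the wiring – each badge sensor event updates state and immediately re-evaluates the rule, which in turn drives the nearest actuator.

```python
import math

positions = {}   # person -> (x, y), updated by badge sensor events
terminals = {"lobby-screen": (0.0, 0.0), "lab-screen": (50.0, 50.0)}
played = []      # (terminal, message) pairs "played" by the actuator

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def on_badge_event(person, x, y):
    """Each incoming sensor event updates state, then the rule is re-run."""
    positions[person] = (x, y)
    rule_proximity_greeting("John Trigg", "Chris Martins")

def rule_proximity_greeting(p1, p2, radius_ft=10.0):
    """If both people are tracked and within radius, greet on the
    terminal nearest to their midpoint (the 'nearest actuator')."""
    if p1 in positions and p2 in positions:
        if distance(positions[p1], positions[p2]) <= radius_ft:
            mid = tuple((a + b) / 2 for a, b in zip(positions[p1], positions[p2]))
            nearest = min(terminals, key=lambda t: distance(terminals[t], mid))
            played.append((nearest, "good morning"))

on_badge_event("John Trigg", 1.0, 1.0)
on_badge_event("Chris Martins", 4.0, 5.0)  # within 10 ft: the rule fires
```

In the "exploded robot" framing, `on_badge_event` is the nerve impulse arriving, the rule is the brain, and `played` stands in for the actuator output.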

Although we are at an early stage with Sentient Computing, the technology approaches of CEP and EDA definitely provide a great framework. An event is a nerve impulse from the body. CEP provides the brain – which can be changed with new rules (new thinking?) at any time, rather than being hardwired. The next stage, of course, is rules that are written and refined by the system itself. But the exciting thing is that there doesn’t have to be one brain; the powerful thing about events is that they can flow on to another brain, where they can be processed and analyzed in a different way (a different perspective on the information; a different opinion?). And then events resulting from decisions can be routed to actuators/services/humans to ultimately cause things to happen – the resulting nerve impulse. All the while, EDA provides a framework to bring these various nerve inputs, brains and nerve outputs together. Clearly we’re not there yet with EDA – and CEP is being used mainly in individual, unconnected systems – but it is coming to offer an exciting “event cloud”, enabling various CEP systems to interoperate.

So should algorithms have rights? Well of course not – and it is April 1st as I’m writing this. But I do ask you to think about what you would have been like if you’d been alive in the 18th Century and enjoyed cheap tobacco, sugar and cotton. Would you have “thought differently”? I remember an episode of “Star Trek: The Next Generation” in which a scientist wanted to take Mr Data apart to see how he worked. Captain Picard successfully proved to Starfleet that Mr Data was sentient, and thus an individual rather than property. I don’t think you necessarily have to be sentient to have rights – there’s no reason to pull the wings off a butterfly just because it isn’t familiar with Descartes. I used to comment to one of my friends – a Maths Professor – that his computer programs were so badly written that they were like strange mutated creatures that probably suffered. Of course this is very silly!

Anyway, I leave you with the thoughts that perhaps Event-driven Architectures offer us the ability to have a truly global “information nervous system”. We are asynchronous; the world is asynchronous – so computing paradigms should support this. Of course realistically this is just a set of inputs, rules and outputs. However, as we add more intelligence to the “brains” then we must be careful this system doesn’t become too autonomous. After all – you know what happened with "Skynet" in Terminator!

Thursday, March 08, 2007

On CEP and Standardization

Posted by Progress Apama

Last week Anita Howard wrote an article about the adoption of event processing in capital markets.  The first part was great - she reports that event processing is being broadly adopted in the capital markets (and other markets as well).  She also suggests that SQL is one approach to addressing the complex event processing (CEP) technology issue.  But then the article implied, via quotes from StreamBase, that SQL is the "only candidate" for a standard event processing language and that SQL is "widely used," and there was follow-up on the blog from StreamBase marketing people claiming "broad support" for SQL as a standard for event processing.  Although it's accurate to say that an SQL-based approach is a viable, and in some ways effective, approach to CEP, there are many other language-based approaches that are, arguably, more effective and, arguably, more "widely used." I'm waiting for the next claim that SQL has been adopted for CEP on Jupiter, indicating that SQL is, indeed, an intergalactic standard.

But the real issue I have is that StreamBase continues to polarize the industry and provoke CEP-by-SQL religious language wars.  Naturally software vendors will emphasize that their approach is better than the next guy's, but tactics that divide early stage technology markets are not healthy and should be stopped immediately.

Research in CEP began simultaneously in the mid-1990s at Stanford University (David Luckham), Cal Tech (Mani Chandy), and Cambridge University (John Bates).  This work, and subsequent commercial work, has resulted in a range of language approaches to CEP, including:

  • Graphical point-and-click environments
  • CEP rules
  • CEP languages (e.g., David's RAPIDE, and Apama's EPL)
  • Java
  • SQL-like syntax

Vendors, of course, are a proxy for customer demand and preference, and so far, there is no "intergalactic CEP standard." Progress Apama supports graphical tools, CEP rules, a CEP language, and Java; I believe Aptsoft has a visual environment; I believe TIBCO BusinessEvents is Java-based, and has graphical tools; and I believe Kaskad has a language-based approach.  In other words, these vendors, and by proxy their customers, demonstrate that there is strong support for visual, Java, rules, and language-based approaches to CEP.

At Apama, we will continue to avoid polarizing discussions about standards, and will participate in the nascent standards discussions that are inclusive of all approaches.  We'll also continue to advocate for other important issues like customer use cases, the business value of those use cases, architectural discussion about the relationship between EDA and SOA, continued work on the event processing glossary, and more.

Constructive discourse that unites, not divides, will help the tide rise for event processing, and lift all boats along with it - customers, vendors, and more.