
September 2008

Friday, September 26, 2008

The Financial Meltdown and the impact on CEP

Posted by Louis Lovas

Watching, waiting, wondering are the catch phrases of late. What will be the eventual conclusion of the meltdown and subsequent bailout of our financial system? Like many of you, I've been steadfast in my search for information, watching the news and reading blogs and op-eds about how we got into this mess. I anxiously await our government's response. As a homeowner and family man with a retirement plan, I am keen to understand this predicament. I also have a vested interest in a professional sense: the event processing technology in which we immerse ourselves is well entrenched in this same financial community. Below I share a few thoughts and perspectives on our troubled financial times and how they might impact event processing technology.

Short-selling-induced volatility

The idea behind short-sell trading is to sell an asset (e.g. shares of stock) that you, the seller, don't actually own but have borrowed. Then, with luck, the price drops; you buy the shares back at the new lower price, return the shares you borrowed and pocket the difference. Short-selling strategies are commonplace in the equities, currencies and futures markets and have been in use for a number of years. All of which came to a complete halt when the FSA and the SEC banned short selling in an effort to stave off market instability. The FSA's clampdown targeted only a small number of financial instruments, whereas the SEC's action was more punitive, setting restrictions on 800 companies. Once those regulatory bodies announced the crackdown, the investment banks, hedge funds and asset managers reluctantly scaled back their short-selling strategies to comply. Short selling is a common use case for CEP. These recent restrictions are a clear indication of the need for algo strategies to be dynamically adaptable to market or regulatory conditions, whether in short selling or other types of strategies. It's imperative that development tools provide the means to change strategies swiftly.
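
To make the mechanics concrete, here is a minimal sketch of the arithmetic behind a short sale (plain Python, with invented numbers purely for illustration):

    def short_sale_pnl(borrowed_shares, sell_price, buyback_price, borrow_fee=0.0):
        """Profit (or loss) on a short sale: sell borrowed shares, buy them
        back later to return them, and pocket the difference minus any
        borrowing cost."""
        proceeds = borrowed_shares * sell_price          # cash received up front
        cost_to_cover = borrowed_shares * buyback_price  # cash paid to buy the shares back
        return proceeds - cost_to_cover - borrow_fee

    # Borrow and sell 1,000 shares at $50, buy them back at $42: a profit.
    print(short_sale_pnl(1000, 50.0, 42.0))   # 8000.0
    # If the price rises instead, the short seller wears the loss.
    print(short_sale_pnl(1000, 50.0, 57.0))   # -7000.0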

All told, the shorting restrictions were applied to those instruments that had a massive impact on the market's stability. Just determining that market impact - the instability or general volatility - was an investigative research project where CEP could have played a significant role. Volatility is a statistical measure of the scale of fluctuations in a price or index. By looking at historical norms, the FSA concluded 29 stocks exceeded those norms, while the SEC determined it was 800. While I don't know whether either regulatory body used a CEP product, they clearly could have. Used in conjunction with a tick database for the historical market data, a CEP product like Apama provides the tool set and language to rapidly construct a market impact analysis solution with the ability to carry out a multi-year analysis.
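
As a rough, hypothetical illustration of that kind of analysis (not the FSA's or SEC's actual methodology), a volatility measure can be computed from historical closing prices and compared against a long-run norm:

    import math

    def realized_volatility(prices):
        """Annualised volatility from a series of daily closing prices,
        i.e. the standard deviation of daily log returns."""
        returns = [math.log(b / a) for a, b in zip(prices, prices[1:])]
        mean = sum(returns) / len(returns)
        variance = sum((r - mean) ** 2 for r in returns) / (len(returns) - 1)
        return math.sqrt(variance) * math.sqrt(252)   # ~252 trading days a year

    def exceeds_historical_norm(recent_prices, historical_prices, multiple=2.0):
        """Flag an instrument whose recent volatility is well above its long-run
        norm. The 2x multiple is an arbitrary threshold for illustration."""
        return realized_volatility(recent_prices) > multiple * realized_volatility(historical_prices)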

Leverage-induced risk

Investment banks rely heavily on borrowed money, so much so that the typical "leverage ratio" is 30 to 1. That means for every tangible dollar held, 30 are borrowed. While leverage can and does create the opportunity for huge windfall profits, it can also incur massive risk and huge losses. As long as markets are reasonably stable, investment strategies behave predictably, permitting banks to make money on borrowed money and to keep on leveraging. However, when instability invades, it erodes many aspects of those strategies. Losses begin to mount - on the investment itself, on the borrowed amount and on the interest payments for the borrowed sum.
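
To see why 30-to-1 leverage magnifies losses so dramatically, consider a simple worked example (illustrative numbers only):

    def leveraged_return(equity, leverage_ratio, asset_return):
        """Return on the firm's own capital when it controls
        equity * leverage_ratio in assets and the assets move by asset_return."""
        position = equity * leverage_ratio
        pnl = position * asset_return
        return pnl / equity

    # With $1 of equity supporting $30 of assets, a fall of a little over 3%
    # in asset values is enough to wipe out the equity entirely.
    print(leveraged_return(1.0, 30, -0.033))   # about -0.99, i.e. a 99% loss of capital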

So how can CEP play a more active role in this highly leveraged world? With risk mitigation. CEP is emerging as a key constituent in real-time risk management systems. One example is in deal-monitoring / position-keeping systems. Auto-hedging is also a technique employed by risk systems to keep positions within the bounds of tolerable risk limits; one form of hedging is short selling (see above). Risk management systems define allowable tolerances for trading, positions and P&L. With a 30 to 1 leverage ratio those tolerance limits seem terribly high, so when the market takes a turn for the worse, losses become astronomical. This is not the fault of the risk systems themselves but of the allowable limits established by the banks' business side and then coded into those systems - limits that permit highly leveraged positions. One can only imagine that investment banks will be adjusting their risk limits downward a few notches. It will also spark the opportunity for them to invest in newer, more flexible real-time risk solutions. In volatile, ever-changing market conditions, risk systems need to be dynamically adaptable, and CEP is an ideal technology to fill that need.
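
At its simplest, the kind of real-time check described above boils down to tracking a running position against a configured tolerance and reacting the instant a limit is breached. A minimal, vendor-neutral sketch (plain Python, not Apama MonitorScript; the limit and the hedge action are invented):

    class ExposureMonitor:
        """Tracks a running position as fills arrive and fires an action
        (an alert, or an auto-hedge order) the moment a tolerance is breached."""
        def __init__(self, position_limit, on_breach):
            self.position = 0.0
            self.position_limit = position_limit
            self.on_breach = on_breach

        def on_fill(self, quantity, side):
            """side is +1 for a buy, -1 for a sell."""
            self.position += side * quantity
            if abs(self.position) > self.position_limit:
                self.on_breach(self.position)

    def auto_hedge(position):
        # A real system would submit an offsetting order here.
        print(f"Tolerance breached at position {position}: submitting hedge")

    monitor = ExposureMonitor(position_limit=10_000, on_breach=auto_hedge)
    monitor.on_fill(8_000, +1)
    monitor.on_fill(5_000, +1)   # pushes the position to 13,000 and triggers the hedge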

Reclassified Investment Banks

The last two major investment banks, Goldman Sachs and Morgan Stanley, have recently changed status from investment bank to commercial bank. In doing so they can now take deposits in the form of checking and savings accounts as a source of funds to shore up losses created by their highly leveraged positions. But commercial bank status means they face a litany of Federal regulation. For one, the 30 to 1 leverage ratio will be prohibited. The days of windfall profits in good times (and catastrophic losses in bad times) are over. The Fed also provides a safety net in the form of an emergency loan program for commercial banks. Goldman and Morgan now qualify to be protected by this net, preventing them from suffering the same fate as Lehman. I'm sure they had this in mind when they requested the change of status. I wonder if Goldman and Morgan will be buying new nameplates for their front entrances to signify their new commercial designation?

"In every crisis, opportunity" could be an appropriate catch phrase for the eternal optimist. While this meltdown certainly has the appearance of doom and gloom, in the end banks are in the business of making money and they will find a way. Event processing technology will be at the forefront of that endeavor. Regulations will simply draw a box within which they operate, and that will stir the creative spirit to achieve within these new boundaries. New forms of algo strategies, real-time risk monitoring and surveillance will be required as a result of the imposed bans and regulations. CEP technology - and the Apama platform - will be at the forefront of this new (banking) world order.

Monday, September 22, 2008

Reflections on the Gartner Conference and EPTS4

Posted by Louis Lovas


Like many of my colleagues in the event processing community, I thought I would share a few reflections on the recent happenings at the two back-to-back technology conferences of the past week. Gartner sponsored its annual vendor-fest known as the Event Processing Summit, and the EPTS held its fourth annual symposium. This being my first EPTS, I shared some initial thoughts and reactions over the weekend; here I'll delve more into the conferences' content.

I attended a number of the sessions at the Gartner conference. I did not have any set agenda, so I picked the sessions more on personal appeal than on some well-thought-out plan. While I do work in an engineering team, I have a customer focus, so I attended all the customer sessions. I always find it valuable to understand how customers are deploying event processing technology in real-world use cases. Their efforts clearly find their way into vendors' product roadmaps.

     
  • Lou Morgan of HG Trading, a lively speaker, described his use of event processing technology in high-frequency trading. Lou has been an Apama user for quite a few years and we've invited him to speak on our behalf on a number of occasions. He's an entertaining soul with a clear understanding of the Capital Markets business. We're delighted he presented his use of Apama at this conference.
     
  • Albert Doolittle of George Weiss Associates Inc. gave a talk on using event processing technologies at his firm. Albert described his technique for picking a vendor for his CEP project, which, if I were to paraphrase, was a coin flip. Towards the end of his talk, he digressed from CEP technologies to present a short discourse on high performance computing (HPC). The idea of leveraging supercomputing-like technologies and FPGAs for compute-intensive operations like Black-Scholes options pricing has certainly caught Mr. Doolittle's attention. Typically CEP and compute-intensive tasks don't mix well because of latency considerations. However, a marriage of CEP and HPC is possibly one made in heaven. I was intrigued.
     
  • The ebullient Marc Adler gave his brusque, no-holds-barred perspective on the CEP project he embarked on at Citi. Marc did a great job of explaining the challenges of introducing a new technology at a large corporation, one with a well-entrenched, bureaucratic IT organization. I think most of us have faced the bureaucratic fortress at some time or another in our careers. Knowing how to play the game is a skill only a few master well; kudos to Marc for his successful venture. As Marc unfolded his project's architecture, it was clear he wisely chose a course to prevent vendor lock-in.

The juxtaposition of these three use cases was most curious. Lou Morgan jumped deep into CEP technology and bet the ranch on it. Albert Doolittle took a gamble with a coin flip in choosing a vendor, and Marc Adler kept his choice of a CEP product isolated and contained within his overall system architecture, as a safeguard in case he felt the need to replace it. Nonetheless, all three are great examples of how CEP is gaining momentum in mainstream business.

One session I thoroughly enjoyed was Don DeLoach's "Extending the range of CEP". Don is the CEO of Aleri. I'm not sure whether I enjoyed this session more for its content or for Don's presentation skills. As is usually the case at technology conferences, it's death by PowerPoint: slideware jammed with an overabundance of barely readable text and dazzling graphics. Don's slides, however, had a clear minimalist slant - a plain monotone background with either a single word or a (very) short phrase, well choreographed with his oration. He spoke of CEP as an evolving technology, from the simple ability to filter streaming data to managing complex application state. He used an example that has become the pièce de résistance of Aleri: order book consolidation.

There were many sessions on SOA and Event Driven Architectures - so many I lost count. 

I attended the panel discussion on low-latency messaging protocols. This was a Q&A session moderated by Roy Schulte of Gartner. The panelists were the crop of high-speed/low-latency messaging vendors - TIBCO-killers, as I've affectionately referred to them - such as 29West, RTI, Solace Systems, IBM and even TIBCO themselves (apologies to those vendors I've not mentioned). Each described how they have defied physics to achieve incredible speeds yet still provide reliable delivery, management tools and even application-level services (e.g. RTI's last-value cache). However, it's noteworthy to contrast these low-latency vendors, all focused on shaving microseconds off message delivery via proprietary, even hardware-based schemes, with the many standards-based messaging systems trumpeted in other sessions. Those SOA and EDA sessions paraded a whole barrage of Web Services standards (WSDL, WS-Eventing, WS-Notification, WSDM - the list goes on and on) as the right way to build applications. These certainly seem like opposing forces that will only foster confusion in the eyes of those who have a clear business need for low latency yet desire to adhere to a standards approach.

The EPTS Symposium began its first day with a keynote address from a VC who had funded Event Zero. I first met with Event Zero about a year ago; they appear to have recast themselves from an adapter/connectivity vendor to one delivering an Event Processing Network (EPN). An EPN can be defined as an infrastructure platform for event processing agents or services: CEP agents, acting both independently and in concert with other agents (or services), operate on streaming data sources, and together the whole becomes greater than the sum of the parts. Such is the grandiose vision of an EPN. SRI was also promoting a similar notion of event processing as a service, which I would argue is a variation on the same theme. Unfortunately, I think there is trouble ahead. The problem is simply timing, maturity and standards (or the lack thereof). I don't think customers will buy into EPNs or Event Zero's vision until there is a clear establishment of standards for CEP. For perspective, Application Server vendors tried this and failed (anyone remember SilverStream? Apptivity?). It was not until the J2EE specification established a uniform model that a network or service infrastructure platform for AppServers became truly viable. Until we see the formation of CEP standards for interoperability and integration, CEP will remain essentially a standalone application platform and vendors will continue to market a solutions approach - just look at any CEP vendor's website for proof. Nonetheless, Event Zero has embarked on a bold initiative and I wish them all the best.

Speaking of standards, moving slightly up the stack, one could clearly detect the prevailing wind blowing against streaming SQL as the language of choice for CEP. Going back to the Gartner conference, there were a few noticeable comments to that effect. Marc Adler described streaming SQL as making simple things difficult to do. Don DeLoach downplayed the SQL language in Aleri in favor of the SPLASH enhancements. And the renowned Dr. Luckham, in his closing keynote address, outlined Holistic Event Processing as the future and implied it requires a language beyond streaming SQL.

At the EPTS, Alex Koslenkov from Betfair castigated the streaming SQL approach for his use case of managing complex, long-running state. Alex is an advocate of the RuleML approach to CEP languages, so it stands to reason that he doesn't have a high regard for streaming SQL - and it showed.

Susan Urban from Texas Tech University presented a research project on a language they've dubbed StreamCEDL. Susan denounced streaming SQL as lacking the algebraic expressiveness necessary to move beyond simple stream processing to true complex event processing. One example she mentioned in describing StreamCEDL is its support for an APERIODIC operator, whose intent is to process irregular or out-of-order data streams.
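
I don't have StreamCEDL syntax to hand, but the underlying problem an APERIODIC-style operator addresses - events arriving irregularly and out of order - can be sketched with a simple reordering buffer (plain Python, purely illustrative):

    import heapq

    class ReorderBuffer:
        """Buffers events by timestamp and releases them in order once a small
        lateness allowance has passed - one simple way of coping with
        irregular, out-of-order streams."""
        def __init__(self, allowed_lateness):
            self.allowed_lateness = allowed_lateness
            self.heap = []                    # min-heap of (timestamp, payload)
            self.latest_seen = float("-inf")

        def add(self, timestamp, payload):
            heapq.heappush(self.heap, (timestamp, payload))
            self.latest_seen = max(self.latest_seen, timestamp)
            watermark = self.latest_seen - self.allowed_lateness
            released = []
            while self.heap and self.heap[0][0] <= watermark:
                released.append(heapq.heappop(self.heap))
            return released   # events now safe to process in timestamp order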

Lastly, Chris Ferris from IBM presented on industry software standards. This was a great session that portrayed the far-reaching impact of adopting standards across our industry. He stressed the importance of making every attempt to get broad vendor agreement and customer validation, and of making sure the adopted technology serves the needs of the community, because you'll have to live with it for years to come. This is such an important message in the quest for standardization of CEP. Open, widely accepted standards are exactly what the CEP community needs; the sooner we embark on this journey the better.

Friday, September 19, 2008

A Truce at the CEP Front

Posted by Louis Lovas

I am a bit of a history buff, and oftentimes I'm reminded of some historical event when reading about current events. This inclination can easily be applied to the meltdown of the global financial markets we see happening all around us. The lessons of the past should be constant reminders of how we should behave now and in the future. I've always thought a degree in history should be a prerequisite for a political life; being armed with such knowledge would surely help one govern wisely. Maybe our business leaders should follow a similar career path.

I've just attended my first EPTS Symposium. It was the Event Processing Technical Society's fourth annual get-together. If you're unfamiliar with this organization, its purpose is to promote event processing technologies through academic research and industry participation. The organization has a number of working groups that have contributed greatly to the overall awareness of event processing. You can read more about the EPTS at their website.


The symposium was well attended by members of both academia and industry. All the major CEP vendors were there, and it was the first time I've been in a setting where the atmosphere was completely non-competitive. It was a truce of sorts. While we typically wage war on the virtual battlefield in a land-grab for customers, for two days we discussed vision, standards and use cases. We debated ideas, but we also laughed, ate and drank together. There was a general camaraderie. As I mentioned, history is one of my interests, and these two days reminded me of the 1914 Christmas Truce, when the Germans and the Brits crawled out of their trenches and met in no man's land to celebrate Christmas together. The guns fell silent that night in 1914, and for the two days of the symposium the virtual guns of competition also fell silent.

Come Monday we'll all be back at the war again. But for a short while it was fun. To see the face of the enemy unmasked, to get to know him, to share an idea and a drink was genuinely uplifting. We found common ground in our desire to see event processing become a main stream technology.

Sibos 2008 - the event processing angle

Posted by Giles Nelson

I am writing this at the end of the Sibos financial services show in Vienna. Sibos is the biggest global banking event of the calendar, with pretty much everyone involved in the core banking area present, including commercial banks, central banks, regulators, vendors and consultancies. It couldn't, of course, have taken place at a more interesting time. The extraordinary events we have witnessed this week in financial markets permeated every panel session, presentation and informal discussion held.

Event processing is big in financial services but, so far, it has generally only penetrated the front office and the middle-office risk management functions for use cases related to trading. There are good reasons for this: in general terms, the front office uses technology to directly enable more money to be made. Core banking, on the other hand, is about doing what banks were set up to do - to take deposits, to give credit and to be an arbiter of financial transactions. The reasons for technology investment are quite different and are driven by increasing operational efficiency, lowering costs and improving customer service. It's more conservative, and as yet event processing has not penetrated this area to any significant extent.

There's no lack of use cases, I believe. Here are a couple of examples around the processing of payments. Earlier this year the UK launched its Faster Payments initiative. Finally in the UK (the Netherlands, for example, has had this for 10 years) you can now pay a counterparty who banks with another UK bank in real time, rather than waiting three days for the payment to clear (it's remarkable it's taken so long to fix such a, frankly, rubbish state of affairs, and indeed it took the regulator itself to force change). As an end user I am delighted with the results. I can now make an electronic payment using Web-based banking and it all happens immediately - the transaction occurs in the way one feels, in the modern Internet era, that it should. However, this does raise a problem: how does a bank do its anti-money-laundering checks, its comparison with counterparty blacklists and all the other fraud checks in the 12 seconds it has for the payment to go through? The answer is: currently, with enormous difficulty. Event processing is surely part of the answer.
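
To be clear about what "part of the answer" might look like: the individual checks are simple to express; the hard part is running them against every payment within a few seconds. A deliberately minimal sketch (plain Python; the blacklist and threshold are invented for illustration):

    SANCTIONED_COUNTERPARTIES = {"ACME OFFSHORE LTD", "SHELLCO 123"}   # illustrative only
    SINGLE_PAYMENT_THRESHOLD = 10_000.00                               # illustrative only

    def screen_payment(payment):
        """Run basic real-time checks on a payment event and return any alerts.
        payment is a dict with 'payer', 'payee' and 'amount' keys."""
        alerts = []
        if payment["payee"].upper() in SANCTIONED_COUNTERPARTIES:
            alerts.append("counterparty on blacklist")
        if payment["amount"] > SINGLE_PAYMENT_THRESHOLD:
            alerts.append("amount above reporting threshold")
        return alerts

    print(screen_payment({"payer": "J SMITH", "payee": "Shellco 123", "amount": 12_500.0}))
    # ['counterparty on blacklist', 'amount above reporting threshold']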

Here's another example. Currently the European payments industry is going through a lot of regulatory change to bring about lower prices and more competition for cross-border Euro payments (PSD and SEPA are the relevant acronyms if you're interested). This will force technology investment, because consolidation will mean a smaller number of banks having to process more payments at lower cost. Furthermore, competition will increase and, for example, a business in France will be able to use a bank in Germany to handle its payments. Now, I reckon that having insight into what is going on with my payment systems - being able to identify processing exceptions, to spot when my customer SLAs are being breached, and so on, in real time - will be a crucial part of ensuring a world-class operation. Payment systems will continue to use many different types of technology, from mainframes to modern SOA environments, so you need something to sit logically above all of this, extracting relevant real-time information and analysing and correlating it appropriately. There are offerings from conventional BAM vendors that address some of this now, but I think they won't be performant or flexible enough to deal with future needs. Some customer engagements support this view.
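
As one concrete illustration of that monitoring layer (a sketch only, with an invented SLA figure), correlating "payment submitted" and "payment completed" events makes SLA breaches visible as they happen:

    class PaymentSlaMonitor:
        """Correlates submission and completion events for each payment and
        flags any that exceed an agreed processing-time SLA."""
        def __init__(self, sla_seconds):
            self.sla_seconds = sla_seconds
            self.pending = {}   # payment_id -> submission timestamp

        def on_submitted(self, payment_id, timestamp):
            self.pending[payment_id] = timestamp

        def on_completed(self, payment_id, timestamp):
            started = self.pending.pop(payment_id, None)
            if started is not None and timestamp - started > self.sla_seconds:
                print(f"SLA breach: payment {payment_id} took {timestamp - started:.1f}s")

    monitor = PaymentSlaMonitor(sla_seconds=12)   # invented SLA
    monitor.on_submitted("PAY-001", 0.0)
    monitor.on_completed("PAY-001", 15.5)         # flagged as a breach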

All of this is really about risk management, and it seems inevitable that this is going to be a booming area of investment in the next few years. Knowing what is going on in your business transactions, as they occur, will become more and more important. In electronic trading, for example, it is becoming vital not only to regulators and trading venues (such as our customer Turquoise, who we announced went live with Apama this week) but also to brokers. They want to know what their customers, and they themselves, are up to.

I think Richard Oliver, Executive Vice President at the Federal Reserve, summed it up well. When asked about the future role of technology in core banking and payment systems, he responded that the "immediacy of information is going to be vital" and that it was going to be all about "getting value from the information flows". That sounds like a pretty good fit for event processing.

Saturday, September 13, 2008

The Need for Speed: Don't Strangle the Front Office!

Posted by Richard Bentley

I'm currently looking out of my hotel room window at preparations for the upcoming Singapore F1 Grand Prix - a road race under lights which - if the level of preparation is anything to go by - will be a truly awesome spectacle. I only wish I were staying here long enough to see it ...

But enough titillation for the petrolheads out there; there is a (CEP-related) point here. In my previous post I mentioned a presentation I gave at last week's Derivatives World Asia event entitled "Real-time Risk Management and Compliance: Don't Strangle the Front Office!". The sub-title gives away the thrust of the argument - the increasing need for real-time risk controls without adding unacceptable milliseconds of pre-trade checks to the orders sent by algo and high-frequency traders.

This is a very topical subject for CEP right now. Our friends at Coral8 and Aleri have both recently discussed this use case. Here at Apama, our upcoming and much-extended Algorithmic Trading Accelerator (watch this space!) includes a comprehensive risk "firewall" for real-time breach detection, alerting and prevention. By considering each trade instruction (order placement, amendment, cancellation etc.) as an event, the requirements of real-time risk and market abuse detection would seem ideal for a CEP solution:

  • processing multiple event sources concurrently - algo flow, treasury flow, DMA etc. - scalable to thousands of updates per second
  • an extensible risk rule base, evolved through scripting and graphical tools
  • real-time dashboards for monitoring, e.g. positions, position limits and their adjustment
  • filtering combined with stateful risk rules for tracking, e.g. open P&L, value at risk etc.

Not to mention the need for ultra-low latency decision making - don't strangle the front office!
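
To make the pre-trade check concrete - and to underline why every microsecond matters - here is a deliberately minimal sketch of the shape such a firewall takes (plain Python rather than a real CEP language; the limits are invented):

    class PreTradeRiskFirewall:
        """Validates each order event against simple limits before it is allowed
        through to the market. Real deployments carry many more rules; the point
        is that every check must add as little latency as possible."""
        def __init__(self, max_order_qty, max_notional, position_limits):
            self.max_order_qty = max_order_qty
            self.max_notional = max_notional
            self.position_limits = position_limits   # symbol -> max absolute position
            self.positions = {}                      # symbol -> current position

        def check(self, symbol, side, quantity, price):
            """Return (accepted, reason); side is +1 for a buy, -1 for a sell."""
            if quantity > self.max_order_qty:
                return False, "order size limit"
            if quantity * price > self.max_notional:
                return False, "notional limit"
            projected = self.positions.get(symbol, 0) + side * quantity
            if abs(projected) > self.position_limits.get(symbol, float("inf")):
                return False, "position limit"
            self.positions[symbol] = projected
            return True, "accepted"

    firewall = PreTradeRiskFirewall(max_order_qty=50_000, max_notional=5_000_000,
                                    position_limits={"XYZ": 100_000})
    print(firewall.check("XYZ", +1, 40_000, 25.0))   # (True, 'accepted')
    print(firewall.check("XYZ", +1, 80_000, 25.0))   # (False, 'order size limit')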

aside: There is something of an irony here. CEP in its current incarnation found its initial home in the Front-Office for Algo Trading. Now the same technology is being deployed in the middle-office Compliance Unit (the "Profit Prevention Department" as some of my trader friends refer to it) to directly combat the risks from the Algos ... a case of fighting fire with fire?

Before we get carried away here though, CEP - or any technology - can never be a panacea for real-time risk management. In researching my talk I reviewed a number of case studies of recent incidents where risk management and compliance seem to have failed - none more so than the infamous case from earlier this year at SocGen, where one trader's actions brought about a 5 billion Euro loss on the Eurex futures market. In this case, however, existing controls did their job (though not in real time, of course) - prior to the approximately 50 billion Euro position being "discovered", 75 separate warnings were issued, including:

  • trade settlement dates falling on a Saturday;
  • trades which broke counterparty credit limits;
  • trades where the bank was both seller and buyer;
  • distorted counterparty brokerage fees

and more. Such issues could easily be translated into risk rules and embedded in a real-time firewall, but the failings here were not those of detection - they were principally human failings in taking appropriate subsequent action.
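
To illustrate how directly those warnings translate into machine-checkable rules (a sketch only; the field names and figures are invented), two of the items above might look like this:

    from datetime import date

    def settlement_on_weekend(trade):
        """Flag trades whose settlement date falls on a Saturday or Sunday."""
        return trade["settlement_date"].weekday() >= 5   # 5 = Saturday, 6 = Sunday

    def breaches_credit_limit(trade, exposure_by_counterparty, credit_limits):
        """Flag trades that would push a counterparty's exposure over its credit limit."""
        cpty = trade["counterparty"]
        projected = exposure_by_counterparty.get(cpty, 0.0) + trade["notional"]
        return projected > credit_limits.get(cpty, float("inf"))

    trade = {"counterparty": "BANK-A", "notional": 2_000_000.0,
             "settlement_date": date(2008, 1, 19)}   # a Saturday
    print(settlement_on_weekend(trade))              # True
    print(breaches_credit_limit(trade, {"BANK-A": 9_000_000.0}, {"BANK-A": 10_000_000.0}))  # True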

We can of course seek to address such failings, e.g. by adding "trip switches" to our real-time CEP firewalls which require a reset by senior compliance officers, but we'll never solve the problem entirely. In the SocGen case, the trader explicitly circumvented existing risk controls by logging fictitious trades using his knowledge of the risk rules being applied. Risk firewalls will only ever be as good as the rules they manage and the data they receive; compliance will always, to some extent, be playing catch-up.

But when we're talking about risk, we are talking mitigation, not necessarily guarantees. To get back to the opening topic of this post, traders are like racing drivers; we need to give them the tools to go as fast as possible but tools which, when they crash - which they will from time to time - allow them to get out of the car and walk away with nothing more than a few bruises.

That is where the (CEP-powered) real-time risk firewall comes in.

Singapore Sling

Posted by Richard Bentley

I'm currently in Singapore as part of a road trip visiting clients in SE Asia. As well as enjoying a nice change from the lousy British climate (winter followed by a slightly wetter winter, as an ex-pat described it to me), I'm very much enjoying meeting with traders and IT folk to discuss Algo Trading and CEP as it pertains to Asia specifically. Algo Trading is really big news in the region - and we're seeing significant growth for Apama as a result (see our recent press releases on our deployments at Bank of China International in Hong Kong and Leading Investment and Securities in Korea).

I spent the last couple of days at the Derivatives World Asia conference, giving a presentation on real-time risk management and surveillance (a rapidly growing area for Apama and for CEP in general) and hosting a workshop on algo trading. A strong theme of the event was the huge regional variation across Asia in technological maturity, readiness of the banks to adopt algo, regulatory frameworks and so on. Japan, for example, was one of the first to deploy sophisticated network infrastructure and electronify its markets, but is now suffering scaling issues as a result - a case of "first mover disadvantage". Countries like Singapore, which are coming later to the game, have learned those lessons with very impressive results. The Singapore Stock Exchange (SGX) recently launched a new platform for its equities and derivatives markets offering sub-millisecond access to the exchange through proximity hosting, and a trading platform capable of scaling to thousands of orders per second. Singapore is really gearing up for algo - not only on the technology side, but also through incentives to attract the big banks into the country - with Citigroup and RBS joining Standard Chartered in locating significant operations here. The local banks are also in the game, with regional acquisitions driving growth in their investment banking and brokerage operations.

Algo is big news in Asia, and in Singapore specifically; it currently accounts for 12% of all equities and 18% of derivatives traded in Singapore, but those numbers are set to grow rapidly. Having the fastest trading platform in Asia will certainly help!

Tuesday, September 09, 2008

Another CEP TLA - BOCI

Posted by Chris Martins

Colleague Giles Nelson recently commented on the fixation on TLAs and their meanings - a fixation that has seemed to dominate the CEP market's online discussion of late. He suggests that it obscures what is important in coming to appreciate the real value of CEP. I tend to agree, though it can be entertaining to monitor the lively - but not necessarily illuminating - debate about internet routers and what kind of router best serves as a metaphor for CEP.

But rather than dwell on real or metaphorical routers, I'll attempt here to interject a bit more substantive news - another Apama customer win, this time at the Bank of China. For the purposes of this posting let's call them BOCI (Bank of China International Holdings) to ensure this posting has enough TLA critical mass - though admittedly, BOCI is really an FLA. BOCI has chosen the Progress Apama CEP platform to support algorithmic trading in equities, futures, futures indices, warrants and bonds. With Apama they can receive market data concurrently from both SEHK and HKFE (the Hong Kong stock and futures exchanges, respectively) and algorithmically place orders into different sub-markets on these exchanges.

Now, there may be some debate somewhere as to whether this is really CEP. We cannot always provide details regarding the specifics of what a client does; in some instances the client does not share that detail, as they deem it too proprietary and valuable to their business. But the intricacies of trading in such complex markets (with transient liquidity and the need to act quickly in order to capitalize on that liquidity) make these applications excellent CEP use cases. Not all trading applications may represent CEP, but these applications indeed qualify.

TLAs vs. TLIs

On a side note, kudos to Giles Nelson for his posting's reference to "initialisms" vs. acronyms. It prompted a bit of research into the differences between the two. As it turns out, BOCI is an initialism, not an acronym. Likewise, so is CEP, and perhaps so are many of the common shorthand abbreviations that dominate technology. If BOCI were an acronym, you would say it like the Italian game: bocce.

So, as far as Apama is concerned, the choices are:

[Image: BOCI logo]

[Image: Bocce players scoring a point]

Our BOCI is on the far left. :-)

Monday, September 08, 2008

SQL Standards - an impedance mismatch with reality

Posted by Louis Lovas

Well, the hype train has left the station. As I'm sure the whole of the CEP community knows by now, StreamBase has teamed up with Oracle to announce a streaming SQL standard. I am certainly in favor of standards in software technology; they clearly represent the tide that raises all boats. Customers and vendors alike are beneficiaries of communal standards, from ANSI-standard programming languages like C and C++ to open standards like XML. Many a consortium of vendors and customers has labored arduously to define well-known technology standards for the collective benefit of the greater worldwide community. However, this recent announcement by StreamBase and Oracle is nothing more than the crafty art of deception and diversion. While I see nothing wrong with StreamBase and Oracle teaming up to enhance the streaming SQL language for their CEP products, to tout it as an emerging industry standard is simply brazen.

The streaming SQL language in today's CEP products finds its roots in academia; the Aurora project is one such academic endeavor. SQL was the language of choice for that project for good reason: streaming data shares a number of attributes with static data, so why not use a well-known data access, filtering and manipulation language? The Aurora authors clearly had this in mind when they chose SQL. I'm sure they also expected that streaming SQL, and the future products based on it, would evolve in a manner similar to database and other back-end data service technologies.

However, CEP platforms have matured into application platforms, in no small measure due to Progress Apama and our solutions approach to the market. The Apama stack easily lends itself to the rigors and demands of the solutions, or application, environment. The Apama EPL, MonitorScript, has the expressiveness of syntax to describe the semantics of the complex logic in today's CEP applications. As the saying goes, imitation is the sincerest form of flattery, and many of our competitors have followed our lead by introducing a solutions approach themselves. But as a result they've faced a challenge, with SQL being the underpinning of their EPL. SQL was never intended to be an application language, so they've chosen either to build application solutions in a mixed-language environment or to extend their base EPL with procedural constructs to support the needs of application semantics. In either case, something has to give. The fine print of a StreamBase solutions datasheet - "Incorporates algorithms as Java or C++ plugins" - is an indication of the inefficacy of StreamSQL for the intended purpose. With each new release of Coral8 and Aleri come announcements of features in their SQL-based EPLs adding procedural and imperative scripting constructs similar to Apama MonitorScript. These language enhancements, and mixed-mode development requirements, clearly validate that CEP has evolved into an application platform and not just a back-end data service engine. From a language standards viewpoint this has only served to fracture: each vendor has carved its own course in this brave new world.

As a cautionary note, standards can be the opiate of the masses. They give customers a sense that they are protected against vendor lock-in, and even the perception of an emerging standard can be hypnotic. This is all under false pretenses. Real standards provide benefits to customers and vendors alike, covering a broad swathe rather than just a select few. As the CEP community ventures into the standards world, we should focus on those areas where standardization has a proven track record in other technologies: interoperability and integration. There is plenty of fodder here and I'm sure it will unfold in the coming months.

Friday, September 05, 2008

Acronym irrelevance

Posted by Giles Nelson

There’s been lots of discussion very recently (and lots of discussion about the discussion) about how CEP is related to other software acronyms and what constitutes a CEP use case or not. See here and here.


This kind of debate depresses me in two different ways. Firstly, it displays symptoms of a more general software malaise - the wish to group and pigeonhole things into software classes, which then confuses people who are not in the in-crowd. Once a name is agreed upon, let's say Complex Event Processing, it then gets reduced to an acronym - CEP (excuse the pedantry, but this is actually an initialism, not an acronym, and that's enough of that). People feel a little sense of achievement: "We've made it - we've got a TLA!" Debates then rage about how CEP relates to BRMS, BAM, ESB, BPM, ESP, BEP, EDA and EAI. Dreadful stuff, but, yes, I know, we're all guilty of it at times; it does give others the impression that the IT industry has its head up its own back passage.


The second reason this debate depresses me is that I really don’t understand this constant wish to class things as problems which fit into the "CEP class" or not (just see all the nonsense, albeit amusing, around routers of various types voiced by Messrs Bass and Palmer). Software is ultimately a productivity tool and what end-users really want to know is whether a product will help them achieve something they would be unable to do so by other means. End-users are using and considering using event processing products for a whole variety of purposes – trading, order routing, travel information dissemination, exception management in long-lived IT processes, detection of aberrant conditions in groups of pressure sensors, detection of fraud at retail point-of-sale terminals… the list could go on. People who have a problem to solve might think that event processing technology could help them. It is their responsibility, together with a vendor who may hope to sell something to them, to determine whether a product would help them or not. Were people doing event processing before off-the-shelf products came along? Yes. Do you need a CEP product to do event processing? No. Does everything you do with a CEP product have to involve complex, multiple streams of perhaps temporally related information existing in some distributed computing data cloud? No, of course it bloody doesn’t. I note that Opher Etzion seems similarly turned off by this type of debate.


And before I go I’m quite aware that I’ve used the CEP initialism eight times in this posting. I think I can be excused - this is a Complex Event Processing blog after all.