EDA and SOA

Monday, January 11, 2010

10 Reasons Why Progress Chose Savvion

Posted by John Bates

Today Progress announced the acquisition of Savvion: http://web.progress.com/inthenews/progress-software-co-01112010.html

The reason that Progress chose to enter the BPM market is clear. Businesses are increasingly turning to BPM to implement and improve their business processes. Why? Firstly, because no other solution can help enterprises achieve real-time visibility, agility, efficiency and business empowerment the way BPM does. Secondly, because BPM enables this to be achieved with low Total Cost of Ownership (TCO) and ease of use.

But why did Progress choose Savvion? Here are 10 reasons to start off with…

  1. Savvion is a trailblazer and industry leader – Savvion is a pioneer in BPM but is also still at the cutting edge. We wanted the best BPM thinkers at Progress. 
  2. Savvion has been proven to work at the enterprise level. Some BPM systems only work at the departmental level, but Savvion works at either the departmental or the enterprise level.
  3. Savvion offers System-centric and Human-centric BPM – Savvion can orchestrate processes but can also involve human users in workflow.
  4. Savvion is event-enabled – so business processes can respond to events. Progress has a lot of momentum behind event-driven business systems through our Actional and Apama solutions – and Savvion will work seamlessly in event-driven business solutions.
  5. Savvion offers vertical industry solutions – Analogous to Progress’ Solution Accelerators, Savvion offers out-of-the-box vertical solutions in industries including Financial Services and Telecommunications.
  6. Savvion offers an integrated Business Rules Management System – Expressing logic in terms of rules can often be very important. Savvion has developed a rules engine, integrated with its BPM system, enabling decision-oriented BPM – modifying the process flow based on rule conditions. This is a powerful capability.
  7. Savvion offers an integrated Analytics Engine – Business Intelligence has proved its worth but it is a “rear view mirror” technology – analyzing facts that have already happened. Savvion’s analytics engine augments business processes and human users with continuous, real-time analytics, enabling better decision-making.
  8. Savvion offers an integrated Document Management System (DMS) – Savvion’s integrated DMS enables rich document handling and empowers document-centric BPM.
  9. The Savvion BPM suite is highly scalable, high performance and highly available – At Progress we pride ourselves on the strength of our underlying technology. We want to offer our customers a complete solution that embodies scalability, performance and availability. Thus selecting a BPM vendor in keeping with this philosophy was key – and Savvion is just such a vendor.
  10. Savvion is a great cultural fit with Progress – An often-overlooked point is that cultural fit is key to acquisition and integration success. The Savvion team pride themselves on being innovative, customer-focused and fun - just like the Progress team. We’re looking forward to working together. 

Tuesday, December 22, 2009

My Baby Has Grown Up

Posted by John Bates

I was proud to recently be appointed CTO and head of Corporate Development here at Progress Software http://web.progress.com/en/inthenews/progress-software-ap-12102009.html. But I don’t want anyone to take that as an indication that I won’t still be involved with event processing – au contraire. Event processing (whether you call it CEP or BEP) is now a critical part of enterprise software systems – I couldn’t avoid it if I tried!!

But taking a broader role does give me cause to reflect upon the last few years and look back at the growth of event processing and the Progress Apama business. Here are some observations:

  • It’s incredibly rare to have the pioneer in a space also be the leader when the space matures. I’m really proud that Progress Apama achieved that. Our former CEO Joe Alsop has a saying that “you don’t want to be a pioneer; they’re the ones with the arrows in their backs!” Usually he’s right on that one – but in the case of Progress Apama, the first is still the best! Independent analysts, including Forrester and IDC, all agree on it. Our customers agree on it too.
  • It’s tough at the top! I had no idea that when you are the leader in a space, many other firms’ technology and marketing strategies are based completely around you. I have met ex-employees of major software companies who have told me that there are Apama screenshots posted on the walls of their former firms’ development centers – the goal being to try to replicate them or even improve on them. Other firms’ marketing has often been based on trying to criticize Apama and say why they are better – so their company name gets picked up by search engines when people search for Apama.
  • Event processing has matured and evolved. Yes it is certainly used to power the world’s trading systems. But it’s also used to intelligently track and respond to millions of moving objects, like trucks, ships, planes, packages and people. It’s used to detect fraud in casinos and insider trading. It’s used to detect revenue leakage in telecommunications and continually respond to opportunities and threats in supply chain, logistics, power generation and manufacturing. It enables firms to optimize their businesses for what’s happening now and is about to happen – instead of running solely in the rear view mirror.
  • Despite all the new application areas, Capital Markets remains a very important area for event processing. Critical trading operations in London, New York and around the world are architected on event processing platforms. The world’s economy is continually becoming more real-time, needs to support rapid change and now needs to support real-time views of risk and compliance. We recognize the importance of Capital Markets. My congratulations to Richard Bentley, who takes on the mantle of General Manager of Capital Markets to carry on Progress Apama’s industry-leading work in this space. With his deep knowledge and experience of both Apama and Capital Markets, Richard is uniquely placed to carry on the solutions-oriented focus that has been the foundation of Progress Apama’s success.
  • Even in a terrible economy, the value of event processing has been proven – to manage costs, prevent revenue leakage and increase revenue. Progress announced our fourth quarter results today http://web.progress.com/en/inthenews/progress-software-an-12222009.html, which saw a double-digit increase for Apama and a triple-digit increase for Actional. Apama and Actional are used, increasingly together, to gain visibility of business processes without modifying applications, to turn business process activity into events and to respond to opportunities and threats represented by event patterns – enabling the dynamic optimization of business performance.
  • But one thing I do believe is that soon there will be no such thing as a pure-play CEP vendor. CEP is part of something bigger. We’ve achieved the first mission, which was to raise the profile of event processing as a new technique that can solve hitherto unsolvable problems. Now the follow-on mission is to ensure event processing finds its way into every solution and business empowerment platform. It is one of a set of key technologies that together will change the world.

I wish everyone Happy Holidays and a successful and profitable 2010!!!

Friday, October 16, 2009

Apama 4.2 release - Cruising in the fast lane

Posted by Louis Lovas

The Apama engineering team has done it once again. True to our record of releasing significant new features in the Apama product every six months, the v4.2 release is hot off the presses with major new functionality. The Apama roadmap is driven by a keen sense of our customers' requirements, the competitive landscape and an opportunistic zeal. The engineering team is a dedicated R&D organization, driven to excellence and quality and committed to delivering value to our customers. A consistent comment we've heard from analysts and customers alike concerns the maturity of the Apama product.

The current v4.2 release, the third in the v4.x family, adds significant enhancements along three recurring themes - Performance, Productivity and Integration. This consistent thematic model is one we've held to for a number of years. Below I've touched upon the highlights of the current release along these themes:


  • Performance
High Performance Parallelism for Developers.  The Apama Event Processing Language (EPL) provides a set of features uniquely suited to building scalable event-driven applications.  The language natively offers capabilities for event handling, correlating event streams, pattern matching, defining temporal logic and so on. Equally important, the language provides a flexible means to process events in parallel.  For this we provide a context model and a new high-performance scheduler. Contexts can be thought of as silos of execution, in which CEP applications run in parallel. The scheduler's role is to manage the runtime execution in an intelligent, high-performance way, and to leverage the underlying operating system threading model. It is via the context architecture that the Apama Correlator squeezes the most out of operating system threads to achieve maximum use of multi-core processors for massive vertical scalability. For IT developers, this is an effective and efficient means to build high performance, low latency CEP applications without the pitfalls of thread-based programming, such as deadlocks and race conditions. (A small sketch of the context model appears at the end of this Performance section.)

High Performance Parallelism for Business Analysts.  Not to be left out of the race, we've also ensured the scalable parallelism provided in the Apama CEP engine is available through our graphical modeling tool, the Event Modeler. We've had this graphical modeling capability since the very first release of Apama. This tool, designed for analysts, quantitative researchers and of course developers, allows you to design and build complete CEP applications in a graphical model.  Parallelism is as easy as an automatic transmission: simply select P for parallel.
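
To make the context model a little more concrete, here is a small hypothetical sketch in Apama EPL. The Tick event type, the symbol names and the per-symbol partitioning are invented purely for illustration, and the syntax is indicative rather than a verbatim 4.2 listing; the point is simply that each symbol gets its own context and the correlator's scheduler spreads the resulting work across cores, with no explicit threads or locks in the application code.

    // Hypothetical event type, for illustration only
    event Tick {
        string symbol;
        float price;
    }

    monitor ParallelPricing {
        action onload() {
            sequence<string> symbols := ["ACME", "GLOBO", "INITECH"];
            integer i := 0;
            while i < symbols.size() {
                string sym := symbols[i];
                // One context per symbol; the second argument marks the
                // context as public so it receives external events
                // (the 4.x-style constructor)
                context c := context(sym, true);
                // Spawn a worker monitor instance into that context
                spawn watch(sym) to c;
                i := i + 1;
            }
        }

        action watch(string sym) {
            // Runs inside its own context, in parallel with the other workers
            on all Tick(symbol=sym) as t {
                log "Processing " + t.symbol + " at " + t.price.toString();
            }
        }
    }

On a multi-core machine the per-symbol workers can then genuinely run at the same time, so adding symbols scales the work out across cores rather than lengthening a single queue.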

  • Productivity

Real men do use Debuggers (and Profilers too). The Apama Studio now sports major new functionality for development: a source-level debugger and a production profiler. Building applications for an event-driven world presents new programming challenges, and having state-of-the-art development tools for this paradigm is a mandate. The Apama EPL is the right language for building event-driven applications - and now we have a source-level debugger designed for this event paradigm. Available in the Eclipse-based Apama Studio, it provides breakpoints to suspend applications at specific points, inspection of program variables and single stepping. It works in concert with our parallelism as well. Profiling is a means to examine deployed Apama applications to identify possible bottlenecks in CPU usage.

Jamming with Java. We've enhanced our support for Java for building CEP applications. The Apama Studio includes a complete set of wizards for creating monitors, listeners and events to improve the development process when building Java-based CEP applications in Apama.

  • Integration

The (relational) world plays the event game. While we have provided connectivity to relational databases for many years, we've significantly re-designed the architecture of how we do it with the new Apama Database Connector (ADBC). The ADBC provides a universal interface to any database and includes standard connectors for ODBC and JDBC.  Through the ADBC, Apama applications can store and retrieve data in standard database formats using general database queries, effectively turning these relational engines into time-series databases. The data can be used for application enrichment and playback purposes. To manage playback, the Apama Studio includes a new Data Player that enables back-testing and event playback from a range of data sources via the ADBC. One can replay event data, and time itself, at varying speeds. The CEP application under test behaves in a temporally consistent manner even as data is replayed at lightning speed.

Cruising at memory speed with MemoryStore. The MemoryStore is a massively scalable in-memory caching facility with built-in navigation, persistence and visualization functionality.  This allows CEP applications, which typically scan, correlate and discard data very quickly, to retain selected portions in memory for later access at extreme speed. This could be for managing a financial order book, payments or other data elements that the application needs to access quickly on a user's request. Furthermore, if required, the in-memory image can be persisted to a relational database for recovery or other retrieval purposes; lastly, the MemoryStore allows selected portions of the in-memory cache to be automatically mapped to dashboards. (A rough sketch of the underlying caching pattern follows.)
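
The MemoryStore API itself (stores and tables, optional persistence, dashboard mapping) is not reproduced here; instead, the following hypothetical EPL sketch, using a plain dictionary and invented event types, illustrates the underlying pattern the MemoryStore is designed to serve: retain a selected slice of a fast-moving stream in memory so the application can answer queries about it at in-memory speed.

    // Hypothetical event types, for illustration only; the real MemoryStore
    // API (stores and tables with optional persistence) is not shown here
    event Tick { string symbol; float price; }
    event PriceRequest { string symbol; }
    event PriceResponse { string symbol; float price; }

    monitor LastPriceCache {
        // In-memory cache of the last price seen for each symbol
        dictionary<string, float> lastPrice;

        action onload() {
            // Retain only the slice of the fast-moving stream we care about
            on all Tick() as t {
                lastPrice[t.symbol] := t.price;
            }
            // Answer queries from memory; getOrDefault returns 0.0 for a
            // symbol that has not been seen yet
            on all PriceRequest() as r {
                route PriceResponse(r.symbol, lastPrice.getOrDefault(r.symbol));
            }
        }
    }

What the real MemoryStore adds to this pattern is the store and table abstraction, the optional persistence to a relational database and the automatic mapping to dashboards described above.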

Well, those are the highlights. There are also about a dozen other features within each of these three themes, too numerous to mention here.

We are committed to improving the Apama product by listening to our many customers, paying close attention to the ever-changing competitive landscape and researching new opportunities.

Again, thanks for reading. You can also follow me on Twitter, here.
Louie



Wednesday, October 07, 2009

Business Events and Operational Responsiveness - our research

Posted by Giles Nelson

Yesterday, we published a press release on some research that we commissioned from an independent research firm. I wanted to give a bit more background to the research and how we intend to use it.

Our intent in doing this research was twofold:

(a) To discover something new about the markets that Progress operates in and validate some of our own beliefs about the market (or dispel them).

(b) To gather some interesting and relevant information to act as talking points around the things we think are important for our customers and prospective customers, as well as, of course, being commercially relevant to us.

We commissioned the research company Vanson Bourne to carry out this research and, whilst we worked with them on its scope, it was left entirely to them to execute on that scope.

We wanted to hear from end-users, so a range of questions was posed to 400 organisations in Europe and the US in three industries - telecommunications, energy generation and logistics. No vendors, analysts or systems integrators were approached.

The questions were all around the theme of "operational responsiveness" - how good are firms at monitoring their operations, identifying issues with process execution, interacting with their customers, extracting and integrating information etc. In particular how good are firms at dealing with the business events which are flowing around, both internally and externally, and how good are they at acting on them in a timely fashion?

Why did we pick these three verticals? Firstly, we couldn't cover everybody and we wanted to go to more companies in a few verticals rather than go very broad. Secondly, we believe that these three verticals are the most interesting when it comes to the demands being placed upon them to cope with business events (Financial services is another obvious one but we know quite a lot about the demands in that industry already). Telecommunications firms are very dependent upon IT to differentiate their services; logistics companies are using more and more technology to track goods, trucks, ships etc. to streamline and automate their operations; energy producers are having to rapidly plan for the introduction of smart metering.

We're still digesting the results. But a few are worth highlighting here. Social networking is creating a significant challenge for all organisations in dealing with customer feedback - consumers expect instant feedback to their interactions. Organisations aspire to more dynamic and real-time pricing of goods and services to increase their competitiveness and maintain margins. And companies struggle with achieving a holistic view of how their processes are operating, both to operationally identify issues before they become expensive to fix or affect customer services, and to identify ways in which processes can be shortened.

We'll be talking more about the research results soon, both qualitatively and quantitatively.


Wednesday, September 30, 2009

EPTS, the Symposium of Trento

Posted by Louis Lovas

How many angels can dance on the head of a pin? I suppose that was a question debated at the Council of Trent, which took place in Trento, Italy back in the 16th century. However, the Event Processing Technical Society's (EPTS) annual symposium took up residence in Trento just last week to discuss and debate a host of lofty topics on event processing.

  • CEP's role and relationship to BPM (or more appropriately event-driven BPM)
  • Event Processing in IT Systems management
  • Event-based systems for Robotics
  • EPTS Working Groups ...
While the sessions and discussions on event processing did not have the global significance of angels on pinheads or the Counter-Reformation, they did provide a clear indication of just how broad and deep the reach of event-based systems can be. Whether it's a business application monitoring mortgage applications, IT management systems in a Network Operations Center, bedside monitoring systems in a hospital or a robot packing pancakes into boxes, they all have a common underpinning: consuming and correlating streaming event data.

Granted, not everyone approaches it with the same viewpoint. IT Systems Management people don't think about processing and correlating events; they think about device management, KPIs, alerts and the like. Someone building or managing a business process is likely concerned with managing orders - validating them, allocating stock, warehousing and shipping. Nonetheless, a common framework model behind these systems is event processing.

Two of my favorite sessions at the EPTS Symposium were a panel session on the EPTS Mission and an open forum on Grand Challenges, a brainstorming session focused on identifying barriers to the adoption of CEP.

EPTS Mission

Four panelists, myself included, presented their expectations of the EPTS: its role as an industry consortium, its goals and what improvements could be made. As a baseline, the EPTS does have an existing mission statement, defined as ...

To promote understanding and advancement of Event Processing technologies, to assist in the development of Standards to ensure long-term growth, and to provide a cooperative and inclusive environment for communication and learning.


Given this mission statement and my own expectations, there are a number of basic things the EPTS should provide to those uninitiated in event processing:

Awareness   Provide commercial business and industry with the necessary knowledge of event processing as a technology supported by numerous vendors and by continuing research in academia.
Definition Provide a concise and definitive meaning of event processing, a Taxonomy of Event Processing so to speak. This is needed both from the horizontal technology perspective and with a vertical focus for a handful of specific industries. It's often difficult for business people to understand technology without the context of a business or application focus.
Differentiation  Provide a clear distinction that defines event processing and distinguishes it from other technologies. Event processing is available in many forms; this symposium provided evidence of that.  Much of it is available in specialized form, as in IT Systems management. There are also pure-play event processing (CEP) vendors, such as Progress/Apama. But there are also rules engines, Business Intelligence platforms, analytic platforms, etc. This easily presents a bewildering world full of choices and of conflicting and overlapping marketing messages. The EPTS is in the perfect position to provide clarity on what is CEP and what isn't.
Cooperative Event Processing rarely operates in a vacuum. There are many synergistic technologies that closely pair with CEP. Often this has a specific vertical business flavor, but just as often it is other platform technology such as BPM and temporal databases.


The EPTS has four working groups that have been active for the last year: Use-cases, Reference Architecture, Language Analysis and Glossary. To a large extent the working groups have provided, and are still working towards, a clear definition of CEP. However, there is still a need to highlight the salient value of event processing. For specific vertical domains, the value of CEP is clear-cut simply because the fit and function are tailor-made. In Capital Markets, for example, algo trading has all the hallmarks of a CEP application - high performance, low latency, temporal analytics and a streaming data paradigm that is fit for purpose. However, there are other application domains where CEP is equally viable but much more subtle.  I believe the EPTS can provide a vendor-neutral taxonomy of event processing - from the basics to the advanced - and explain why it's unique and different, why language is important and how it is synergistic with a host of other technologies. To this end, the group has decided to form two new working groups to focus on many of these areas. Clearly a forward-thinking move.

The Event Processing Technical Society is an organization made up of both vendors and academics. We're held together by a common thread: a belief that the whole is greater than the sum of the parts and that our collective work will benefit all, even as many of us are undeniably competitors.

Once again, thanks for reading. You can also follow me on Twitter, here.
Louie



Tuesday, August 04, 2009

Forrester Wave Cites Apama

Posted by Chris Martins

Forrester Research, a leading independent research firm, has just published its “Wave” on CEP, and Progress Apama has been judged to stand out as a leader.  The evaluation process is quite detailed and lengthy.  It addresses:

  • Current product offering: product architecture, features, development environment, administration, interoperability, etc.
  • Vendor strategy: the vendor’s product road map and other key strategy elements.
  • Market presence: customer base, vendor size and presence, etc.

We are very excited to have Apama recognized in this way.  Together with the validation of our 115+ customer implementations (we don’t just score well in evaluations, but also in the real world of customers), this report speaks both to the vibrancy of the market and our leadership position.  But don’t take my word for it.  For a look at the actual report, including the graphical representation and the tabular scoring, you can check it out for yourself here.   

Monday, March 23, 2009

We're going on Twitter

Posted by Giles Nelson

Louis Lovas and I, Giles Nelson, have started using Twitter to comment and respond to exciting things happening in the world of CEP (and perhaps, occasionally, beyond!).

The intent is to complement this blog. We'll be using Twitter to, perhaps, more impulsively report our thinking. We see Twitter as another good way to communicate thoughts and ideas.

We would be delighted if you chose to follow our "twitterings" (to use the lingo), and we'll be happy to follow you too.

Click here to follow Louis and here to follow Giles (you'll need to signup for a Twitter account).

Friday, October 31, 2008

CEP, EDA and SOA

Posted by Giles Nelson

I’d like to add my voice to the debate this week (here, here and here) on how Complex Event Processing (CEP) fits into the wider software architectural themes of Service Oriented Architectures (SOA) and Event Driven Architectures (EDA). Although I think I know fairly well how these three areas relate to one another, I was able to further clarify my thinking this week by spending some time with Neil Macehiter of Macehiter Ward-Dutton Advisors, a UK-based software analyst firm. I found our discussion enlightening, as Neil had a slightly different way of looking at these things than I had heard expressed previously. So let me try and express my own view on this as clearly as possible.

1. CEP is a technology. SOA and EDA are not technologies. SOA and EDA are philosophies for the design and build of modern distributed computing architectures.

2. A SOA is a loosely coupled set of services, the functionality of which closely reflects an organisation’s business functions and processes. A SOA will typically use modern Web services technology and standards for implementation, but is not required to. Building a SOA requires much thinking about the services that the SOA will use.

3. An EDA is a loosely coupled architecture, the endpoints of which interact with one another in an event-driven fashion. Information flows around the EDA as events. An EDA will have endpoints which produce events and endpoints which consume events. An EDA works in a “sense and respond” fashion. Building an EDA requires much thinking on the event-types that the EDA will use.

4. An EDA may use business focussed services as endpoints. An EDA may therefore also be a SOA but it does not have to be.

5. CEP is a capability within an EDA, providing analysis and matching of multiple events being sent between endpoints. You can have an EDA without CEP. (A small sketch of such multi-event matching follows this list.)

6. If you’re building your architecture and focussing on defining event-types, it’s very likely you’re building an EDA.

7. If you are using CEP then you have at least the beginnings of an EDA, because you will have been focussing on event-types. Your EDA may be a simple one, with one event producer and one consumer, but it’s still an EDA.
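
To make point 5 a little more concrete, here is a small hypothetical sketch in Apama EPL of the kind of multi-event, temporal matching that CEP adds to an EDA. The event types and the 24-hour rule are invented for illustration and the syntax is indicative rather than tied to a particular release: for every order placed, either a matching shipment is seen within 24 hours or a warning is raised.

    // Hypothetical event types, for illustration only
    event OrderPlaced { string orderId; }
    event OrderShipped { string orderId; }

    monitor ShipmentWatcher {
        OrderPlaced placed; // each spawned monitor instance gets its own copy

        action onload() {
            // Spawn a new monitor instance to track each order placed
            on all OrderPlaced(): placed spawn watchOrder();
        }

        action watchOrder() {
            // Pattern 1: the matching shipment arrives within 24 hours
            on OrderShipped(orderId=placed.orderId) within(86400.0) {
                log "Order " + placed.orderId + " shipped on time.";
            }
            // Pattern 2: 24 hours pass with no matching shipment
            on wait(86400.0) and not OrderShipped(orderId=placed.orderId) {
                log "Order " + placed.orderId + " was not shipped within 24 hours." at WARN;
            }
        }
    }

The producer of OrderPlaced events and the consumer of the warnings are simply endpoints of the EDA; the correlation across event types and across time is what the CEP capability contributes.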

Comments welcome.

Monday, September 22, 2008

Reflections on the Gartner Conference and EPTS4

Posted by Louis Lovas


Like many of my colleagues in the event processing community, I thought I would share a few reflections on the happenings at the two back-to-back technology conferences of the past week. Gartner sponsored their annual vendor-fest known as the Event Processing Summit, and the EPTS held its fourth annual symposium. This being my first EPTS, I had some initial thoughts and reactions which I shared over the weekend.  Here, I'll delve more into the conferences' content.

I attended a number of the sessions at the Gartner conference. I did not have any set agenda, so I picked the sessions more on personal appeal than on some well-thought-out plan. While I do work in an engineering team, I have a customer focus, so I attended all the customer sessions. I always find it valuable to understand how customers are deploying event processing technology in real-world use cases. Their efforts clearly find their way into vendors' product roadmaps.

     
  • Lou Morgan of HG Trading, a lively speaker, described his use of event processing technology in high frequency trading. Lou has been an Apama user for quite a few years and we've invited him to speak on our behalf on a number of occasions. He's an entertaining soul with a clear understanding of the Capital Markets business. We're delighted he presented his use of Apama at this conference.
     
  • Albert Doolittle of George Weiss Associates Inc. gave a talk on using event processing technologies in his firm.  Albert described his technique for picking a vendor for his CEP project, which, if I were to paraphrase, was a coin flip.  Towards the end of his talk, he digressed from CEP technologies to present a short discourse on high performance computing (HPC). The idea of leveraging supercomputing-like technologies and FPGAs for compute-intensive operations like Black-Scholes options pricing has certainly caught Mr. Doolittle's attention. Typically CEP and compute-intensive tasks don't mix well because of latency considerations. However, a marriage of CEP and HPC is possibly one made in heaven. I was intrigued.
     
  • The ebullient Marc Adler gave his brusque, no-holds-barred perspective on the CEP project he embarked on at Citi. Marc did a great job of explaining the challenges of introducing a new technology at a large corporation, one with a well-entrenched, bureaucratic IT organization.  I think most of us have faced the bureaucratic fortress at some time or another in our careers. Knowing how to play the game is a skill only a few master well; kudos to Marc for his successful venture.  As Marc unfolded his project's architecture, it was clear he had wisely chosen a course to prevent vendor lock-in.

The juxtaposition of these three use cases was most curious. Lou Morgan jumped deep into CEP technology and bet the ranch on it; Albert Doolittle took a gamble with a coin flip in choosing a vendor; and Marc Adler kept his choice of CEP product isolated and contained within his overall system architecture, a safeguard in case he felt the need to replace it.  Nonetheless, all are great examples of how CEP is gaining momentum in mainstream business.

One session I thoroughly enjoyed was Don DeLoach's "Extending the range of CEP". Don is the CEO of Aleri. I'm not sure whether I enjoyed this session more for its content or for Don's presentation skills. As is usually the case at technology conferences, it's death-by-PowerPoint: slideware jammed with an overabundance of barely readable text and dazzling graphics.  Don's slides, however, had a clear minimalist slant - a plain monotone background with either a single word or a (very) short phrase, well choreographed with his oration. He spoke of CEP as an evolving technology, from the simple ability to filter streaming data to managing complex application state. He used an example that has become the pièce de résistance of Aleri: order book consolidation.

There were many sessions on SOA and Event Driven Architectures - so many I lost count. 

I attended the panel discussion on low-latency messaging protocols. This was a Q&A session moderated by Roy Schulte of Gartner. The panelists were the crop of high-speed/low-latency messaging vendors - TIBCO-killers, as I've affectionately referred to them - such as 29West, RTI, Solace Systems, IBM and even TIBCO themselves (apologies to those vendors I've not mentioned). Each described how they have defied physics to achieve incredible speeds yet still provide reliable delivery, management tools and even application-level services (e.g. RTI's last-value cache).  However, it's noteworthy to contrast these low-latency vendors, all focused on shaving microseconds off message delivery via proprietary, even hardware-based schemes, with the many standards-based messaging systems trumpeted in other sessions. Those SOA and EDA sessions paraded a whole barrage of Web Services standards (e.g. WSDL, WS-Eventing, WS-Notification, WSDM - the list goes on and on) as the right way to build applications. These certainly seem like opposing forces that will only foster confusion in the eyes of those who have a clear business need for low latency yet a desire to adhere to a standards approach.

The EPTS Symposium began its first day with a keynote address from a VC who had funded Event Zero.  I first met with Event Zero about a year ago; they appear to have recast themselves from an adapter/connectivity vendor into one delivering an Event Processing Network (EPN). An EPN can be defined as an infrastructure platform for event processing agents or services. Those CEP agents, acting both independently and in concert with other agents (or services), operate upon streaming data sources. Together the whole becomes greater than the sum of the parts. Such is the grandiose vision of an EPN.  SRI was also promoting a similar notion of event processing as a service, which I would argue is a variation on the same theme.  Unfortunately, I think there is trouble ahead. The problem is simply timing, maturity and standards (or the lack thereof).  I don’t think customers will buy into EPNs or Event Zero's vision until there is a clear establishment of standards for CEP. As a perspective, Application Server vendors tried this and failed (anyone remember SilverStream? Apptivity?). It was not until the J2EE specification established a uniform model that there was true viability for a network or service infrastructure platform for AppServers.  Until we see the formation of CEP standards for interoperability and integration, the appeal of CEP will remain basically that of a standalone application platform, and vendors will continue to market a solutions approach - just look at any CEP vendor's website for proof of this. Nonetheless, Event Zero has embarked on a bold initiative and I wish them all the best.

Speaking of standards, and moving slightly up the stack, one could clearly detect the prevailing wind blowing against streaming SQL as the language of choice for CEP.  Going back to the Gartner conference, there were a few noticeable comments to that effect. Marc Adler described streaming SQL as making simple things difficult to do.  Don DeLoach downplayed the SQL language in Aleri in favor of the SPLASH enhancements. The renowned Dr. Luckham, in his closing keynote address, outlined Holistic Event Processing as the future and implied it required a language beyond streaming SQL.

At the EPTS symposium, Alex Koslenkov from Betfair castigated the streaming SQL approach for his use case of managing complex, long-running state. Alex is an advocate of the RuleML approach to CEP languages; as such it stands to reason that he doesn't have a high regard for streaming SQL, and it showed.

Susan Urban from Texas Tech University presented a research project on a language they've dubbed StreamCEDL. Susan denounced streaming SQL as lacking the algebraic expressiveness necessary to move beyond simple stream processing to true complex event processing. One example she mentioned in the description of StreamCEDL is its support for an APERIODIC operator, intended to process irregular or out-of-order data streams.

Lastly, Chris Ferris from IBM presented on Industry Software Standards. This was a great session that portrayed the far-reaching impact of adopting standards across our industry.  He stressed the importance of making every attempt to get broad vendor agreement and customer validation, and of making sure the adopted technology serves the needs of the community, because you'll have to live with it for years to come.  This is such an important message in the quest for standardization of CEP. Open, widely accepted standards are exactly what the CEP community needs; the sooner we embark on this journey the better.

Friday, September 05, 2008

Acronym irrelevance

Posted by Giles Nelson

There’s been lots of discussion very recently (and lots of discussion about the discussion) about how CEP is related to other software acronyms and what constitutes a CEP use case or not. See here and here.


This kind of debate depresses me in two different ways. Firstly, it displays symptoms of a more general software malaise – the wish to group and pigeonhole things into software classes which then confuse people who are not in the in-crowd. Once a name is agreed upon, let’s say Complex Event Processing, it then gets reduced to an acronym - CEP (excuse the pedantry, but this is actually an initialism, not an acronym, but that's enough of that). People feel a little sense of achievement: “We’ve made it – we’ve got a TLA!”. Debates then rage about how CEP relates to BRMS, BAM, ESB, BPM, ESP, BEP, EDA and EAI. Dreadful stuff, but, yes I know, we’re all guilty of it at times; it does give others the impression that the IT industry has its head up its own back passage.


The second reason this debate depresses me is that I really don’t understand this constant wish to classify problems as fitting into the "CEP class" or not (just see all the nonsense, albeit amusing, around routers of various types voiced by Messrs Bass and Palmer). Software is ultimately a productivity tool, and what end-users really want to know is whether a product will help them achieve something they would be unable to do by other means. End-users are using, and considering using, event processing products for a whole variety of purposes – trading, order routing, travel information dissemination, exception management in long-lived IT processes, detection of aberrant conditions in groups of pressure sensors, detection of fraud at retail point-of-sale terminals… the list could go on. People who have a problem to solve might think that event processing technology could help them. It is their responsibility, together with a vendor who may hope to sell something to them, to determine whether a product would help them or not. Were people doing event processing before off-the-shelf products came along? Yes. Do you need a CEP product to do event processing? No. Does everything you do with a CEP product have to involve complex, multiple streams of perhaps temporally related information existing in some distributed computing data cloud? No, of course it bloody doesn’t. I note that Opher Etzion seems similarly turned off by this type of debate.


And before I go I’m quite aware that I’ve used the CEP initialism eight times in this posting. I think I can be excused - this is a Complex Event Processing blog after all.