Education / Events

Wednesday, May 27, 2009

Location Tracking: fit-for-purpose design patterns in CEP

Posted by Louis Lovas


As the CEP community is well aware, the technology often gets compared and contrasted to traditional data processing practices. It's a topic that ebbs and flows like the ocean's tides, a conversation that occurs regularly in various forums, with clients, analysts and of course within this community at large. Far be it from me to stir things up again, but I believe the critics draw upon some credible evidence. This touched a nerve not too long ago when Curt Monash hammered the CEP community. The response was swift, but frankly unconvincing.

In many respects, I believe this argument is rooted in how some CEP vendors have architected their products. Many vendors treat event stream processing as a variant of database processing. They see streaming data as just a database in motion and have therefore designed and built their products myopically around that paradigm. By doing so those vendors (possibly inadvertently) have plotted a course where the fit-for-purpose of their products is focused on use-cases that are data-intake dominated. They can consume the fire-hose of data, filter, aggregate and enrich it temporally, but little else. What is missing in this equation is the definition of the semantics of the application, whether that is a custom application such as algo trading, monitoring telco subscribers, or some business intelligence (BI) dashboard. To those vendors, that is viewed as adjunct or external (i.e. the client), solved with the typical complement of technologies (Java, C++, .NET) and considered outside the direct domain of CEP.

While this paradigm clearly does work, it incites the (CEP) critics: "Where are the distinguishing characteristics? Why can't I just do this with traditional data processing technologies?"

These are challenging questions when so many CEP products are stuck with the look and feel of a database; even a few of the academic projects I've reviewed seem to be myopically centered on this same paradigm. It reminds me of that television commercial with the tag line "Stay Within the Lines. The Lines Are Our Friends." (I think it was for an SUV). Quite frankly, such thinking does not represent the real world. Time to think outside the box (or table, as the case may be).

Yes, the real world is full of in-motion entities, most often interacting with each other in some way. Cars and trucks careening down the freeway zigzag from one lane to another at high speed with the objective of reaching a destination in the shortest possible time without a collision. It would make an interesting CEP application to track and monitor the location and movement of such things.

In fact, location tracking is beginning to show signs of being a common use-case with the Apama platform. Not long ago we announced a new customer, Royal Dirkzwager, that uses Apama to track ship movements in sea lanes. My colleagues Matt Rothera and David Olson recently published a webinar on maritime logistics. This webinar follows the same basic premise as the Royal Dirkzwager use-case, that of tracking the location of ships at sea. And we aren't the only ones seeing activity in location tracking; here's a similar use-case for CEP in location-based defense intelligence. The basic idea is the ability to track the movement of some entity, typically in relation to other entities: are they getting closer together (i.e. collision detection) or moving further apart (i.e. collision avoidance)? Are they moving at all? At what speed? Will they reach a destination at an appropriate time? A CEP system needs, at its core, both temporal and geospatial concepts to easily support this paradigm. Here's a handful of examples where this applies:

  • Tracking ship movements at sea (as I mentioned with Royal Dirkzwager, and the Apama webinar on maritime logistics)
  • Airplanes moving into and out of an airspace
  • Baggage movement in an airport
  • Delivery trucks en route to destinations
  • Service-enabled mobile phones delivering content as people move through shopping and urban areas
  • Men, machines and material moving on the battlefield


These are just a handful of location tracking use-cases for which the Apama platform is well suited.

Another colleague, Mark Scannell, has written a location tracking tutorial that is part of the Apama Studio product. It is a great example of the power of the Apama EPL for building location tracking CEP applications, and it provides a narrative description explaining its purpose and implementation. Below I've included a small snippet of that example to highlight the elegant simplicity, yet powerful efficiency, of the Apama EPL. If you're in need of a brief introduction to the Apama EPL, you can find that here, the beginning of a three-part series on the language.


Location Tracking in the Apama EPL
// Track me - the tracked entity
action trackMe() {
       
  // Call self again when new location is detected
  on LocationUpdate( id = me.id ):me {
     trackMe();
  }

  // Detect a neighbor in our local area -
  // do this by excluding events that are for our id,
  // which will also cause us to reset this listener
  // through calling trackMe() again.
 
  LocationUpdate n;
  on all LocationUpdate( x in [ me.x - 1.0 : me.x + 1.0 ],
                         y in [ me.y - 1.0 : me.y + 1.0 ] ):n and
     not LocationUpdate( id = me.id ) {

     // Increment number of neighbors that have been spotted
     // and update the alarm
     spotted := spotted + 1;
     updateAlarm();           

     // Decrement count of spotted neighbors after one second
     on wait ( 1.0 ) {
       spotted := spotted - 1;
       updateAlarm();
     }
  }
}

As a brief introduction, the Location Tracker tutorial is designed to track the movement of entities (i.e. cars, ships, trucks, planes, or any of those things I listed above) in relation to other entities within a coordinate system or grid. An entity is considered a neighbor if it is within a window of 2 grid units (-1,+1) centered on another entity. The grid and the units within it are largely irrelevant to the syntactic definition of entity tracking. Their semantic meaning, on the other hand, comes from the context of a specific use-case (i.e. a shipping harbor, air space, battlefield, etc.).

From the tutorial I pulled a single action, trackMe, which contains the heart and soul of the tracking logic. As entities move they produce LocationUpdate events. Each event contains the entity's unique id and the X,Y coordinate of its new location. The trackMe action is designed to track their movement by monitoring LocationUpdate events. For each unique entity there is a spawned monitor instance (a running micro-thread, so to speak) of this trackMe action.
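
For reference, here is a minimal sketch (my own, not the tutorial's actual code) of what the LocationUpdate event and the per-entity bootstrap might look like. The field names id, x and y are inferred from the listener in the snippet above; the types and the spawning logic are assumptions and may well differ from the tutorial:

event LocationUpdate {
  string id;   // unique entity identifier (type assumed)
  float x;     // X coordinate of the reported location
  float y;     // Y coordinate of the reported location
}

monitor LocationTracker {

  LocationUpdate me;

  action onload {
    // Spawn a dedicated micro-thread running trackMe() when an
    // entity is sighted. Simplified sketch: a real implementation
    // would avoid re-spawning for an already-tracked id.
    on all LocationUpdate():me {
      spawn trackMe();
    }
  }

  // ... trackMe(), spotted and updateAlarm() as in the snippet above
}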

The idea is that when an entity moves, its new location is instantly compared against all other tracked entities (except, of course, itself; witness the recursive call to trackMe when the ids match (id = me.id)) to determine whether it has become a neighbor (remember the 2 grid units). This is elegantly implemented with the listener "on all LocationUpdate( x in [me.x - 1.0 : me.x + 1.0], ...". In a narrative sense, this can be read as "if the X,Y coordinate of this entity's new location is within 2 grid units (-1.0, +1.0) of me, then identify it as a neighbor and update an alarm condition" (via a call to updateAlarm()). For example, if my entity sits at (10.0, 20.0), an update reporting (10.5, 19.2) falls within the ranges [9.0 : 11.0] and [19.0 : 21.0] and so registers as a neighbor.

This small bit of code (about 20 lines) exhibits an immensely powerful geospatial concept: the ability to track the movement of hundreds, thousands, even tens of thousands of entities against each other as they move, all accomplished with millisecond latency.


This small example demonstrates a few of the characteristics of the Apama EPL, specifically that it is an integrated, well-typed procedural language with event expressions. It lets you embed complex event conditions, elegantly expressed, in the middle of your procedural program, so you can focus on the logic of your application instead of just selecting the right event condition.

However, to get a clear picture, the language of CEP is just one aspect of an overall platform. The Apama strategy has also been focused on a whole-product principle, one where the whole is greater than the sum of the parts. As a mandate to our vision we have outlined four key defining characteristics: 1) a scalable, high-performing event engine; 2) tools for rapid creation of event processing applications supporting business and IT users; 3) visualization technologies for rich interactive dashboards; and 4) an integration fabric for the rapid construction of high-performance, robust adapters to bridge into the external world.

The possibility exists that CEP will struggle to break out as a truly unique technology platform when so many just see a variant of database technology. It's time to break out of the box, drive over the lines and succinctly answer the critics' questions. CEP is not about tables, rows and columns but events. Events that are often artifacts of the real world. A world that is in constant motion, be it ships, planes, cars, trucks, phones, or you and me. Information flows from it all in many forms, but that does not mean we have to squeeze it into the database paradigm.

Once again thanks for reading, 
Louie


Monday, March 23, 2009

We're going on Twitter

Posted by Giles Nelson

Louis Lovas and I, Giles Nelson, have started using Twitter to comment and respond to exciting things happening in the world of CEP (and perhaps, occasionally, beyond!).

The intent is to complement this blog. We'll be using Twitter to report our thinking more impulsively, perhaps. We see Twitter as another good way to communicate thoughts and ideas.

We would be delighted if you chose to follow our "twitterings" (to use the lingo), and we'll be happy to follow you too.

Click here to follow Louis and here to follow Giles (you'll need to sign up for a Twitter account).

Monday, February 16, 2009

Sun hosts Developers Workshop on complementary technologies for low latency

Posted by Louis Lovas


Last month I was invited to speak at a Developers Workshop sponsored by Sun Microsystems on building low-latency trading applications. I had a 25-minute time slot to fill, with the goal of educating an audience of 110 architects, engineers and technical managers from Wall Street on CEP and Apama's vision of it. I'm usually able to sequester an audience for a lot longer than that to give them my perspective, and since I tend to ramble just a bit, it was a tall order for me to whittle it down to this shorter time slot.

To go one step further, I also did a demonstration of the Apama Algorithmic Trading Accelerator (ATA) and our graphical modeling tool, Event Modeler™. So I had to move fast to accomplish all this.

Since this was a Sun-sponsored event, there were a number of sessions devoted to Solaris and Sun hardware. Sun has done some great work with the Solaris operating system to leverage multi-core processors for scaling low-latency applications. Still, you need ample knowledge and expertise to fine-tune the OS to unlock that capability. There were a few demonstrations of how, using a few of the Solaris OS command tools, you can better apportion processor cycles to starving applications and achieve tremendous gains in performance. One has to be quite scholarly in Solaris systems management so as not to shoot oneself in the foot.

Besides myself representing Apama and CEP, I thought Sun did a great job of bringing together a group of complementary vendors that touched upon the theme of the workshop: low latency. Just to highlight a few: Patrick May from Gigaspaces discussed distributed cache technology, and Jitesh Ghai from 29West described the benefits of low-latency messaging to the trading infrastructure. I've always considered both of these technologies very complementary to CEP. Among many other uses, distributed caching engines provide a basis for a recoverability model for CEP applications, and low-latency messaging brings new possibilities for architecting CEP applications in a distributed model.

As for me, I presented a number of CEP themes in my talk:

1) Drivers for CEP Adoption. 

Fitting with the theme of the workshop, the drivers for CEP adoption are the increasing demands for low-latency applications. It's the race to the microsecond on Wall Street, whether we like it or not. Additionally, the need for rapid development and deployment of complex applications is pushing CEP technology into the hands of sophisticated business users. Graphical modeling tools empower these astute users; the ill-prepared will get left behind.

2) Evolution of CEP  in Capital Markets.

From single-asset broker algos to cross-asset, cross-border smart order routing to real-time risk and surveillance, CEP is growing and maturing in Capital Markets on numerous fronts.


3) Anatomy of CEP Architecture.

I presented a macro-level anatomy of the Apama vision of that architecture. There are four key components: 1) the CEP engine itself; 2) integration to the external world; 3) tools for development and deployment; and 4) visualization across a wide spectrum, from richly interactive desktop click-trading to widely distributed web-based user interfaces.

Lastly, I want to thank my pal Gareth Smith, the Apama Product Manager and slide designer extraordinaire, for these slides on the architecture. He's a wiz at putting ideas into compelling visuals.

You can download the slides of my Apama CEP presentation here.

As always thanks for reading,
Louie


Monday, January 19, 2009

A Transparent New Year

Posted by Louis Lovas


It's been a while since I've put a few thoughts in print on these pages on event processing, but that's not to say I've not been busy. We've wrapped up quite a successful year in the Apama division within Progress Software. For this year, 2009, we've got quite a number of new features and capabilities in store for the Apama CEP platform. We will be extending the breadth and depth of our event processing language (EPL) to meet the challenges we've seen in the market and the vision we have for the future. Of course you will see announcements in our standard marketing press releases, but I (and others) will explore the technical merits of those new features in these blog pages in much more detail, something we've not done much of in the past. There are many historical reasons for that, not really worth explaining. Suffice to say our intention is to be more transparent in the coming year.

To kick that off, it's worth starting a bit of a tutorial on the Apama EPL, a language we call MonitorScript. I'll begin with the basics here and, in subsequent blogs, build upon these main concepts, providing insight into the power and flexibility of our EPL. As we release new extensions and capabilities, it will then be easier to explain the benefits of those new features. So without further ado, here is the first installment.

First a few basic concepts...

  • Apama MonitorScript is an imperative programming language with a handful of declarative statements. This is an important consideration and one we highlight as a distinction of our platform against competitive platforms that are based on a declarative programming model. The imperative model provides a more natural style of development, similar to traditional languages like Java and C++.
  • The language is executed at runtime by our CEP engine called the Correlator.  

Second a few basic terms...

  • A monitor defines the outermost block scope, similar to a class in Java or C++. It is the basic building block of Apama applications; a typical application is made up of many monitors. As you might imagine, monitors need to interact with one another, a capability I'll explore in a later blog.
  • An event defines a well-ordered structure of data. The EPTS Glossary definition has a handful of great examples of events.
  • A listener, defined by the on statement, declares or registers interest in an event or event pattern. In a large application, the Correlator runtime engine will typically process thousands or even tens of thousands of registered event patterns; in our example below we have just one.
  • An action defines a method or function, similar to Java or C++.
  • The onload() action is a reserved name (along with a few others) that is analogous to main() in a Java program.
To put it all together... "A monitor typically listens for events and takes one or more actions when an event pattern is triggered."


The language sports a number of capabilities that will be familiar to anyone schooled in Java, C++ or any traditional imperative programming language. I won't bore you with all the nuances of data types and other obvious basics; those are well articulated in our product documentation and customer training courses. I will, however, focus on the unique paradigm of event processing.

Apama MonitorScript, a basic monitor.

event StockTrade {
  string symbol;
  float price;
  integer quantity;
}

monitor StockTradeWatch {

  StockTrade Trade;

    action onload {
      on all StockTrade():Trade processTick;
    }

    action processTick {
      log "StockTrade event received" +
      " Symbol = " + Trade.symbol +
      " Price = "  + Trade.price.toString() +
      " Quantity = " + Trade.quantity.toString() at INFO;
    }
}


This example defines an event type, StockTrade, and a monitor, StockTradeWatch, that watches for it. Events can originate from many sources in the Apama platform; the most obvious would be an adapter connected to a source of streaming data (e.g. stock trades, as this example shows), but they can come from files, databases, other monitors, even monitors in other Correlators.

The onload action declares a listener for events of type StockTrade (i.e. on all StockTrade). When the monitor StockTradeWatch receives StockTrade events, the action processTick is invoked. As you can see in this example, we simply log a message to indicate that an event occurred. The obvious intent is that this monitor will receive a continuous stream of StockTrade events, each one processed in turn by the processTick action. One can also be more selective in the event pattern within a listener, such as on all StockTrade(symbol="IBM"). I will explore the details of event patterns and complex listeners later on.
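
To make that selectivity concrete, here is a hypothetical variant of the monitor above with the symbol filter applied. It reuses the StockTrade event definition and differs only in the event pattern:

monitor IBMTradeWatch {

  StockTrade Trade;

    action onload {
      // Only StockTrade events whose symbol field equals "IBM"
      // match this pattern and invoke processTick
      on all StockTrade(symbol="IBM"):Trade processTick;
    }

    action processTick {
      log "IBM StockTrade event received" +
      " Price = " + Trade.price.toString() +
      " Quantity = " + Trade.quantity.toString() at INFO;
    }
}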

As I mentioned, I've started with a simple example that shows the basics of the Apama EPL, MonitorScript. It demonstrates the simplicity with which one can declare interest in a particular event pattern, receive matching events and act on them in your application (i.e. an action).

In subsequent installments I will demonstrate more complex features highlighting the power of the language. That's all for now.


You can find the second installment here.
Regards,
Louie


Friday, September 19, 2008

Sibos 2008 - the event processing angle

Posted by Giles Nelson

I am writing this at the end of the Sibos financial services show in Vienna. Sibos is the biggest global banking event of the calendar, with pretty much everyone involved in core banking present, including commercial banks, central banks, regulators, vendors and consultancies. It couldn't, of course, have taken place at a more interesting time. The extraordinary events we have witnessed this week in financial markets permeated every panel session, presentation and informal discussion held.

Event processing is big in financial services but, so far, it has generally only penetrated the front-office and middle-office risk management functions for use cases related to trading. There are good reasons for this: in general terms the front-office uses technology to directly enable more money to be made. Core banking on the other hand is about doing what banks were set up to do – to take deposits, to give credit and to be an arbiter of financial transactions. The reasons for technology investment are quite different and are driven by increasing operational efficiency, lowering costs and improving customer service. It’s more conservative and as yet event processing has not penetrated this area to any significant extent.

There’s no lack of use cases, I believe. Here are a couple of examples around the processing of payments. Earlier this year the UK launched its Faster Payments Initiative. Finally in the UK (the Netherlands, for example, have had this for 10 years) you can now pay a counterparty who banks with another UK bank in real-time, rather than waiting 3 days for the payment to clear (it’s remarkable it’s taken so long to fix such a, frankly rubbish, state of affairs and indeed it took the regulator itself to force change). As an end-user of this I am delighted with the results. I can now make an electronic payment using Web-based banking and it all happens immediately – the transaction occurs in the way one feels in the modern Internet era that it should. However this does raise a problem: how does a bank do its anti-money laundering checks, its comparison with counterparty blacklists and all the other fraud checks in the 12 seconds it has for the payment to go through? The answer is – currently with enormous difficulties.  Event processing is surely part of the answer.

Here’s another example. Currently the European payments industry is going through a lot of regulatory change to bring about lower prices and more competition for cross-border Euro payments (PSD and SEPA are the relevant acronyms if you’re interested). This will force technology investment, because consolidation will mean a smaller number of banks will have to process more payments at lower cost. Furthermore, competition will increase: a business in France, for example, will be able to use a bank in Germany to deal with its payments. Now, I reckon that having insight into what is going on with my payment systems, being able to identify processing exceptions, being able to identify when my customer SLAs are being exceeded, and so on, in real-time, will be a crucial part of ensuring a world-class operation. Payment systems will continue to use many different types of technology, from mainframe to modern SOA environments, so you need something to logically sit above this, extracting relevant real-time information and analysing and correlating it appropriately. There are offerings from conventional BAM vendors that address some of this now, but I think they won't be performant or flexible enough to deal with future needs. Some customer engagements support this.

All of this is really about risk management, and it seems inevitable that this is going to be a booming area of investment in the next few years. Knowing what is going on in your business transactions, as they occur, will become more and more important. In electronic trading, for example, it is becoming vital not only to regulators and trading venues (such as our customer Turquoise, who we announced went live with Apama this week) but also to brokers. They want to know what their customers, and they themselves, are up to.

I think Richard Oliver, Executive Vice President at the Federal Reserve summed it up well. When asked about the future role of technology in core banking and payment systems he responded that the “immediacy of information is going to be vital” and that it was going to be all about “getting value from the information flows”. I think that provides a pretty good fit for event processing.

Wednesday, June 11, 2008

SIFMA 2008 - Hot and Busy

Posted by Chris Martins

This week is the SIFMA (Securities Industry and Financial Markets Association) technology management conference in NYC.  The city has been brutally hot and the NYC Hilton exhibit hall, at least where Apama is, has not been measurably cooler. Those $5.00 bottles of water that they sell in the exhibit hall might represent opportunism - or capitalism - at its best. Or maybe it’s just NYC. :-)

First day attendance has been quite good and I’d encourage anyone attending to stop by the Apama exhibit (Grand Ballroom, Level 2, #2415). You’ll see a range of demos, including our new 4.0 release, and see some of the “magic” of CEP in action (you’ll have to come by to understand the magic reference). A lot has been happening with Apama and we’ve used this show to make some significant announcements.

  • A partnership with NYSE Euronext Advanced Trading Solutions that enables NYSE Euronext ATS to offer its customers Apama-based CEP capabilities for algorithmic trading, real-time risk management, and smart order routing. The announcement has received significant press pickup, including a nice story in the Financial Times.
  • A technology development collaboration with Sun Microsystems involving support of the Apama platform on Sun x64 systems running Solaris 10, as well as the launch of a new Apama “Accelerator”, the Apama Real-time Pricing Accelerator. Apama continues to focus on application templates that enhance the speed with which solutions can be deployed, and this Accelerator extends that theme into the area of market-making for bonds and other instruments.
  • Announcement of the upcoming availability of the above-mentioned Apama 4.0, which introduces a new development environment, Apama Studio. Studio brings all the Apama development tools together within a single, integrated environment, complete with tutorials, sample applications and other aids. You might consider Apama 4.0 as, at its heart, a CEP accelerator. 4.0 also includes significant enhancements to Apama’s execution performance, partially achieved via a new event messaging transport.

So a lot is happening both in NYC and at Apama.

Now if the Celtics could only close out the Lakers.

Wednesday, June 04, 2008

Event Processing Technical Society

Posted by Chris Martins

Another milestone in the evolution of the CEP market (I'll purposely not use the word "maturity", given the consternation that term seems to generate) was yesterday's announcement of the Event Processing Technical Society. This consortium of vendors, academia and other industry participants has formed to "facilitate the adoption and effective use of event processing." As one of the founding members, Progress is happy to see this effort get going in a more public manner, after several years of meetings and online discussion.

Not a standards body, the EPTS will instead work to advance an understanding of what event processing is and how it can be used, via promotion of use cases, best practices, academic research and other initiatives. One factoid in the announcement was a Gartner citation that a typical large company deals with “100,000 to 1 million business events per second.” The value of harnessing that information suggests a vibrant market opportunity that hopefully all market participants can get behind.

Friday, February 29, 2008

Taking CEP to The Next Step

Posted by Chris Martins

Standards are an area of growing interest in the complex event processing market. What will be the role of standards? In what aspects of CEP will they emerge? When?

With the premise that “in the future, it will be necessary for heterogeneous CEP products to interoperate across organizational boundaries”, the Object Management Group is hosting a Technical Meeting in Washington DC to address some of the issues. As part of the agenda, Apama colleague and blog contributor Matt Rothera will be speaking on How CEP Takes the Next Step. So if you are going to be in the DC area on March 13 and are interested in this topic, you should definitely check it out.

Friday, September 28, 2007

Unus per utilitas ut fulsi perspicuus ero laurifer

Posted by John Trigg

The Gartner CEP show in Orlando last week demonstrated a number of interesting things about CEP platforms:

1. The story of core CEP is common across all vendors

2. Aside from the nature of how CEP is expressed within a platform, true differentiation starts to emerge in the tooling and the user constituencies that are using CEP

3. The future state described by Dr Luckham is one in which CEP logic becomes available inside reusable libraries/repositories that can be folded into applications.


The premise of Apama has always been that you must embrace the full spectrum of users, encompassing IT, business analysts, end users and senior management, to gain true productivity and application acceleration with CEP development.  Perhaps that is true for any application development platform.  But this has been a core tenet of the Apama platform since day one, when we had the idea of a platform that incorporates both a core event language aimed at programmers and a metaphor for expressing and implementing event logic for the non-programmer who owns the core IP of the process.  Being able to express this information graphically, thus making it accessible and understandable, and then take action, completes the user stack we aim for with our Apama platform.


We have blogged about this before here and here, but what got my interest this time is the pitching of pure CEP programming approaches that are supposedly open to the core business user.  To construct a complete CEP application within a comprehensive platform, you should be able to exercise the skills and knowledge of different users in a collaborative environment.  Iterative and componentized development of interfaces, business logic, presentation and action comes from the minds of many, and in CEP all four elements are core to rapid, real-time execution and adaptiveness.


The point that Dr Luckham makes about the accessibility and reuse of CEP components in CEP application construction as a future state is one which can be realized now.  The use of Smartblocks within Apama allows for the encapsulation of reusable logic that can subsequently be incorporated in other CEP processing.  For instance, logic to represent common trading algorithms, known air traffic congestion patterns, network intrusion patterns, supply chain metrics or internally developed analytics can all be expressed as a CEP pattern.  Creating any of these as an Apama Smartblock, and organizing them into meaningful catalogs for analysts to interrogate and select from, speeds application development and eliminates replication of core CEP logic within a larger implementation.


The mainstream adoption of CEP won't be based solely on it being cool technology (it is, but then again so was this). Being able to accelerate and redefine application construction in a real-time world will further its adoption.  And maybe so will a solid and well-thought-out set of standards.  But that is for another time …

Tuesday, September 25, 2007

Thank You Gartner - Event Processing Conference #1 In the Books

Posted by John Trigg

Between September 19th and 21st, Gartner held its first conference on Complex Event Processing and Business Activity Monitoring in Orlando, Florida.  Some 200 people (by my estimation) came together to meet others interested in these technologies, and to see and hear presentations from a range of Gartner analysts, CEP vendors, educators, thought leaders and, most importantly, users of CEP.  The conference was bookended by impressive presentations from Roy Schulte and Bill Gassman on day one, setting out the current state of the CEP and BAM market, and by Dr David Luckham, who closed the conference with a thoughtful and insightful look at the future of CEP.


We’ll post entries about different aspects of the conference over the coming days and weeks.  But for now it is important to stress how vital the timing of this conference was, and how its attendees have shown that the principles of CEP are beginning to take hold in a wide array of industries and solutions.  Between the three conference tracks organized by Gartner (Event Processing, Business Activity Monitoring, and Event Processing in Banking and Capital Markets) and the vendor-sponsored sessions, we heard descriptions of applications of CEP in a variety of scenarios, ranging from algo trading to clickstream analysis to content distribution to manufacturing and many more.


Architectural presentations were also prevalent, with many sound ideas put forward on the relationships among the ever-evolving alphabet soup of CEP, BAM, SOA, EDA, BPM, OLTP, ESB and, I am sure, many others.  Bringing together an audience such as this to discuss both practical implementations and more theoretical research allows insight to flow around the CEP community, and helps us understand the ramifications when CEP is seen as more than just event feeds and event processing speeds.  For true application infrastructures to be built on the principles and technologies of CEP, a wide understanding of how we can evolve the relationships between these disciplines will be key.  And that understanding will come from the continued holding of conferences such as this one (we're already looking forward to next year in New York) and the interplay between the many disciplines, vendors and consumers of these technologies.


Dr Luckham posited that CEP will become a ubiquitous infrastructure technology in some 30 years.  For that to be true, indeed for it to happen sooner, we all have a lot of work to put in … but you can be sure it will be worth it.