EDA and SOA

Thursday, July 17, 2008

Rendering Unto Caesar - The Role of the Business User in CEP

Posted by Chris Martins

"Render unto Caesar the things which are Caesar's
and unto God the things that are God's"

A recent posting in the Enterprise Decision Management Blog entitled "Can we trust business users" addresses a topic that seems equally pertinent to the CEP market. I think there is a tendency to become so enamored with the technical virtuosity of new technology that we may lose sight of who the real users are. In terms of the development of CEP applications, an understanding of the prospective roles of business users vs. those of IT developers seems to be an evolving thing. Apama has long been a proponent of the participation of the business user in the process of building CEP-driven applications. In Capital Markets, the notion of "empowering the trader" has been a key element of our strategy and the Apama product offers a graphical development tool, Event Modeler, that focuses on that constituency. We also offer a RAD tool intended for developers, who can create applications in our event processing language.

We are beginning to see third party validation of the value of this approach from outside of Capital Markets, as well. For example, a report from Forrester Research published earlier this year indicated that early adopters of CEP have tended to come from the line of business rather than IT because "developers and architects often know painfully little about these [non-traditional, non-IT] events and how they are used to run a business." That Forrester quote is certainly not intended to diss IT. It just recognizes that there are lots of different kinds of events that are important to a business and not all those events are traditional IT-aware or IT-generated events. In order to make sense of and respond to such events, it seems quite logical that providing tools that are amenable to a more "business" oriented audience is important.

But I would argue that it is not just the nature of events - and their varied sources - that suggests a strong correlation between CEP and business users. It is also the nature of the CEP-driven applications themselves. CEP applications are not "one and done"; they tend to be iterative and evolving, because they are crafted to respond to what is happening, and what is happening is often a changing set of conditions. Or, if the conditions are not changing, how you choose to respond to them may be changing. So you need to continually calibrate and revise.

In another Forrester report published earlier this year, this characteristic was noted within the context of a review of Business Activity Monitoring best practices. Event-driven BAM is a particularly strong use case for CEP and the report stated that BAM efforts "are typically never-ending projects" with a "high degree of change." That makes sense since BAM monitors 'business activities' and the nature of most businesses will change over time. So to support BAM applications, it seems perfectly logical to provide tools for business users who can take on some role in the initial development and ongoing operational calibration of these applications. There is clearly an important role for developers in building these applications – no one would suggest otherwise – but we had best not forget what the “B” in BAM refers to.

What seems to be emerging is the notion that we should not look at CEP and/or BAM deployments as discrete, finite projects with clearly prescribed end dates. They are continuously iterative projects that must evolve to remain effective. That’s the environment in which they operate. Given that, perhaps we should not see the roles of business users and IT as fitting within well prescribed boundaries. The development and ongoing management of these applications will have evolving roles for the line of business user and for IT over time. We might expect IT-centric development to have a more dominant role in the initial deployment, but over time the goal might be to have the line-of-business assume greater and greater roles - because the business will be the dominant user and best positioned to react to changing circumstances.

Perhaps the EDM blog posting says it best, though it expresses it within a "business rules" context. “Too many rules specialists focus on rules that will get executed and not enough on the people that will maintain those rules, although this is where the success of the project resides.” That is quite the same for CEP and BAM. There is a role for business users, driven by the nature of events and the continuously evolving nature of the applications that are “event-driven.” And it is incumbent on the technology provider to offer tools that will facilitate that evolution. All the CEP performance in the world will be of little use, unless that performance is well-aligned with the needs of the business.

So we might debate who is the metaphorical Caesar and who is God in CEP development, but the success may well rest on giving each their due.

Monday, June 16, 2008

SIFMA Retrospective

Posted by John Bates

As a follow-up to my colleague Louis' report on last week's SIFMA show, I thought I'd add some thoughts of my own. My conclusion is that it was the most exciting SIFMA show I have experienced. While I think attendance was down from previous years, I also think the quality of attendees was up. And for me personally, the excitement of being involved in some industry-moving announcements as well as meeting up with many of my colleagues from capital markets firms, vendors, press and analysts was highly invigorating.

So what were the highlights and take-aways for me?

1. CEP is clearly a theme that is getting a lot of mindshare. So many people said that CEP was a key theme of the show – which is great to hear after many years of working to help define the market. It’s also great to see this add to the momentum so soon after the Event Processing Technical Society was launched. The use cases of CEP are many and varied – and there was a lot of interest and questions around this at SIFMA. On our SIFMA booth we demonstrated five different CEP use cases on five different pods: algorithmic trading, smart order routing, managing FX market fragmentation, market surveillance and real-time bond pricing. Also, CEP applications continue to place new demands on the technology, and we were thrilled to demonstrate Apama 4.0 – which extends the performance and user experience of CEP to new levels. Another supporting factor in the maturing of CEP is that there are starting to be very senior people in Capital Markets firms focusing on CEP as an enabling technology. Marc Adler from Citigroup is a key example. He’s active in the community and on the STAC CEP committee, helping to define benchmarks. It was great to meet Marc at SIFMA and also to catch up with many other esteemed colleagues from the CEP space.

2. The liquidity wars are hotting up. It was our pleasure to be involved in a press release with NYSE-Euronext which was certainly one of the big releases of the conference. Progress Apama will be hosted within NYSE Euronext as part of the exchange's Advanced Trading Solutions offering. Traders will be able to download custom logic for algorithmic trading, risk management and smart order routing into the NYSE itself - with low latency connectivity to other trading venues via Wombat and Transact Tools. This arrangement turns NYSE into a technology provider as well as a one-stop-shop liquidity provider. This announcement was picked up by major press, including the Financial Times - in Europe, America and Asia -- see the article here.

3. Hardware is important – and so is “green”. The increase in capital markets data volumes requires completely new software architectures – like CEP. But software is not always enough to support the low latency transport, processing and storage requirements. Many firms are turning to specialized hardware, combined with software – to create high performance solutions. Vhayu, for example, launched Squeezer – which combines hardware and software to supercharge their tick data offering. Also, Progress Apama were pleased to put out a joint announcement with Sun on a collaboration for end-to-end CEP solutions – combining Sun hardware and operating systems with Apama’s CEP platform and solutions. We demonstrated an end-to-end bond pricing application using the whole stack. Sun is one of the vendors who have a “green” aspect to their hardware – for example on a major CEP deployment, the hardware can be scoped for peak throughput – but can selectively shut down capacity to save power when event throughput is reduced. In this era of high energy costs and global warming there seems to be a lot of interest in this approach.

4. I love partying on the trading floor. Progress Apama were honored to be invited to a party at the NYSE to celebrate the latest developments at NYSE-Euronext (see picture at the top). It was a great pleasure to speak with our friends at NYSE-Euronext and to meet many of our old friends from the capital markets industry there – while sipping some delicious wine in that amazing place. In a way it is a shame that electronic trading is making the traditional trading floor a thing of the past – but there is something amazing about that place and I hope it stays just the way it is – even if it becomes a cool venue for other purposes. Thanks to NYSE-Euronext for inviting us – we had a great time.

I’m sure I’ve neglected a load of other trends and themes – but there’s my brain dump for the day. I’m interested to hear if you all agree.

John

Tuesday, April 22, 2008

Asia Report: Fighting White Collar Crime

Posted by John Bates

Hello from Hong Kong. As always it is fascinating to see how CEP is evolving in Asia. One trend I am observing is the huge interest in Hong Kong in rogue traders and white collar crime – and how CEP can be used to detect and prevent this – before it moves the market. Obviously the original rogue trader, Nick Leeson, is well known here. But there has been a great deal of interest in more recent goings-on, at firms such as SocGen. Amazingly, until a couple of years ago, insider trading was not illegal in Hong Kong! Now we have a highly volatile market, with a lot of uncertainty, huge event volumes and a real problem of seeking out and preventing rogue trading activities, as well as managing risk exposure proactively.

Of course CEP provides a compelling approach. In market surveillance, the ability to monitor, analyze and act on complex patterns that indicate potential market abuse or dangerous risk exposure allows a regulator, trading venue or bank to act instantly. Banks want the reassurance that they are policing their own systems. Regulators need to protect the public. The media and public here find this fascinating.

On the topic of a different kind of white collar crime – consider using CEP to detect abuse in the gaming industry. The gambling phenomenon that has propelled Macau to overtake Las Vegas as the world’s biggest gambling hub is also an exciting opportunity for CEP. We have customers using CEP to monitor and detect various forms of potential abuse in casinos. Events that are analyzed to find these patterns include gamblers and dealers signing on at tables, wins and losses, cards being dealt etc. It is possible to detect a range of potential illegal activities, ranging from dealer-gambler collusion to card counting.
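
To make that concrete, here is a minimal sketch in plain Python (hypothetical event names and thresholds, not Apama EPL or any production surveillance logic) of one such pattern: flag a dealer-gambler pair for review when the gambler's win rate at that dealer's table looks abnormally high over a reasonable number of hands.

```python
from collections import defaultdict

class CollusionMonitor:
    def __init__(self, min_hands=50, win_rate_threshold=0.65):
        self.stats = defaultdict(lambda: [0, 0])   # (dealer, gambler) -> [wins, hands]
        self.min_hands = min_hands
        self.win_rate_threshold = win_rate_threshold
        self.current_dealer = {}                   # table -> dealer currently signed on

    def on_dealer_sign_on(self, table, dealer):
        self.current_dealer[table] = dealer

    def on_hand_result(self, table, gambler, won):
        dealer = self.current_dealer.get(table)
        if dealer is None:
            return None
        wins, hands = self.stats[(dealer, gambler)]
        wins, hands = wins + int(won), hands + 1
        self.stats[(dealer, gambler)] = [wins, hands]
        # Flag the pair once enough hands have been seen and the win rate looks unusual.
        if hands >= self.min_hands and wins / hands >= self.win_rate_threshold:
            return ("review", dealer, gambler, wins / hands)
        return None

monitor = CollusionMonitor(min_hands=3, win_rate_threshold=0.6)
monitor.on_dealer_sign_on("table-7", "dealer-42")
for won in (True, True, True):
    alert = monitor.on_hand_result("table-7", "gambler-99", won)
print(alert)   # ('review', 'dealer-42', 'gambler-99', 1.0)
```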

As a final thought - having met with some of our customers that operate both in Hong Kong and mainland China, it is clear that China is a massive market opportunity for CEP. Exciting times ahead for CEP in Asia.

Saturday, April 19, 2008

CEP down under

Posted by John Bates

I’m sitting here at Melbourne Airport in Australia on my way to Hong Kong. I’ve been delayed by a typhoon – probably a good reason to delay. After a very successful week in Sydney and Melbourne visiting customers, I thought I’d report that the CEP market is hotting up down under! As you would expect financial services is an early adopter and Apama has had several customers in Australia in this space for a few years now. But the demand is increasing. This is driven by factors such as increasing competitive pressures in the trading space and the impending fragmentation of the Australian market. Just like in Europe and North America, it is likely that several new trading venues will join the Australian Stock Exchange in offering liquidity in Australia. My diagram shows some of these in the form of Chi-X, AXE and Liquidnet.

Complex Event Processing offers a powerful way of monitoring, aggregating and analyzing the liquidity across all of these markets, as well as making real-time routing decisions. This of course can work in parallel with traders and algorithms. In fact it is becoming very interesting to see trading decision algorithms routing messages to execution algorithms, routing messages to liquidity tracking algorithms, routing trades to the market, which are being checked by market surveillance algorithms -- and all of it implemented in CEP. I am biased of course, but what other technology can offer the seamless federation of such systems? Events provide a powerful and low latency mechanism for such interoperation. Each component can be built independently of the others - and yet they can work together seamlessly. But I am getting off topic!
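
To illustrate the aggregation-and-routing idea, here is a minimal sketch in plain Python (hypothetical venue names and quote shapes, not Apama code): each incoming quote event updates a consolidated view across venues, and an order is routed to whichever venue currently shows the best price.

```python
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str        # e.g. "ASX", "Chi-X", "AXE", "Liquidnet"
    symbol: str
    bid: float
    ask: float

class LiquidityAggregator:
    def __init__(self):
        self.book = {}                          # (symbol, venue) -> latest Quote

    def on_quote(self, quote: Quote):
        # Each quote event updates the consolidated, cross-venue view.
        self.book[(quote.symbol, quote.venue)] = quote

    def best_venue(self, symbol, side):
        quotes = [q for (sym, _), q in self.book.items() if sym == symbol]
        if not quotes:
            return None
        # Buy orders chase the lowest offer; sell orders chase the highest bid.
        best = min(quotes, key=lambda q: q.ask) if side == "buy" else max(quotes, key=lambda q: q.bid)
        return best.venue

agg = LiquidityAggregator()
agg.on_quote(Quote("ASX", "BHP", 44.10, 44.14))
agg.on_quote(Quote("Chi-X", "BHP", 44.11, 44.13))
print(agg.best_venue("BHP", "buy"))             # "Chi-X", currently the tightest offer
```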

Over the last few years Australia has mainly been interested in equities algorithms, but now the interest in FX, futures, bonds and commodities is growing. While I was in Sydney, I was pleased to deliver the keynote address at the Trading Technology conference and met many interesting sellside and buyside participants with a variety of trading interests. It was fascinating to see how the market is developing.

And it is not just financial services where CEP is being applied down under. I also met with organizations in a number of other spaces including travel, transportation and location-based services. I hope to report more on these in the near future.

And now I look forward to finding out what is happening in Hong Kong and Asia beyond. Hopefully I can avoid the typhoon!

John

[Diagram: the impending fragmentation of the Australian market – Chi-X, AXE and Liquidnet alongside the ASX]

Monday, January 07, 2008

Apama and Sonic Win Technology Innovation Awards

Posted by Chris Martins

Progress products have won two "Leaders in Innovation" technology awards in wholesale transaction banking from Financial-i magazine. As judged by a panel of industry experts, Apama won the award for CEP products and our fellow Progress product, Sonic, won for Enterprise Service Bus. Financial-i notes that the awards are for technology leadership demonstrated over the last 12 months, which the magazine judges to be "an ongoing commitment to innovation....built on previous innovations to stay ahead of their competitors."

Wednesday, July 04, 2007

Why use SQL?

Posted by Giles Nelson

SQL is certainly one of the successes of the computing industry. It all started with Codd's much-cited and seminal 1970 paper, which first described the relational model. Over the next few years, and after efforts by both IBM and Relational Software (which later became Oracle Corp), SQL was launched into the commercial domain in the late 1970s. Standardisation followed in the mid-1980s, and support for more modern trends such as XML was added more recently. Database management systems have now matured into highly sophisticated environments for the storage, manipulation and retrieval of enterprise information. SQL is the standard language of the database world. Attempts to move this on and break with the Codd paradigm, such as the move towards object oriented databases in the 80s and 90s, have, apart from in niche areas, failed.

We now see a trend by a number of event processing vendors to represent SQL as the language of choice for building CEP applications. For example, Streambase, Coral8 and now BEA. Why is this? Well, Streambase is simple to explain. Michael Stonebraker is one of the key forces behind Streambase and his background in the database industry is second to none. He was involved in Ingres in the 1970s and also behind some of the work integrating object-oriented principles with relational databases with Illustra in the 1990s. Databases, and therefore SQL, are part of Streambase’s DNA. In comparison, BEA’s use of SQL is harder to understand. BEA’s business is built (still) upon their application server technology and they are strongly going to market with an enterprise integration offering – Aqualogic. Databases haven’t really formed part of BEA’s background. The use of XQuery would have been more obvious.

Perhaps these vendors have concluded rightly that SQL is actually the right way of doing things in an event processing world. I don’t believe it is. It’s certainly a way, but it’s not the best. I believe it can confuse people as to what event processing is all about and can serve to inhibit adoption. SQL is certainly well understood, but by providing an SQL interface to event processing products, practitioners assume that an SQL way of thinking will be appropriate. It isn’t. By thinking of event processing as actually a real-time database you get stuck in a database-centric design pattern.

When John Bates and I were doing some of the academic research which resulted in the formation of Apama in 1999, we were looking at how to effectively support applications powered by high-volume streams of data – stock tick analysis, location-aware telco applications and others. We looked at using database technologies to support this, but had to rip the work up. Not only did these technologies not perform, but we realised we were force-fitting one paradigm into another. Taking a clean sheet of paper, we came up with the beginnings of a much more elegant, performant architecture. It was data-driven, not query-driven. The use of SQL as a language interface to this just didn’t seem appropriate.
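
As a rough illustration of that distinction, here is a sketch in plain Python (not Apama EPL or any vendor's streaming SQL; the "price falls more than 5% within 60 seconds" rule and the tick shape are invented for the example) contrasting a query-driven design, which stores everything and must be polled, with a data-driven design, where each arriving event drives the evaluation incrementally.

```python
from collections import deque

class QueryDriven:
    """Database-style: store every tick, re-run the query whenever someone polls."""
    def __init__(self):
        self.ticks = []                               # grows without bound, like a table

    def insert(self, symbol, price, ts):
        self.ticks.append((symbol, price, ts))

    def run_query(self, now):
        # Scans the stored data each time; the pattern is only noticed when polled.
        recent = [t for t in self.ticks if now - t[2] <= 60]
        alerts = []
        for sym in {t[0] for t in recent}:
            prices = [p for s, p, _ in recent if s == sym]
            if prices and min(prices) < 0.95 * max(prices):
                alerts.append(sym)
        return alerts

class DataDriven:
    """Event-style: each arriving tick drives the evaluation incrementally."""
    def __init__(self, on_alert):
        self.windows = {}                             # symbol -> deque of (price, ts)
        self.on_alert = on_alert

    def on_tick(self, symbol, price, ts):
        window = self.windows.setdefault(symbol, deque())
        window.append((price, ts))
        while window and ts - window[0][1] > 60:
            window.popleft()                          # retain only the last 60 seconds
        high = max(p for p, _ in window)
        if price < 0.95 * high:
            self.on_alert(symbol, price)              # the reaction happens as data arrives

engine = DataDriven(lambda sym, price: print("alert:", sym, price))
for ts, price in enumerate([100.0, 99.5, 94.0]):
    engine.on_tick("ACME", price, float(ts))          # alerts on the third tick
```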

It seems that others agree. I was at a conference at CITT in Germany recently where CEP formed a major topic of discussion. In particular we talked about some of the challenges of using SQL to build event processing applications underpinned by practical implementations of use cases using a variety of event processing technologies. What became apparent was that the SQL approaches appeared to hinder, not help, the developer. The baggage that SQL brought made it difficult for people to get their heads around the thinking required to implement event processing use cases. The resulting SQL was clunky and difficult to follow.

So, am I going to conclude by saying that SQL should be shunned? Well no, I’m not. As a vendor, Progress Software is all too well aware that its products exist only to help organisations solve their business problems. Partly this is allowing problems to be solved that could not be solved previously. Partly, this is also to enable productive development. Giving a choice of a development environment with which many technologists are familiar is important and SQL can provide some of this familiarity. We therefore are observing and listening closely to the opinion of the wider market and to our prospects and customers. SQL may be one of the ways in which organisations should be able to interact with an event processing system.

However I maintain that it is certainly not the best choice, nor should it be the only one.

Monday, May 14, 2007

In Piam Memoriam Fundatoris Nostri

Posted by John Bates

There have been a number of exchanges recently on the cep-interest group and on this blog on the topic of “the origins of event processing and CEP." As someone who has been involved in event processing research and products for 17 years I’ve been asked to add a perspective here. Wow this makes me feel old.


Although I started researching composite/complex event processing as part of my PhD at Cambridge in 1990, I certainly wasn’t the first. So I can’t claim to have invented CEP. As Opher Etzion correctly observed in an email to cep-interest, my experience was also that this discipline originated from the “active database” community. There was much work done prior to 1990 which added events and event composition to databases. The term “ECA rules” – or Event-Condition-Action rules – was a popular way of describing the complex/composite event processing logic.


When I was experimenting with multimedia and sensor technologies in the early 90s – and trying to figure out how to build applications around distributed asynchronous occurrences (such as tagged people changing location) – I realized that building “event-driven” applications in a distributed context was a new and challenging problem from a number of angles. I looked for prior work in the area. Although I didn’t find any work addressing this specific problem, I was able to look to the active database community. In particular, a paper by Gehani et al. on “composite event expressions” (as recently mentioned by Opher) looked ideal for the applications I had in mind. This paper outlined an algebra for composing events and a model to implement the subsequent state machines. I implemented the Gehani model as part of my early work. While it was a great concept, it had a number of shortcomings:


  • Although it claimed to be able to compose any complex event sequence, it was incredibly difficult to compose even a simple scenario.
  • It didn’t consider the issues of distributed event sources, such as network delays, out-of-order events etc.
  • It didn’t consider the key issue of performance – how could you process loads of events against a large number of active composite event expressions?

Active databases had only considered events within the database. And databases had fundamental problems of store-index-query – which is not ideally suited to such fast-moving updates. In order to make composite events applicable as a way of building applications, the above shortcomings had to be addressed.
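
To give a flavour of what a composite event expression compiles down to, here is a minimal sketch in plain Python (hypothetical event names; this is not the Gehani algebra itself) of "A followed by B within a time window" implemented as a small state machine.

```python
class FollowedBy:
    """Detects: an event of `first_type`, then one of `second_type`, within `window` seconds."""
    def __init__(self, first_type, second_type, window, on_match):
        self.first_type = first_type
        self.second_type = second_type
        self.window = window
        self.on_match = on_match
        self.pending = []                  # timestamps of A events still awaiting a B

    def on_event(self, event_type, timestamp):
        # Expire A occurrences that have fallen outside the time window.
        self.pending = [t for t in self.pending if timestamp - t <= self.window]
        if event_type == self.first_type:
            self.pending.append(timestamp)                  # state: "seen A, waiting for B"
        elif event_type == self.second_type and self.pending:
            self.on_match(self.pending.pop(0), timestamp)   # composite event detected

# Usage: fire when an order is placed and then cancelled within 10 seconds.
detector = FollowedBy("OrderPlaced", "OrderCancelled", 10,
                      lambda t1, t2: print(f"composite event: A@{t1} -> B@{t2}"))
for etype, ts in [("OrderPlaced", 0), ("OrderCancelled", 4)]:
    detector.on_event(etype, ts)
```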


Composite event expressions were only one aspect of my work initially, but as the volumes of real-time data continued to grow and new sources of data continued to emerge, it became clear that the work in distributed composite/complex event processing had legs. Also, it seemed to excite many people.


There were of course the cynics. Many of my Cambridge colleagues thought that events were already well understood in hard and soft real-time systems and in operating systems – and that’s where they belonged. It is true that event processing has been part of systems for several decades. Traditional systems handle events in the operating system. However, never before had events been exposed at the user level as ad hoc occurrences, requiring specific responses. There was a new requirement for applications that could “see” and respond to events.


Some closed “event-based systems”, such as GUI toolkits like X-windows, allowed users to handle events in “callback routines”. For example, when someone clicks on a window, a piece of user code could be called. This approach tried to make the most of traditional imperative languages, and make them somewhat event-based. But this paradigm is tough to program and debug – and events are implicit rather than explicit. Also, the program has to enter an “event loop” in order to handle these events – to make up for the fact that the programming paradigm wasn’t designed to handle events explicitly.
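
As a toy illustration of that style (plain Python, not real X-windows code): user logic is scattered across callbacks, the events themselves stay implicit inside the toolkit, and nothing happens until the program surrenders control to the event loop.

```python
import queue

callbacks = {}                      # event name -> user callback
events = queue.Queue()              # filled by the "toolkit" (here, by hand)

def register(event_name, fn):
    callbacks[event_name] = fn

def event_loop():
    # The program gives up control here; events are handled one at a time,
    # in whatever order the toolkit delivers them, and only inside callbacks.
    while not events.empty():
        name, payload = events.get()
        if name in callbacks:
            callbacks[name](payload)

register("click", lambda pos: print("window clicked at", pos))
events.put(("click", (10, 20)))
event_loop()
```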


So we began to realize that events had to be explicit “first class citizens” in development paradigms. Specifically, we saw that a new set of capabilities would be required (a minimal sketch of such an engine follows the list):


  • An event engine – a service specifically designed to look for and respond to complex event patterns. This engine must be able to receive events from distributed sources and handle distributed systems issues.
  • An event algebra – a way of expressing event expressions, involving composing events using temporal, logical and spatial logic, and associated actions. These might be accessible through a custom language or maybe even through extensions of existing languages.
  • Event storage – a service specifically designed to capture, preserve in temporal order and analyze historic event sequences.
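
Here is a minimal sketch of how those three capabilities fit together (plain Python with invented names, nothing like a production engine): an engine that keeps a time-ordered store and dispatches each incoming event to the registered composite-event expressions. A detector such as the FollowedBy sketch earlier could be registered against it.

```python
import bisect
import itertools

class EventEngine:
    def __init__(self):
        self.expressions = []            # the event algebra side: objects with on_event(type, ts)
        self.store = []                  # event storage: kept in temporal order
        self._seq = itertools.count()    # tie-breaker so equal timestamps sort stably

    def register(self, expression):
        self.expressions.append(expression)

    def on_event(self, event_type, timestamp, payload=None):
        # Preserve temporal order even if distributed sources deliver slightly out of order.
        bisect.insort(self.store, (timestamp, next(self._seq), event_type, payload))
        for expression in self.expressions:
            expression.on_event(event_type, timestamp)

    def replay(self, since=0.0):
        # Historic analysis over the stored, time-ordered event sequence.
        return [(ts, etype, payload) for ts, _, etype, payload in self.store if ts >= since]
```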

A number of my colleagues in Apama worked on these and other areas. As far as a research community went, we published mostly in distributed systems journals and conferences, such as SigOps. We worked closely with other institutions interested in events, such as Trinity College Dublin.


In 1998 I, along with a colleague, Giles Nelson, decided to start Apama. I later found out that concurrent with this, and coming from different communities, other academics had also founded companies – Mani Chandy with iSpheres and David Luckham with ePatterns. These companies had different experiences – David told me ePatterns unfortunately overspent and became a casualty of the 2000 Internet bubble bursting. David of course went on to write a very successful book on event processing. iSpheres went on to do brilliantly in energy trading but was hurt by the Enron meltdown and struggled to compete with Apama in capital markets. Apama focused primarily on capital markets, with some focus on telco and defence, and went on to be very successful, being acquired by Progress in 2005. Interestingly, long after these pioneering companies were started, several new startups appeared – all claiming to have invented CEP!


So that’s my potted history of CEP. I don’t think any of us can claim to have invented it. I think some of us can claim to have been founding pioneers in taking it into the distributed world. Some others of us can claim to have started pioneering companies. All of us in this community are still at an early stage – and it is going to get even more fun.


There’s one bit I haven’t talked about yet – and that’s terminology. Most researchers originally called this area “composite event processing”. The term “complex event processing” now seems to be popular – due to David’s book. There are some arguments about the differences between “complex/composite event processing” and “event stream processing”. From my perspective, when Apama was acquired by Progress, Mark Palmer invented the term “event stream processing” to avoid using the word “complex” – which Roy Schulte from Gartner thought would put off customers looking for ease-of-use. However, then it seemed that the industry decided that event stream processing and complex event processing were different – the former being about handling “simple occurrences” in “ordered event streams” and the latter being about handling “complex occurrences” in “unordered event streams”. Now in my opinion, any system that can look for composite patterns in events from potentially distributed sources is doing complex/composite event processing. Yes, there may be issues to do with reordering, but there may not be. It depends on the event sources and the application.


It's often tricky to work out "who was first" in academic developments.  But it's good to know we have some excellent pioneers active within our CEP community who all deserve a lot of respect.

Sunday, April 29, 2007

Can Events Yield Eternal Life?

Posted by John Bates

The quest for eternal life has fascinated people through the ages. Theorizing on the topic has usually considered the physical body and how either it can be continuously repaired, the brain transplanted into another body or the body preserved cryogenically until medical science can repair it. Of course this discounts the many religious theories about “eternal life” after death – but I don’t intend to get into that one here!


However, consider a scenario in which, rather than preserving the actual physical person, you could preserve a multi-dimensional digitized record of that person. So detailed a record, in fact, that it could be used to “reconstruct” the person. What do I mean?


Well, the closest we have come to this in recent history is a combination of physical evidence, still and moving images and historical writings. Consider Lenin: his body was preserved in Red Square in Moscow, there are numerous writings about his behavior at certain points in his life, and there is even some early video. Probably not enough to reconstruct Lenin – but enough to understand something about his behavior and motivations.


So how do events fit into this scenario? Well, start by considering an event as the “digitization of a real-world occurrence”. For example, a portable sensor combining GPS and wireless communication can digitally capture the changing location of an object and communicate it as events describing the X, Y and Z coordinates of that object. Other finer-grained technologies could track movements inside buildings. Overlaid on the coordinate system can be geospatial databases that interpret where the coordinates actually are – such as “Mark’s living room”. Apply this to a person on a continuing basis and you have captured one dimension of their life – where they are. All you have to do is record the events in time-order to have a historic view of their movements. Other forms of digitized recordings can include digitally capturing what the person is typing on a computer, whatever a person says, whatever a person hears, where a person’s eyes are looking, what the weather conditions are etc. Each sequence can be captured as events and recorded in time-order, for example, at a particular point in time, Mark heard John say “event processing”.


Where things really get fun, though, is using the power of event processing on top of this. All sorts of interesting information can be discerned by correlating simple events recorded about a person. For example, event rules can determine that “at 9am, on a sunny Thursday April 19th 2007, Mark and John discussed event processing in Mark’s office” -- because it knew that both Mark and John were together in a room, the room was Mark’s office, that they were meeting between 9am and 10am and the topic of the conversation was “event processing”.
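
As an illustration of that kind of correlation, here is a small sketch in plain Python (the event shapes and the one-hour window are invented for the example) that derives a higher-level "discussion" event from simple location and topic events recorded in time order.

```python
from dataclasses import dataclass

@dataclass
class Location:          # "person P was in room R at time T"
    person: str
    room: str
    time: float

@dataclass
class Topic:             # "topic X was mentioned in room R at time T"
    room: str
    topic: str
    time: float

def discussions(locations, topics, window=3600):
    """Yield (people, room, topic, time) where at least two people shared a room
    around the time a topic was mentioned there (within `window` seconds)."""
    for t in topics:
        present = {l.person for l in locations
                   if l.room == t.room and abs(l.time - t.time) <= window}
        if len(present) >= 2:
            yield sorted(present), t.room, t.topic, t.time

locs = [Location("Mark", "Mark's office", 900.0), Location("John", "Mark's office", 950.0)]
tops = [Topic("Mark's office", "event processing", 1000.0)]
for people, room, topic, when in discussions(locs, tops):
    print(f"{' and '.join(people)} discussed {topic} in {room} at t={when}")
```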


Some early projects (such as one that I ran with Mark Spiteri at Cambridge and another that my friends Mick Laming and William Newman ran at Xerox Research) tried to capitalize on this fact -- that by recording activities, complex things could be automatically “remembered”. Imagine, for example, that John couldn’t remember a key piece of information he wanted to use in a paper; all he could remember was that he discussed it with Mark in an early morning meeting on a sunny day within the last 3 months. Using complex event capture and query techniques, it is possible to retrieve event sequences for every early-morning meeting in the last 3 months between John and Mark when the weather was sunny (that narrows it down a bit in New England :-) ). John could then observe the event sequences for those meetings – or narrow the search down further. In the end he was able to find the relevant information by correlating the relationships between a set of multi-dimensional events.


So, I know this isn’t quite eternal life! But it’s a start. Recording events from every possible angle and then being able to correlate them is a much richer way of recording a person than two-dimensional video. It enables all sorts of previously unanticipated scenarios and thought processes to be reconstructed about a person. But more importantly, that person’s interactions with the rest of the world – in particular other people – can be determined. Unlike in early experiments, sensor technologies are becoming ubiquitous and non-intrusive (no more wearing a range of Robo-cop-style equipment :-) ). Of course this kind of capture opens up all sorts of privacy issues – but let’s park those for now.


And of course we haven’t even considered the real-time aspects of this technology. We used to have great fun with event-based rules, such as “When it’s coffee time and Mark and Scott are together, then play this video message on the nearest terminal to them”.


But back to eternal life for a moment…. If you capture events about an individual from a rich enough number of dimensions, have you captured that individual’s soul? Could you recreate that individual by modeling their response to events? This is very much the “black box” approach. In other words, rather than actually understanding how a system works, we model it from its inputs and outputs. I typed the title of this article very much “tongue in cheek” – and I’m skeptical about whether we could ever model anything as complex as a human. However, at a very minimum, we can use event capture, replay and correlation to reconstruct a historical view of an individual from any “angle”. Your legacy could be preserved, even if your body couldn’t. However, history often lends a dusting of romanticism to the imperfect individual – this is something that event processing can’t do. It just gives you the facts ma’am.

Sunday, April 01, 2007

Sentient CEP and the Rights of Algorithms

Posted by John Bates

I have just returned from the UK where, as part of my duties for the week, I spoke at a conference on Fixed Income and another on Foreign Exchange. At both these events, the delegates were interested to hear of the latest trends in the industry – which include aggregating multiple market data streams from disparate trading venues into a single view and using rules-based trading techniques to rapidly detect complex patterns and make and execute trading decisions. In these areas and beyond, Complex Event Processing is being successfully used by Investment Banks and Hedge Funds to enable such complex and low latency requirements.


While I was in the UK, one of the items in the news was the marking of the 200th anniversary of the abolition of the slave trade by Britain with a service in Westminster Abbey, attended by the Queen. I hope none of my comments seem to in any way belittle this significant incident, but it did rekindle some thoughts about “will algorithms ever have rights?”. We look back on the past and find it inconceivable that any class of people could be considered as less human than others. Society has adopted that principle, quite rightly, as part of our programming. However, we’re quite happy to turn a blind eye to the suffering of animals in unnecessary cosmetic testing and in horrific factory farm conditions. In the same way that most people in the 18th Century were quite happy to receive cheap cotton and sugar, now we are quite happy to receive cheap cosmetics and food. History suggests, however, that this will change eventually.


So what of the rights of the algorithm? Now you may think this is totally mad – and you’d probably be right. However, consider for a moment the field of algorithmic trading. While speaking at both of the conferences this week, I illustrated the changing role of the trader. Rather than the trader watching the markets for key trading opportunities and then manually executing, he/she now can deploy an army of algorithms to look for and respond to the key opportunities. The algorithms are the trader’s slaves. As Apama customer Lou Morgan of HG Trading put it recently: “... they don’t need a lunch-break and they don’t need a bonus!” Of course these algorithms are not sentient, and therefore they don’t have any rights – but what if they were!?

Together with my colleague and mentor at Cambridge - Professor Andy Hopper, I ran a research group that looked into what we termed “Sentient Computing”. This is a very grandiose title for what Gartner might describe in commercial terms as the “Enterprise Nervous System”. It was all about how complex asynchronous stimuli in a wide-area computing environment could be detected and used to automate responses. There was lots of fun hardware in this environment, like “Active Badges” - that could detect your location to within 2 centimeters in 3 dimensions, “Active surfaces” – that provided a high bandwidth network to your wearable computers when in contact with the surface, and a variety of other ubiquitously deployed sensors, actuators, compute terminals and devices.


But the glue that made Sentient Computing possible was Complex Event Processing and Event-Driven Architectures – as they are called today. Sensors generated events that needed to be analyzed; actuators could be triggered but needed something to do the triggering. Event-based rules provided the glue to enable complex circumstances to be modeled. For example, “When John Trigg and Chris Martins are within 10 feet of each other and there is a video terminal in their view, then play message ‘good morning’ on the nearest video terminal”. Some people described this kind of environment as an “Exploded Robot” – because rather than a single object having various sensors and actuators attached to it, the network cloud is the medium to which they are attached – and CEP and EDA are the mechanism through which the “neuron firing” is wired together. Nowadays, we are familiar with how CEP and EDA are enabling all sorts of “exploded robot” applications – such as logistics companies that monitor the wide-area movements of their fleets of trucks, and optimize based on the perishability of their goods, weather, traffic conditions etc.
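
For flavour, here is a small sketch in plain Python (invented sensor event shapes, not the original Sentient Computing code) of that kind of rule: when two named people are within 10 feet of each other, play a message on the terminal nearest to them.

```python
import math

def proximity_rule(positions, terminals, person_a, person_b, act, person_range=10.0):
    """positions: name -> (x, y) in feet; terminals: id -> (x, y); act: actuator callback."""
    pa, pb = positions.get(person_a), positions.get(person_b)
    if pa is None or pb is None or math.dist(pa, pb) > person_range:
        return                                        # the rule's condition is not met
    midpoint = ((pa[0] + pb[0]) / 2, (pa[1] + pb[1]) / 2)
    nearest = min(terminals, key=lambda t: math.dist(terminals[t], midpoint))
    act(nearest, "good morning")                      # the actuator side of the ECA rule

positions = {"John Trigg": (0.0, 0.0), "Chris Martins": (3.0, 4.0)}
terminals = {"lobby-screen": (2.0, 2.0), "lab-screen": (40.0, 5.0)}
proximity_rule(positions, terminals, "John Trigg", "Chris Martins",
               lambda term, msg: print(f"play '{msg}' on {term}"))
```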


Although we are at an early stage with Sentient Computing, the technology approaches of CEP and EDA definitely provide a great framework. An event is a nerve impulse from the body. CEP provides the brain – which can be changed with new rules (new thinking?) at any time – rather than being hardwired. The next stage of course is rules that are written and refined by the system itself. But the exciting thing is that there doesn’t have to be one brain; the powerful thing about events is they can flow on to another brain, where they can be processed and analyzed in a different way (a different perspective on the information; a different opinion?). And then events resulting from decisions can be routed to actuators/services/humans to ultimately cause things to happen – the resulting nerve impulse. All the time, EDA is providing a framework to bring these various nerve inputs, brains and nerve outputs together. Clearly we’re not there yet with EDA – and CEP is being used mainly in individual, unconnected systems – but it is coming to offer an exciting “event cloud”, enabling various CEP systems to interoperate.


So should algorithms have rights? Well of course not – and it is April 1st as I’m writing this. But I do ask you to think about what you would have been like if you’d been alive in the 18th Century and enjoyed cheap tobacco, sugar and cotton. Would you have “thought differently”? I remember an episode of “Star Trek – The Next Generation” in which a scientist wanted to take Mr Data apart to see how he worked. Captain Picard successfully proved to Star Fleet that Mr Data was sentient and thus an individual, rather than being property. I don’t think you necessarily have to be sentient to have rights – there’s no reason to pull the wings off a butterfly just because it isn’t familiar with Descartes. I used to comment to one of my friends – who is a Maths Professor – that his computer programs were so badly written that they were like strange mutated creatures that probably suffered. Of course this is very silly!


Anyway, I leave you with the thought that perhaps Event-driven Architectures offer us the ability to have a truly global “information nervous system”. We are asynchronous; the world is asynchronous – so computing paradigms should support this. Of course realistically this is just a set of inputs, rules and outputs. However, as we add more intelligence to the “brains” then we must be careful this system doesn’t become too autonomous. After all – you know what happened with "Skynet" in Terminator!

Tuesday, March 06, 2007

Don't Shoehorn Event Processing into SOA

Posted by Progress Apama

Joe McKendrick followed up on the SOA / EDA debate. His piece really nailed it. Joe quoted the Apama blog post on "Is EDA the "New" SOA", which discussed the role of complex event processing (CEP) and event driven architecture (EDA), and addressed some recent comments from Todd Biske. Nice job Joe!