BAM

Monday, January 11, 2010

Why businesses must evolve their business processes to be highly responsive, dynamic and predictive – or they will cease to be competitive

Posted by John Bates

Today Progress Software announced the acquisition of Savvion http://web.progress.com/inthenews/progress-software-co-01112010.html. I believe this heralds the beginning of a very exciting phase for Progress Software. Progress has now become a leader in Business Process Management (BPM). But more than that: combined with our other solutions, Progress is now uniquely able to empower businesses to be operationally responsive – through responsive, dynamic and predictive business processes. And this is critical to keeping modern businesses competitive.

You might wonder about the journey Progress went through to realize what the market needed. It was all about understanding the emerging needs of our customers and where they needed their businesses to go. The part of my job I enjoy the most is spending time with customers and understanding their pain points - with the ultimate goal of working with them to address that pain and make them highly competitive.

Over the last couple of years I have been hearing more and more from customers about the need to be operationally responsive. For example, many customers have expressed their desire to proactively – and often in real-time – address the needs of their customers and respond to the behavior of their competitors. The goals are to win new business, increase customer satisfaction and triumph over the competition. These findings hold true whether the customer is in banking, insurance, communications, travel, transport, logistics, energy, gaming or many other industries. It could be British Airways ensuring their high-value customers are looked after first in the event of a flight delay, or wireless carrier 3Italia pushing real-time offers to customers based on their profile, activity and location, or maritime logistics provider Royal Dirkzwager dynamically adjusting the course and speed of a container ship to optimize fuel usage based on weather conditions and port berth availability.

Operational responsiveness is thus about being highly responsive to opportunities and threats – and even anticipating such scenarios. Market research supports what I've been hearing; a recent survey by Vanson Bourne http://web.progress.com/en/inthenews/companies-stuck-in-o-10062009.html suggests operational responsiveness has moved from a nice-to-have to a must-have.

There are a number of business-facing solutions that have shown great promise in addressing operational responsiveness. One of these is Business Transaction Assurance (BTA). BTA enables businesses to discover their business processes and gain visibility into how effectively those processes run – even when they are built on a wide variety of heterogeneous technologies and span legacy applications. BTA discovers business processes non-disruptively – without any modification to existing applications – and monitors them to ensure they run to completion. It also discovers bottlenecks and hotspots in the processes, enabling businesses to understand just how efficiently they run.
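
To make the idea concrete, here is a minimal sketch in Python of the kind of correlation a BTA tool performs, with invented event names and timings (this is not Actional's actual mechanism): group the messages observed on the wire by transaction ID, then flag process instances that never reach completion.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical observed messages: (timestamp, transaction_id, step).
# In a real BTA deployment these are observed in flight, not emitted
# by the applications themselves.
observed = [
    (datetime(2010, 1, 11, 9, 0), "tx-1", "order_received"),
    (datetime(2010, 1, 11, 9, 1), "tx-1", "credit_checked"),
    (datetime(2010, 1, 11, 9, 2), "tx-1", "order_confirmed"),
    (datetime(2010, 1, 11, 9, 0), "tx-2", "order_received"),
]

COMPLETION_STEP = "order_confirmed"   # illustrative
TIMEOUT = timedelta(minutes=30)       # illustrative

def find_stalled(events, now):
    """Group events into process instances; flag instances that have not
    completed and have been idle longer than the timeout."""
    instances = defaultdict(list)
    for ts, tx, step in events:
        instances[tx].append((ts, step))
    stalled = []
    for tx, steps in instances.items():
        seen = {s for _, s in steps}
        last_ts = max(ts for ts, _ in steps)
        if COMPLETION_STEP not in seen and now - last_ts > TIMEOUT:
            stalled.append((tx, sorted(seen)))
    return stalled

print(find_stalled(observed, datetime(2010, 1, 11, 10, 0)))
# -> [('tx-2', ['order_received'])] : a bottleneck/hotspot candidate
```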

Another important solution is Business or Complex Event Processing (BEP or CEP). This enables business users to model the detection of, and reaction to, patterns that indicate business opportunities and threats in real-time. Examples include an opportunity to up-sell to a customer who is on the website right now (opportunity), or a risk measure exceeding a key level (threat).
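
As an illustration only – Apama's own event processing language looks quite different – here is a small Python sketch of the detect-and-react idea, with both pattern definitions invented for the example:

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)  # illustrative pattern window

def detect(events, on_opportunity, on_threat, limit=1_000_000):
    """Watch an event stream; fire callbacks when a pattern of interest
    occurs. Patterns here are invented: repeated product views in a short
    window (up-sell opportunity) and exposure crossing a limit (threat)."""
    views = {}  # (customer, product) -> timestamps of recent views
    for ev in events:
        if ev["type"] == "product_view":
            key = (ev["customer"], ev["product"])
            ts = views.setdefault(key, [])
            ts.append(ev["time"])
            ts[:] = [t for t in ts if ev["time"] - t <= WINDOW]
            if len(ts) >= 3:  # third view within five minutes
                on_opportunity(*key)
        elif ev["type"] == "exposure_update" and ev["value"] > limit:
            on_threat(ev["desk"], ev["value"])

now = datetime.now()
stream = [
    {"type": "product_view", "customer": "c-42", "product": "p-7", "time": now},
    {"type": "product_view", "customer": "c-42", "product": "p-7",
     "time": now + timedelta(minutes=1)},
    {"type": "product_view", "customer": "c-42", "product": "p-7",
     "time": now + timedelta(minutes=2)},
    {"type": "exposure_update", "desk": "fx", "value": 1_250_000.0, "time": now},
]
detect(stream,
       on_opportunity=lambda c, p: print(f"up-sell {p} to {c} now"),
       on_threat=lambda d, v: print(f"ALERT: {d} exposure {v}"))
```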

And then of course there’s Business Process Management (BPM). This enables business users to model and execute a business process flow. BPM is also widely used for Business Process Improvement (BPI) – the re-engineering of (parts of) existing processes to improve their effectiveness.

The really cool thing we realized in talking with our customers is what happens when you use BTA, BEP/CEP and BPM together. Suddenly businesses are empowered to discover how effectively they run, to detect opportunities and threats dynamically and to invoke business processes in response. The business becomes dynamic and responsive. Business users can take control and model the behavior they want their business to exhibit under certain circumstances, and through dashboards they can track the effectiveness of the business. Over time, the areas of the business processes that should be improved can also be detected.
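
In outline, the wiring from detection to response might look like the following sketch; `start_process` is a hypothetical stand-in for whatever interface a BPM engine exposes (not a real Savvion API), and the pattern names are invented:

```python
def start_process(name, context):
    # Stand-in for a BPM engine invocation; not a real Savvion API call.
    print(f"starting process '{name}' with {context}")

def on_detected_pattern(pattern, context):
    """Bridge from event detection (the CEP side) to process execution
    (the BPM side). A BTA layer would meanwhile observe the resulting
    process traffic and feed completion rates and bottlenecks to a
    dashboard."""
    processes = {
        "upsell_opportunity": "MakeCustomerOffer",
        "risk_limit_breached": "ReduceExposure",
    }
    if pattern in processes:
        start_process(processes[pattern], context)

on_detected_pattern("upsell_opportunity", {"customer": "c-42", "product": "p-7"})
```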

Progress already has leading products in BTA and BEP/CEP with Actional and Apama. Progress chose Savvion to complete the story for a number of reasons. Savvion has a history of innovation and is a leading pure-play BPM provider. But Savvion also has a very rich platform, which includes not just BPM modeling and execution, but also an event engine, a business rules engine, a document management system and an analytics engine. The fact that Savvion enables business processes that respond to events means it immediately works well with Actional and Apama. And with high performance, scalability and availability, Savvion fits perfectly into Progress – where we pride ourselves that all of our products exhibit these characteristics.

In summary, Progress is now a best-of-breed BPM vendor – and not just at the departmental level – but at the enterprise level. But we’re also more than that. Our goal is to enable operational responsiveness and ensure our customers gain competitive advantage through the power of responsive, dynamic and predictive business processes.

10 Reasons Why Progress Chose Savvion

Posted by John Bates

Today Progress announced the acquisition of Savvion http://web.progress.com/inthenews/progress-software-co-01112010.html

The reason that Progress chose to enter the BPM market is clear. Businesses are increasingly turning to BPM to implement and improve their business processes. Why? Firstly, because no other solution can help enterprises achieve real-time visibility, agility, efficiency and business empowerment the way BPM does. Secondly, because BPM achieves this with a low Total Cost of Ownership (TCO) and ease of use.

But why did Progress choose Savvion? Here are 10 reasons to start off with…

  1. Savvion is a trailblazer and industry leader – Savvion is a pioneer in BPM but is also still at the cutting edge. We wanted the best BPM thinkers at Progress. 
  2. Savvion has been proven at the enterprise level – Some BPM systems only work at the departmental level; Savvion works at both the departmental and the enterprise level.
  3. Savvion offers System-centric and Human-centric BPM – Savvion can orchestrate processes but can also involve human users in workflow.
  4. Savvion is event-enabled – so business processes can respond to events. Progress has a lot of momentum behind event-driven business systems through our Actional and Apama solutions – and Savvion will work seamlessly in event-driven business solutions.
  5. Savvion offers vertical industry solutions – Analogous to Progress’ Solution Accelerators, Savvion offers out-of-the-box vertical solutions in industries including Financial Services and Telecommunications.
  6. Savvion offers an integrated Business Rules Management System – Expressing logic in terms of rules can often be very important. Savvion has developed a rules engine, integrated with its BPM system, that enables decision-oriented BPM – modifying the process flow based on rule conditions. This is a powerful capability.
  7. Savvion offers an integrated Analytics Engine – Business Intelligence has proved its worth, but it is a “rear view mirror” technology – analyzing facts that have already happened. Savvion’s analytics engine provides continuous, real-time analytics that augment business processes and human users, enabling better decision-making.
  8. Savvion offers an integrated Document Management System (DMS) – Savvion’s integrated DMS enables rich document handling and empowers document-centric BPM.
  9. The Savvion BPM suite is highly scalable, high-performance and highly available – At Progress we pride ourselves on the strength of our underlying technology. We want to offer our customers a complete solution that embodies scalability, performance and availability. Selecting a BPM vendor in keeping with this philosophy was therefore key – and Savvion is just such a vendor.
  10. Savvion is a great cultural fit with Progress – An often-overlooked point is that cultural fit is key to acquisition and integration success. The Savvion team prides itself on being innovative, customer-focused and fun - just like the Progress team. We’re looking forward to working together.

Tuesday, December 22, 2009

My Baby Has Grown Up

Posted by John Bates

I was proud to recently be appointed CTO and head of Corporate Development here at Progress Software http://web.progress.com/en/inthenews/progress-software-ap-12102009.html. But I don't want anyone to take that as an indication that I won't still be involved with event processing – au contraire. Event processing (whether you call it CEP or BEP) is now a critical part of enterprise software systems – I couldn't avoid it if I tried!

But taking a broader role does give me cause to reflect upon the last few years and look back at the growth of event processing and the Progress Apama business. Here are some observations:

  • It’s incredibly rare to have the pioneer in a space also be the leader when the space matures. I’m really proud that Progress Apama achieved that. Our former CEO Joe Alsop has a saying that “you don’t want to be a pioneer; they’re the ones with the arrows in their backs!” Usually he’s right on that one – but in the case of Progress Apama, the first is still the best! Independent analysts, including Forrester and IDC, all agree on it. Our customers agree on it too.
  • It's tough at the top! I had no idea that when you are the leader in a space, many other firms' technology and marketing strategies are based completely around you. I have met ex-employees of major software companies who told me that Apama screenshots were posted on the walls of their former firms' development centers – the goal being to replicate them or even improve on them. Other firms' marketing has often been based on criticizing Apama and saying why they are better – so their company name gets picked up by search engines when people search for Apama.
  • Event processing has matured and evolved. Yes it is certainly used to power the world’s trading systems. But it’s also used to intelligently track and respond to millions of moving objects, like trucks, ships, planes, packages and people. It’s used to detect fraud in casinos and insider trading. It’s used to detect revenue leakage in telecommunications and continually respond to opportunities and threats in supply chain, logistics, power generation and manufacturing. It enables firms to optimize their businesses for what’s happening now and is about to happen – instead of running solely in the rear view mirror.
  • Despite all the new application areas, Capital Markets remains a very important area for event processing. Critical trading operations in London, New York and around the world are architected on event processing platforms. The world's economy is continually becoming more real-time, needs to support rapid change and now needs to support real-time views of risk and compliance. We recognize the importance of Capital Markets. My congratulations to Richard Bentley, who takes on the mantle of General Manager of Capital Markets to carry on Progress Apama's industry-leading work in this space. With his deep knowledge and experience of both Apama and Capital Markets, Richard is uniquely placed to carry on the solutions-oriented focus that has been the foundation of Progress Apama's success.
  • Even in a terrible economy, the value of event processing has been proven – to manage costs, prevent revenue leakage and increase revenue. Progress announced our fourth-quarter results today http://web.progress.com/en/inthenews/progress-software-an-12222009.html which saw a double-digit increase for Apama and a triple-digit increase for Actional. Apama and Actional are used, increasingly together, to gain visibility of business processes without modifying applications, to turn business process activity into events and to respond to opportunities and threats represented by event patterns – enabling the dynamic optimization of business performance.
  • But one thing I do believe is that soon there will be no such thing as a pure-play CEP vendor. CEP is part of something bigger. We've achieved the first mission, which was to raise the profile of event processing as a new technique that can solve hitherto unsolvable problems. Now the follow-on mission is to ensure event processing finds its way into every solution and business empowerment platform. It is one of a set of key technologies that together will change the world.

I wish everyone Happy Holidays and a successful and profitable 2010!

Wednesday, September 30, 2009

EPTS, the Symposium of Trento

Posted by Louis Lovas

How many angels can dance on the head of a pin? I suppose that was a question debated at the Council of Trent, which took place in Trento, Italy back in the 16th century. However, the Event Processing Technical Society's (EPTS) annual symposium just last week took up residence in Trento to discuss and debate a host of lofty topics on event processing.

  • CEP's role and relationship to BPM (or more appropriately event-driven BPM)
  • Event Processing in IT Systems management
  • Event-based systems for Robotics
  • EPTS Working Groups ...
While the sessions and discussions on event processing did not have the global significance of angels on pinheads or the Counter-Reformation, they did provide a clear indication of just how broad and deep the reach of event-based systems can be. Whether it's a business application monitoring mortgage applications, an IT management system in a Network Operations Center, a bedside monitoring system in a hospital or a robot packing pancakes into boxes, they all have a common underpinning: consuming and correlating streaming event data.

Granted, not everyone approaches it from the same viewpoint. IT systems management people don't think about processing and correlating events; they think about device management, KPIs, alerts and the like. Someone building or managing a business process is likely concerned with managing orders – validating them, allocating stock, coordinating warehouses and shipments. Nonetheless, the common framework behind all these systems is event processing.

Two of my favorite sessions at the EPTS Symposium were a panel session on the EPTS Mission and an open forum on Grand Challenges, a brainstorming session focused on identifying barriers to the adoption of CEP.

EPTS Mission

Four panelists, myself included, presented their expectations of the EPTS and its role as an industry consortium, its goals and what improvements could be made. As a baseline, the EPTS does have an existing mission statement, defined as follows:

To promote understanding and advancement of Event Processing technologies, to assist in the development of Standards to ensure long-term growth, and to provide a cooperative and inclusive environment for communication and learning.


Given this mission statement and my own expectations, there are a number of basic things the EPTS should provide to those new to event processing:

Awareness – Provide commercial business and industry with the necessary knowledge of event processing as a technology supported by numerous vendors and backed by continuing research in academia.
Definition – Provide a concise and definitive meaning of event processing, a taxonomy of event processing so to speak. This is needed both from a horizontal technology perspective and with a vertical focus for a handful of specific industries. It's often difficult for business people to understand technology without the context of a business or application focus.
Differentiation – Provide a clear distinction that defines event processing and distinguishes it from other technologies. Event processing is available in many forms, and this symposium provided evidence of that. Much of it is available in specialized form, as in IT systems management. There are also pure-play event processing (CEP) vendors, such as Progress/Apama. But there are also rules engines, Business Intelligence platforms, analytic platforms, etc. This presents a bewildering world of choices, filled with conflicting and overlapping marketing messages. The EPTS is in the perfect position to provide clarity on what is CEP and what isn't.
Cooperation – Event processing rarely operates in a vacuum. There are many synergistic technologies that closely pair with CEP. Often this pairing has a specific vertical business flavor, but often it is with other platform technology such as BPM and temporal databases.


The EPTS has four working groups that have been active for the last year: Use Cases, Reference Architecture, Language Analysis and Glossary. To a large extent the working groups have provided, and are still working towards, a clear definition of CEP. However, there is still a need to highlight the salient value of event processing. For specific vertical domains, the value of CEP is clear-cut simply because the fit and function are tailor-made. In Capital Markets, for example, algo trading has all the hallmarks of a CEP application – high performance, low latency, temporal analytics and a streaming data paradigm that is fit for purpose. However, there are other application domains where CEP is equally viable but much more subtle. I believe the EPTS can provide a vendor-neutral taxonomy of event processing – from the basics to the advanced – explaining why it's unique and different, why language is important and how it is synergistic with a host of other technologies. To this end, the group has decided to form two new working groups to focus on many of these areas. Clearly a forward-thinking move.

The Event Processing Technical Society is an organization made up of both vendors and academics. We're held together by a common thread: a belief that the whole is greater than the sum of its parts and that our collective work will benefit all, even though many of us are undeniably competitors.

Once again, thanks for reading. You can also follow me on Twitter here.
Louie



Monday, March 23, 2009

We're going on Twitter

Posted by Giles Nelson

Louis Lovas and I, Giles Nelson, have started using Twitter to comment on and respond to exciting things happening in the world of CEP (and perhaps, occasionally, beyond!).

The intent is to complement this blog. We'll be using Twitter to report our thinking in a perhaps more impulsive way. We see Twitter as another good way to communicate thoughts and ideas.

We would be delighted if you chose to follow our "twitterings" (to use the lingo), and we'll be happy to follow you too.

Click here to follow Louis and here to follow Giles (you'll need to sign up for a Twitter account).

Monday, September 22, 2008

Reflections on the Gartner Conference and EPTS4

Posted by Louis Lovas


Like many of my colleagues in the event processing community, I thought I would share a few reflections on the happenings at the two back-to-back technology conferences of the past week. Gartner sponsored its annual vendor-fest known as the Event Processing Summit, and the EPTS held its fourth annual symposium. This being my first EPTS, I shared some initial thoughts and reactions over the weekend. Here, I'll delve more into the conferences' content.

I attended a number of the sessions at the Gartner conference. I did not have any set agenda, so I picked sessions based more on personal appeal than on some well-thought-out plan. While I work in an engineering team, I have a customer focus, so I attended all the customer sessions. I always find it valuable to understand how customers are deploying event processing technology in real-world use cases. Their efforts clearly shape vendors' product roadmaps.

  • Lou Morgan of HG Trading, a lively speaker, described his use of event processing technology in high-frequency trading. Lou has been an Apama user for quite a few years and we've invited him to speak on our behalf on a number of occasions. He's an entertaining soul with a clear understanding of the Capital Markets business. We're delighted he presented his use of Apama at this conference.
  • Albert Doolittle of George Weiss Associates Inc. gave a talk on using event processing technologies at his firm. Albert described his technique for picking a vendor for his CEP project, which, if I were to paraphrase, was a coin flip. Towards the end of his talk, he digressed from CEP technologies to present a short discourse on high-performance computing (HPC). The idea of leveraging supercomputing-like technologies and FPGAs for compute-intensive operations like Black-Scholes options pricing has certainly caught Mr. Doolittle's attention. Typically CEP and compute-intensive tasks don't mix well because of latency considerations. However, a marriage of CEP and HPC is possibly one made in heaven. I was intrigued.
  • The ebullient Marc Adler gave his brusque, no-holds-barred perspective on the CEP project he embarked on at Citi. Marc did a great job of explaining the challenges of introducing a new technology at a large corporation with a well-entrenched, bureaucratic IT organization. I think most of us have faced the bureaucratic fortress at some time or another in our careers. Knowing how to play the game is a skill only a few master well; kudos to Marc for his successful venture. As Marc unfolded his project's architecture, it was clear he had wisely chosen a course that prevents vendor lock-in.

The juxtaposition of these three use cases was most curious. Lou Morgan jumped deep into CEP technology and bet the ranch on it. Albert Doolittle took a gamble with a coin flip in choosing a vendor, and Marc Adler kept his choice of CEP product isolated and contained within his overall system architecture – a safeguard in case he felt the need to replace it. Nonetheless, all are great examples of how CEP is gaining momentum in mainstream business.

One session I thoroughly enjoyed was Don DeLoach's "Extending the Range of CEP". Don is the CEO of Aleri. I'm not sure whether I enjoyed this session more for its content or for Don's presentation skills. As is usually the case at technology conferences, it's death-by-PowerPoint: slideware jammed with an overabundance of barely readable text and dazzling graphics. Don's slides, however, had a clear minimalist slant – a plain monotone background with either a single word or a (very) short phrase, well choreographed with his oration. He spoke of CEP as an evolving technology, from the simple ability to filter streaming data to managing complex application state. He used an example that has become the pièce de résistance of Aleri: order book consolidation.

There were many sessions on SOA and Event-Driven Architectures – so many I lost count.

I attended the panel discussion on low-latency messaging protocols. This was a Q&A session moderated by Roy Schulte of Gartner. The panelists were the current crop of high-speed/low-latency messaging vendors – TIBCO-killers, as I've affectionately referred to them – such as 29West, RTI, Solace Systems, IBM and even TIBCO themselves (apologies to those vendors I've not mentioned). Each described how they have defied physics to achieve incredible speeds yet still provide reliable delivery, management tools and even application-level services (e.g. RTI's last-value cache). However, it's noteworthy to contrast these low-latency vendors, all focused on shaving microseconds off message delivery via proprietary, even hardware-based schemes, with the many standards-based messaging systems trumpeted in other sessions. Those SOA and EDA sessions paraded a whole barrage of Web Services standards (e.g. WSDL, WS-Eventing, WS-Notification, WSDM – the list goes on and on) as the right way to build applications. These certainly seem like opposing forces that will only foster confusion in the eyes of those who have a clear business need for low latency yet desire to adhere to a standards approach.

The EPTS Symposium began its first day with a keynote address from a VC who had funded Event Zero. I first met with Event Zero about a year ago; they appear to have recast themselves from an adapter/connectivity vendor to one delivering an Event Processing Network (EPN). An EPN can be defined as an infrastructure platform for event processing agents or services, where CEP agents, acting both independently and in concert with other agents (or services), operate on streaming data sources. Together, the whole becomes greater than the sum of the parts. Such is the grandiose vision of an EPN. SRI was also promoting a similar notion of event processing as a service, which I would argue is a variation on the same theme. Unfortunately, I think there is trouble ahead. The problem is simply timing, maturity and standards (or the lack thereof). I don't think customers will buy into EPNs or Event Zero's vision until there is a clear establishment of standards for CEP. For perspective, application server vendors tried this and failed (anyone remember SilverStream? Apptivity?). It was not until the J2EE specification established a uniform model that a network or service infrastructure platform for app servers became truly viable. Until we see the formation of CEP standards for interoperability and integration, CEP will remain basically a standalone application platform and vendors will continue to market a solutions approach – just look at any CEP vendor's website for proof of this. Nonetheless, Event Zero has embarked on a bold initiative and I wish them all the best.

Speaking of standards, and moving slightly up the stack, one could clearly detect the prevailing wind blowing against streaming SQL as the language of choice for CEP. Going back to the Gartner conference, there were a few noticeable comments to that effect. Marc Adler described streaming SQL as making the simple things difficult to do. Don DeLoach downplayed the SQL language in Aleri in favor of the SPLASH enhancements. And the renowned Dr. Luckham, in his closing keynote address, outlined Holistic Event Processing as the future and implied it required a language beyond streaming SQL.

At the EPTS, Alex Koslenkov from Betfair castigated the streaming SQL approach for his use case of managing complex, long-running state. Alex is an advocate of the RuleML approach to CEP languages, so it stands to reason that he doesn't have a high regard for streaming SQL – and it showed.

Susan Urban from Texas Tech University presented a research project on a language they've dubbed StreamCEDL. Susan denounced streaming SQL as lacking the algebraic expressiveness necessary to move beyond simple stream processing to true complex event processing. One example she mentioned in the description of StreamCEDL is its support for an APERIODIC operator, intended to process irregular or out-of-order data streams.
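
StreamCEDL's actual semantics weren't spelled out, but the underlying problem is easy to illustrate. One common approach – a Python sketch of the general technique, not StreamCEDL's operator – is to buffer arrivals and release them in timestamp order only once a watermark guarantees nothing earlier can still arrive:

```python
import heapq

class ReorderBuffer:
    """Buffer out-of-order events; release them in timestamp order once a
    watermark guarantees nothing earlier can still arrive. Illustrative
    only; the max_delay bound is an assumption about the stream."""
    def __init__(self, max_delay):
        self.max_delay = max_delay
        self.heap = []
        self.latest = None

    def push(self, timestamp, event):
        heapq.heappush(self.heap, (timestamp, event))
        self.latest = timestamp if self.latest is None else max(self.latest, timestamp)
        watermark = self.latest - self.max_delay
        released = []
        while self.heap and self.heap[0][0] <= watermark:
            released.append(heapq.heappop(self.heap))
        return released  # in-order events, now safe to process

buf = ReorderBuffer(max_delay=5)
for ts, ev in [(1, "a"), (4, "b"), (2, "c"), (10, "d")]:
    for t, e in buf.push(ts, ev):
        print(t, e)  # prints 1 a, 2 c, 4 b once the watermark passes 5
```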

Lastly, Chris Ferris from IBM presented on industry software standards. This was a great session that portrayed the far-reaching impact of adopting standards across our industry. He stressed the importance of making every attempt to get broad vendor agreement and customer validation, and of being sure the adopted technology serves the needs of the community – because you'll have to live with it for years to come. This is such an important message in the quest for the standardization of CEP. Open, widely accepted standards are exactly what the CEP community needs; the sooner we embark on this journey, the better.

Friday, September 19, 2008

Sibos 2008 - the event processing angle

Posted by Giles Nelson

I am writing this at the end of the Sibos financial services show in Vienna. Sibos is the biggest global banking event of the calendar, with pretty much everyone involved in core banking present, including commercial banks, central banks, regulators, vendors and consultancies. It couldn't, of course, have taken place at a more interesting time. The extraordinary events we have witnessed this week in financial markets permeated every panel session, presentation and informal discussion held.

Event processing is big in financial services but, so far, it has generally only penetrated the front office and the middle-office risk management functions, for use cases related to trading. There are good reasons for this: in general terms, the front office uses technology to directly enable more money to be made. Core banking, on the other hand, is about doing what banks were set up to do – to take deposits, to give credit and to be an arbiter of financial transactions. The reasons for technology investment are quite different, driven by increasing operational efficiency, lowering costs and improving customer service. It's a more conservative area, and as yet event processing has not penetrated it to any significant extent.

There's no lack of use cases, I believe. Here are a couple of examples around the processing of payments. Earlier this year the UK launched its Faster Payments initiative. Finally in the UK (the Netherlands, for example, has had this for 10 years) you can now pay a counterparty who banks with another UK bank in real-time, rather than waiting three days for the payment to clear (it's remarkable it's taken so long to fix such a, frankly, rubbish state of affairs – and indeed it took the regulator itself to force change). As an end user I am delighted with the results. I can now make an electronic payment using Web-based banking and it all happens immediately – the transaction occurs in the way one feels, in the modern Internet era, that it should. However, this does raise a problem: how does a bank do its anti-money-laundering checks, its comparison with counterparty blacklists and all the other fraud checks in the 12 seconds it has for the payment to go through? The answer is – currently, with enormous difficulty. Event processing is surely part of the answer.
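
As a sketch of the shape of a solution – names, rules and thresholds all invented for illustration – an event-driven screen evaluates each payment against blacklists and simple velocity rules as it flows past, rather than in an overnight batch:

```python
from collections import defaultdict, deque
from datetime import datetime, timedelta

BLACKLIST = {"ACME-SHELL-CO"}          # hypothetical counterparty blacklist
VELOCITY_WINDOW = timedelta(hours=1)   # illustrative thresholds only
VELOCITY_LIMIT = 10_000.0

recent = defaultdict(deque)  # payer account -> deque of (time, amount)

def screen_payment(payment):
    """Return a list of alerts; an empty list means the payment may proceed.
    Runs per event, so it fits inside a faster-payments clearing window."""
    alerts = []
    if payment["payee"] in BLACKLIST:
        alerts.append("counterparty on blacklist")
    window = recent[payment["payer"]]
    window.append((payment["time"], payment["amount"]))
    while window and payment["time"] - window[0][0] > VELOCITY_WINDOW:
        window.popleft()  # drop payments older than the window
    if sum(amount for _, amount in window) > VELOCITY_LIMIT:
        alerts.append("velocity limit exceeded in the last hour")
    return alerts

print(screen_payment({"payer": "acct-1", "payee": "ACME-SHELL-CO",
                      "amount": 9_500.0, "time": datetime.now()}))
```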

Here's another example. Currently the European payments industry is going through a lot of regulatory change to bring about lower prices and more competition for cross-border Euro payments (PSD and SEPA are the relevant acronyms if you're interested). This will force technology investment, because consolidation will mean a smaller number of banks will have to process more payments at lower cost. Furthermore, competition will increase; for example, a business in France will be able to use a bank in Germany to deal with its payments. Now, I reckon that having real-time insight into what is going on in my payment systems – being able to identify processing exceptions, to spot when my customer SLAs are being breached, and so on – will be a crucial part of ensuring a world-class operation. Payment systems will continue to use many different types of technology, from mainframes to modern SOA environments, so you need something to sit logically above all of this, extracting relevant real-time information and analysing and correlating it appropriately. There are offerings from conventional BAM vendors that address some of this now, but I don't think they will be performant or flexible enough to deal with future needs. Some of our customer engagements support this view.

All of this is really about risk management, and it seems inevitable that this is going to be a booming area of investment in the next few years. Knowing what is going on in your business transactions, as they occur, will become more and more important. In electronic trading, for example, it is becoming vital not only to regulators and trading venues (such as our customer Turquoise, who we announced went live with Apama this week) but also to brokers. They want to know what their customers – and they themselves – are up to.

I think Richard Oliver, Executive Vice President at the Federal Reserve, summed it up well. When asked about the future role of technology in core banking and payment systems, he responded that the "immediacy of information is going to be vital" and that it is going to be all about "getting value from the information flows". I think that provides a pretty good fit for event processing.

Thursday, July 17, 2008

Rendering Unto Caesar - The Role of the Business User in CEP

Posted by Chris Martins

"Render unto Caesar the things which are Caesar's
and unto God the things that are God's"

A recent posting on the Enterprise Decision Management blog, entitled "Can we trust business users", addresses a topic that seems equally pertinent to the CEP market. I think there is a tendency to become so enamored with the technical virtuosity of new technology that we may lose sight of who the real users are. In terms of the development of CEP applications, the understanding of the prospective roles of business users vs. IT developers is still evolving. Apama has long been a proponent of business users participating in the process of building CEP-driven applications. In Capital Markets, the notion of "empowering the trader" has been a key element of our strategy, and the Apama product offers a graphical development tool, Event Modeler, that focuses on that constituency. We also have a RAD tool intended for developers, who can create applications in our event processing language.

We are beginning to see third-party validation of this approach from outside Capital Markets as well. For example, a report from Forrester Research published earlier this year indicated that early adopters of CEP have tended to come from the line of business rather than IT, because "developers and architects often know painfully little about these [non-traditional, non-IT] events and how they are used to run a business." That Forrester quote is certainly not intended to diss IT. It just recognizes that there are lots of different kinds of events that are important to a business, and not all of them are traditional IT-aware or IT-generated events. To make sense of and respond to such events, it seems quite logical that providing tools amenable to a more "business"-oriented audience is important.

But I would argue that it is not just the nature of events – and their varied sources – that suggests a strong correlation between CEP and business users. It is also the nature of the CEP-driven applications themselves. CEP applications are not "one and done"; they tend to be iterative and evolving, because they are crafted to respond to what is happening – and what is happening is often a frequently changing set of conditions. Or, if the conditions are not changing, how you choose to respond to them may be. So you need to continually calibrate and revise.

In another Forrester report published earlier this year, this characteristic was noted within the context of a review of Business Activity Monitoring best practices. Event-driven BAM is a particularly strong use case for CEP, and the report stated that BAM efforts "are typically never-ending projects" with a "high degree of change." That makes sense, since BAM monitors 'business activities' and the nature of most businesses will change over time. So to support BAM applications, it seems perfectly logical to provide tools for business users, who can take on some role in the initial development and the ongoing operational calibration of these applications. There is clearly an important role for developers in building these applications – no one would suggest otherwise – but we'd best not forget what the "B" in BAM refers to.

What seems to be emerging is the notion that we should not look at CEP and/or BAM deployments as discrete, finite projects with clearly prescribed end dates. They are continuously iterative projects that must evolve to remain effective. That's the environment in which they operate. Given that, perhaps we should not see the roles of business users and IT as fitting within well-prescribed boundaries. The development and ongoing management of these applications will involve evolving roles for the line-of-business user and for IT over time. We might expect IT-centric development to have a more dominant role in the initial deployment, but over time the goal might be for the line of business to assume a greater and greater role – because the business will be the dominant user and is best positioned to react to changing circumstances.

Perhaps the EDM blog posting says it best, though it expresses it within a "business rules" context. “Too many rules specialists focus on rules that will get executed and not enough on the people that will maintain those rules, although this is where the success of the project resides.” That is quite the same for CEP and BAM. There is a role for business users, driven by the nature of events and the continuously evolving nature of the applications that are “event-driven.” And it is incumbent on the technology provider to offer tools that will facilitate that evolution. All the CEP performance in the world will be of little use, unless that performance is well-aligned with the needs of the business.

So we might debate who is the metaphorical Caesar and who is God in CEP development, but the success may well rest on giving each their due.

Tuesday, April 22, 2008

Asia Report: Fighting White Collar Crime

Posted by John Bates

Hello from Hong Kong. As always it is fascinating to see how CEP is evolving in Asia. One trend I am observing is the huge interest in Hong Kong in rogue traders and white collar crime – and how CEP can be used to detect and prevent this – before it moves the market. Obviously the original rogue trader, Nick Leeson, is well known here. But there has been a great deal of interest in more recent goings-on, at firms such as SocGen. Amazingly, until a couple of years ago, insider trading was not illegal in Hong Kong! Now we have a highly volatile market, with a lot of uncertainty, huge event volumes and a real problem of seeking out and preventing rogue trading activities, as well as managing risk exposure proactively.

Of course CEP provides a compelling approach. In market surveillance, the ability to monitor, analyze and act on complex patterns that indicate potential market abuse or dangerous risk exposure can allow a regulator, trading venue or bank to act instantly. Banks want the reassurance that they are policing their own systems. Regulators need to protect the public. The media and public here find this fascinating.

On the topic of a different kind of white collar crime – consider using CEP to detect abuse in the gaming industry. The gambling phenomenon that has propelled Macau past Las Vegas as the world's biggest gambling hub is also an exciting opportunity for CEP. We have customers using CEP to monitor and detect various forms of potential abuse in casinos. The events analyzed to find these patterns include gamblers and dealers signing on at tables, wins and losses, cards being dealt, and so on. It is possible to detect a range of potentially illegal activities, ranging from dealer-gambler collusion to card counting.
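
As a hedged sketch of how one such pattern might be expressed (all names and thresholds invented, not our customers' actual rules): flag a dealer-player pair whose win rate, over a sliding window of hands they share, is improbably high.

```python
from collections import defaultdict, deque

WINDOW = 50        # hands considered per dealer-player pair (illustrative)
SUSPICIOUS = 0.75  # win rate that warrants review (illustrative)

# (dealer, player) -> most recent hand results, 1 = player won
hands = defaultdict(lambda: deque(maxlen=WINDOW))

def on_hand(dealer, player, player_won):
    """Consume one hand event; return True if the pair deserves review."""
    results = hands[(dealer, player)]
    results.append(1 if player_won else 0)
    # Only judge pairs with enough shared history to be meaningful.
    return len(results) == WINDOW and sum(results) / WINDOW >= SUSPICIOUS
```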

As a final thought - having met with some of our customers that operate both in Hong Kong and mainland China, it is clear that China is a massive market opportunity for CEP. Exciting times ahead for CEP in Asia.

Wednesday, January 02, 2008

Dr. John Bates on Fox Business News

Posted by Chris Martins

As readers of this blog know, Apama was chosen by the UK regulatory agency, the Financial Services Authority, to provide the CEP technology that delivers real-time market surveillance as part of the FSA's SABRE II project (FSA/Apama Announcement). In a recent interview with the Fox Business Network, John Bates discussed how Apama can be used to detect patterns of suspicious trading behavior – much as Apama is so often used to identify and respond to market patterns in support of algorithmic trading applications. The link below will take you to the Fox URL that plays the video.

Fox Business Interview with Dr. John Bates