Some recent news
on improvements and changes in approaches to BI architectures caught my eye.
New technologies suggest that there may be alternatives to traditional BI
architectures (see the recent posting by Curt Monash on in-memory BI and Philip Howard of the Bloor Group on data
warehouse appliances). Though I am
not intimately familiar with these new approaches, they seem to offer the
kind of blazing speed, and applicability to areas such as in-memory
analytics and activity monitoring, that overlaps with the capabilities of CEP applications.
Maybe a new turf
war is on the horizon.
In an article in
DM Review earlier this year, Larry Goldman
of AmberLeaf took on the daunting task of determining whether a new event processing
technology is required to support a more responsive BI architecture. Larry posed
a series of questions for determining whether you should go the CEP route or
can make do with existing technology. In
light of the new commentary referenced above, I’d like to augment/question some
of the thoughts in the Goldman article to show that there are other criteria
that argue for going the CEP platform route and that, as we are fond of saying,
it’s not just about ‘feeds and speeds.’
Here are Larry's questions (from DM Review, January 2007, "Customer Intelligence: Event Processing Alphabet
Soup"), with my comments interspersed:
1. Do I already have competencies in
real-time messaging and streaming? If
you do, you may not need an application [specifically designed for CEP]. If you
don't, these products may decrease the learning curve.
Agreed that one may have
competencies in real-time messaging and streaming in terms of accepting the
data and storing it, but are you processing it as it arrives? You must
also consider what benefit you can draw from handling this data 'in flight'
versus persisting, querying and analyzing it later.
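To make the 'in flight' distinction concrete, here is a minimal Python sketch. All of the names in it (`is_suspicious`, `process_in_flight`, `process_batch`, the alert callback) are invented for illustration; the point is only that the same check can fire as each event arrives, rather than waiting for a later pass over the persisted store.

```python
def is_suspicious(event):
    """Stand-in for real detection logic: flag large transfers."""
    return event["type"] == "transfer" and event["amount"] > 10_000

def process_in_flight(stream, alert):
    """In-flight: act on each event as it arrives, before storage."""
    for event in stream:
        if is_suspicious(event):
            alert(event)   # react immediately, latency is per-event
        # the event can still be persisted afterwards for history

def process_batch(store, alert):
    """Persist-then-query: the same check, run minutes or hours later."""
    for event in store:
        if is_suspicious(event):
            alert(event)

events = [{"type": "transfer", "amount": 25_000},
          {"type": "deposit", "amount": 500}]
alerts = []
process_in_flight(iter(events), alerts.append)
print(len(alerts))  # 1 suspicious event caught as it arrived
```

Both paths find the same event; the difference is when you find it, and whether you can still act on it.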
2. Can my reporting infrastructure handle
operational BI, scaling
to hundreds or thousands of users? If it cannot, these tools may be able to
scale without forcing you to be a performance guru.
Can my infrastructure
handle operational BI? What is
operational BI? I believe it's the notion that traditional BI tools excel at
mining vast quantities of captured, processed and transformed data to produce
graphs, charts and metrics. But how do
you transform those graphs, charts and metrics into actions? That is the question
operational BI addresses, and this is where the intersection with
BAM, CEP, and EDA comes into play.
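The metrics-to-actions point above can be sketched in a few lines of Python. This is a hypothetical illustration, not any vendor's API: the KPI, the threshold, and the `open_ticket` callback are all invented names. A traditional BI tool would chart the rate; the operational-BI step is wiring a response to it.

```python
THRESHOLD = 0.95  # e.g. an order-fulfillment SLA target (invented figure)

def check_kpi(fulfillment_rate, open_ticket):
    """Turn a BI metric into an action instead of just a chart."""
    if fulfillment_rate < THRESHOLD:
        open_ticket(f"SLA breach: fulfillment at {fulfillment_rate:.0%}")
        return True
    return False

tickets = []
check_kpi(0.91, tickets.append)
print(tickets)  # ['SLA breach: fulfillment at 91%']
```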
3. Can users easily identify or specify events
to track? If they can't,
these tools may help you identify and monitor events without IT involvement.
Can users easily identify
or specify events to track? One of the things I think is at the
forefront of CEP is technology that can discover or detect meaningful
patterns on its own, rather than being programmed or set up to react to known,
predefined patterns. We see this as a major wave in the evolution of CEP.
4. What does real time mean to me? How fast
do I need to make decisions? Do I have the people or the processes to react in real time?
I don't disagree with
that. This was central to Roy Schulte's recent presentation on BAM at
the Gartner CEP conference in Orlando (September 2007). Roy has created strata showing that different applications and verticals have different perceptions of real time,
ranging from those measured in milliseconds (e.g. trading) to those measured in
minutes and hours (e.g. supply chain management).
5. Perhaps there is a fifth
question here, one that presents the unique capabilities of CEP to
the audience. Do I need to monitor event data across time windows (A and
B happen within X of one another [or not])? Do I need to monitor large
numbers of permutations of each rule simultaneously? Do I need to derive
or infer activity from my event flows? Traditional query-based approaches
struggle with these issues, especially as the demand or query refresh rate grows.
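The time-window question above ("A and B happen within X of one another") is the kind of pattern CEP engines express natively. Here is a minimal Python sketch of the idea, assuming a simple stream of (timestamp, kind) tuples; the event kinds and the `detect_pairs` helper are invented for illustration.

```python
from collections import deque

def detect_pairs(events, window=5.0):
    """Yield (a_time, b_time) whenever a B arrives within `window`
    seconds of an earlier A; events is an iterable of (timestamp, kind)."""
    recent_a = deque()
    for ts, kind in events:
        # drop A events that have aged out of the window
        while recent_a and ts - recent_a[0] > window:
            recent_a.popleft()
        if kind == "A":
            recent_a.append(ts)
        elif kind == "B" and recent_a:
            yield recent_a.popleft(), ts  # A followed by B inside the window

stream = [(0.0, "A"), (2.0, "B"), (10.0, "A"), (20.0, "B")]
print(list(detect_pairs(stream)))  # [(0.0, 2.0)] -- the second pair is 10s apart
```

Doing the same thing by repeatedly re-running a query against a store means re-scanning history on every refresh; the streaming form keeps only the live window in memory.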
As the world of
traditional BI architecture evolves and users look to determine whether CEP
based architectures are appropriate, it is important to note that there may be
additional benefits to the use of CEP rather than just ‘trading up’. Why not
look at the two technologies as two parts of a greater solution? Augmenting
an existing BI infrastructure with CEP is one approach (in which one applies
event processing logic to the streams before they are passed into the data
warehouse/analysis layer), as is augmenting a CEP solution with analytics/KPIs
from an existing BI infrastructure. There are opportunities for both sets of technology, and collaboration in
this instance may help to clarify, rather than obfuscate, the picture for the target user.
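To close, a minimal sketch of the first augmentation described above: a CEP layer sits in front of the warehouse load, acting on critical events immediately and filtering noise before anything is persisted. All names here (`cep_layer`, the severity levels, the alert callback) are hypothetical.

```python
def cep_layer(stream, emit_alert):
    """Apply event processing in flight, then pass events downstream
    to the BI/warehouse layer."""
    for event in stream:
        if event.get("severity") == "critical":
            emit_alert(event)                 # immediate action path
        if event.get("severity") != "debug":  # drop noise pre-warehouse
            yield event

warehouse, alerts = [], []
incoming = [{"id": 1, "severity": "debug"},
            {"id": 2, "severity": "critical"},
            {"id": 3, "severity": "info"}]
for event in cep_layer(incoming, alerts.append):
    warehouse.append(event)  # the existing BI stack loads from here

print([e["id"] for e in warehouse], [e["id"] for e in alerts])  # [2, 3] [2]
```

The BI layer keeps doing what it does well, while the stream is both acted upon and cleaned up before it ever reaches the warehouse.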