Louis Lovas

Monday, March 08, 2010

Rumblings in the Cloud

Posted by Louis Lovas

Rumblings in the Cloud
Cloud computing... it's on everyone's mind these days. Personally, I think it's a term that has attained such aggrandized acclaim that vendors, analysts, bloggers and anyone with marketing muscle have pulled and stretched its definition to such an extent that it could mean just about anything hosted. Cloud Computing Journal polled twenty-one experts to define Cloud Computing. The fact that they had to ask twenty-one experts is rather telling in itself. Well, I read what the experts had to say.

So armed with my newly minted (yet fully stretched, though not of my own making) Cloud definition, I happened upon this commentary about CEP in the Cloud, or the lack thereof. There's a great quote in the article: "I don’t care where a message is coming from and I don’t care where it’s going”. Correctly indicated, this in a sense defines a key aspect of CEP. Event-based applications should be transparent to the origin and destination of messages (or the events into which messages are transformed), save for a logical or virtual name. However, unlike the author Colin Clark, I do believe the current crop of vendor products, most notably Progress Apama, maintain this separation of the physical from the virtual.

The rationale behind the lack of CEP-based applications in the Cloud (ok, there's that word again) is found in other factors. To explain my reasoning, I'll start by dividing CEP-based applications into two categories. Of course there are many ways to categorize CEP-based applications, but for the sake of this discussion, I'll use these two:

CEP-based Application Categories
  1. Those that do things
  2. Those that observe other applications doing things
Not sure I could make a simpler, more layman-like description, but needless to say it warrants further explanation (or definition, in keeping with our theme).

CEP-based applications that do things
This category is best explained by example. Typical of event processing applications that do things are those in Capital Markets like algorithmic trading, pricing and market making. These applications perform some business function, often critical in nature, in their own right. Save connectivity to data sources and destinations, they are the key ingredient, or the only ingredient, of a business process. In the algo world, CEP systems tap into the firehose of data, and the data rates in these markets (Equities, Futures & Options, etc.) are increasing at a dizzying pace. CEP-based trading systems are focused on achieving the lowest latency possible. Investment banks, hedge funds, and others in the arms race demand the very best in hardware and software platforms to shave microseconds off each trade. Anything that gets in the (latency) way is quickly shed.

In other verticals, an up-and-coming usage of CEP is location-based services. This is one that leverages smart mobile devices (i.e. "don't care where the message is going") to provide promotions and offers.
    • Algo Trading, Pricing, Market Aggregation
    • Location Based Services (providing promotional offers and alerts)
CEP-based applications that observe other applications doing things
Conversely, event-based applications that observe other applications doing things are classified as providing visibility or greater insight into some existing business function. These event-based applications overlay business processes and take measures to improve their effectiveness. As is often the case, critical business applications provide little visibility, or the information is siloed. There is a need to provide a broader operational semantic across a heterogeneous mix of business applications and processes. Here are a few typical examples of event-based visibility applications observing other business systems.
    • Telco Revenue Assurance
    • Click Stream Analysis
    • Fraud Detection
    • Surveillance
Of course the demarcation line between these two classifications is not clear cut. Providing greater visibility is just a starting point; monitoring for opportunities to take action is just as important, such as kicking off a fraud watch when a suspected wash trade occurs (so in a sense they too are doing things).
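For illustration only, here is a minimal Python sketch (with assumed event fields and an arbitrary 60-second window, not any vendor's EPL) of how an observing application might raise a fraud watch on a suspected wash trade, i.e. the same account buying and selling the same instrument in similar size within a short window:

    from collections import defaultdict

    WINDOW_SECS = 60  # assumed look-back window for matching buy/sell pairs

    # recent trades per (account, symbol): list of (timestamp, side, qty)
    recent = defaultdict(list)

    def on_trade(ts, account, symbol, side, qty):
        """Raise a fraud watch if the same account traded both sides of the
        same instrument, in similar size, within the look-back window."""
        key = (account, symbol)
        recent[key] = [t for t in recent[key] if ts - t[0] <= WINDOW_SECS]
        opposite = "SELL" if side == "BUY" else "BUY"
        for _, prev_side, prev_qty in recent[key]:
            if prev_side == opposite and abs(prev_qty - qty) <= 0.1 * qty:
                print(f"FRAUD WATCH: possible wash trade by {account} in {symbol}")
                break
        recent[key].append((ts, side, qty))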

Wherefore art thou, oh CEP
When considering the Cloud, an important point to consider is dependency. Specifically, for (observing) CEP to overlay the underlying applications and business processes, those applications and processes must already exist in the Cloud. I would offer that Enterprise business has not yet migrated its key business processes to the Cloud on a widespread scale. Why not? What are the barriers? Security, regulatory compliance, DR, investment costs and limited skill sets are just a few of the challenges mentioned in this ITProPortal article. I suspect these barriers are far-reaching, keeping the pace of Cloud deployment in check to the point where it's not yet strategic to many.
 
One of the key things that makes the Cloud a reality is virtualization; it has clearly revolutionized PaaS as the Cloud. Virtualization does come at a cost, though: there is a latency penalty for the convenience, and no matter how small, for some use cases that cost is too great.

Make no mistake, I am certain the Cloud, with all its twenty-one definitions, is the future of computing. It's an imperative that will knock down the barriers and change the face of the Enterprise, and when it reaches critical mass CEP will be there.

Once again thanks for reading, you can follow me at twitter, here.
Louie




Monday, February 22, 2010

Peas and Carrots

Posted by Louis Lovas

In the words of the auspicious Forrest Gump, some things go together like peas and carrots. Truer words were never spoken. Some things just do go together well, sometimes by design, often by accident. I don't think anyone actually planned milk and cookies or popcorn at the movies, but nonetheless these things are made for each other. When it comes to technology, the same harmonious relationships exist.

In the recent Aite report on High Performance Databases (HPDBs), the market for specialized databases is surveyed along with a handful of vendors in this space. This is a cottage industry where the big database vendors don't play. It's hard to imagine, in this day and age when database technology is so standardized and mature and a multitude of choices abounds from commercial products to open source, that any other database technology and a band of vendors would have a chance. Yet it is happening, and it's thriving.

I believe it has to do with a synergistic relationship to event processing. If CEP is the "peas" then HPDBs are the "carrots". These two technologies share two fundamental precepts:

  •  A focus on Extreme Performance
  •  Temporal Awareness

I. Extreme Performance, Speeds and Feeds
These HPDBs, often referred to as Tick databases, are found in the same playground as event processing technologies. In the Capital Markets industry they connect to the same market sources and consume the same data feeds. Both technologies are designed to leverage modern multi-core hardware to consume the ever-increasing firehose of data. By the same token, once that data is stored on disk, database query performance is equally important. The massive amount of data collected is only as good as the database's ability to query it efficiently, thus creating another (historical) firehose of data for which an event processing engine is the consummate consumer.

II. Temporal Awareness, when is the data
Time is a basic principle in event processing technology; applications typically analyze data-in-motion within a window of time. An HPDB's design center is to store and query time series data. Some of the database vendors even bring time up to a higher-level business function. They understand the notion of a business calendar, knowing business hours, the business week, holidays, trading hours, etc. Imagine the simplicity of a query where you ask for 'business hours, Mon-Fri, for the month of February' and the database itself knows the third Monday was Presidents Day and skips over it, preventing analytic calculations from skewing erroneously.
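As a rough Python sketch of that query (the holiday set and trading hours here are assumptions for illustration; a calendar-aware HPDB would know them itself):

    from datetime import datetime, time

    HOLIDAYS = {datetime(2010, 2, 15).date()}   # Presidents Day (assumed calendar)
    OPEN, CLOSE = time(9, 30), time(16, 0)      # assumed trading hours

    def business_hours_february(bars):
        """Yield (timestamp, price) bars falling within business hours,
        Mon-Fri, during February, skipping holidays."""
        for ts, price in bars:
            if (ts.month == 2 and ts.weekday() < 5
                    and ts.date() not in HOLIDAYS
                    and OPEN <= ts.time() <= CLOSE):
                yield ts, price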

Leveraging the Synergy
These two fundamental shared principles provide the basis for a unique set of business cases that are only realized by leveraging Event Processing platforms and High Performance Databases together:

  • Back testing algorithms across massive volumes of historical data, compressing time
What if you could test new trading algorithms against the last 6 months or 1-2 years of historical market data but run that test in a matter of minutes? What if you could be assured that the temporal conditions of the strategies (e.g. timed limit orders) behaved correctly and deterministically, matching the movement of time in complete synchronicity with the historical data? These are just a few of the characteristics that define the harmony between event processing and high performance (Tick) databases; a minimal sketch of such time-compressed replay follows this list.
  • Blending live and historical data in real-time
Querying historical data in-flight to obtain volume curves, moving averages, the latest VWAP and other analytic calculations is possible with these high performance databases. Leading-edge trading algorithms are blending a historical context with the live market and even News. The winners will be those that can build these complex algos and maintain ultra-low latency.
  • Pre-Trade Risk Management
Managing positions, order limits and exposure is necessary; doing it in real-time to manage market risk is a mandate. In addition to market data, these high performance databases can store pre- and post-trade activity to complement event-based trading systems and become the basis for trade reporting systems.
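To make the back-testing case concrete, here is a minimal Python sketch of time-compressed replay, with an assumed tick shape and strategy interface; simulated time is driven by the historical data itself, so temporal logic such as a timed limit order expiring fires deterministically no matter how fast the test runs:

    import heapq
    import itertools

    class ReplayClock:
        """Simulated clock: timers fire against data time, not wall-clock time."""
        def __init__(self):
            self._timers = []            # heap of (fire_at, seq, callback)
            self._seq = itertools.count()
            self.now = None

        def schedule(self, fire_at, callback):
            heapq.heappush(self._timers, (fire_at, next(self._seq), callback))

        def advance_to(self, ts):
            # fire every timer due at or before the next data timestamp
            while self._timers and self._timers[0][0] <= ts:
                fire_at, _, callback = heapq.heappop(self._timers)
                self.now = fire_at
                callback(fire_at)        # e.g. expire a timed limit order
            self.now = ts

    def replay(ticks, strategy, clock):
        """ticks: time-ordered (timestamp, symbol, price) rows from the tick database."""
        for ts, symbol, price in ticks:
            clock.advance_to(ts)
            strategy.on_tick(ts, symbol, price)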

In the Trading LifeCycle, Event Processing and High Performance databases are partner technologies harmoniously bound together to form a union where the whole is greater than the sum of the parts. They are the peas and carrots that together create a host of real-world use-cases that would not be possible as individual technologies.

My colleague Dan Hubscher and I are doing a three-part webinar series entitled "Concept to Profit". The focus is on event processing in the trade lifecycle, but we include cases that touch upon high performance databases. You can still register for part 2: Building Trading Strategies in the Apama WorkBench, where I will focus on the tools for strategy development aimed at the IT developer.

Once again thanks for reading, you can follow me at twitter, here.
Louie

Monday, February 01, 2010

From Concept to Profit in No Time Flat – High Frequency Trading

Posted by Chris Martins

Colleagues Dan Hubscher and Louie Lovas have begun a great webinar series that outlines the “lifecycle” of an algorithmic trading strategy. Using Apama’s Event Modeler, they work through a Commodity Futures trading strategy to illustrate how trading firms can accelerate the delivery of trading strategies with a development tool that is accessible to the trading desk. This can help make significant cuts in development times, allowing firms to capitalize more quickly on opportunities. Future sessions will explore some of the other aspects of the platform that target developers, recognizing that firms have different business models and development styles.

[Image: trading strategy lifecycle]

We’ll be posting the recorded version shortly for on-demand viewing, but if you have not registered, I’d encourage you to click here and get on board for parts 2 and 3.

Friday, November 20, 2009

Exploration of Apama 4.2 Feature Set Podcast

Posted by Apama Audio

Louis Lovas, Chief Architect of Progress Apama, discusses aspects of the Apama 4.2 release that focus on application developer productivity and how Apama enhances an organization’s ability to build event-driven applications.

Wednesday, November 04, 2009

Apama 4.2 Deeper Exploration: Enhanced Support for Parallelism

Posted by Apama Audio

In this podcast Louis Lovas, Apama Architect, discusses some of the details of the enhanced parallelism that is now available in Progress Apama 4.2.



Monday, October 19, 2009

Progress Apama Announcing Latest Release 4.2

Posted by Apama Audio

As a follow-up to the Louie Lovas blog posting on October 16th, this podcast captures a discussion between David Olson and Giles Nelson on Apama 4.2 features.


Friday, October 16, 2009

Apama 4.2 release - Cruising in the fast lane

Posted by Louis Lovas

Apama 4.2 release - Cruising in the fast lane
The Apama engineering team has done it once again. True to our record of releasing significant new features in the Apama product every six months, the v4.2 release is hot off the presses with major new functionality. The Apama roadmap is driven by a keen sense of our customer requirements, the competitive landscape and an opportunistic zeal. The engineering team is a dedicated R&D group, driven to excellence and quality and committed to delivering value to our customers. A consistent comment we've heard from analysts and customers alike concerns the maturity of the Apama product.

The current v4.2 release, the third in the v4.x family, adds significant enhancements along three concurrent themes - Performance, Productivity and Integration. This consistent thematic model is one we've held for a number of years. Below I've touched upon the highlights of the current release along these themes:


  • Performance
High Performance Parallelism for Developers.  The Apama Event Processing Language (EPL) provides a set of features uniquely suited to building scalable event-driven applications. The language natively offers capabilities for event handling, correlating event streams, pattern matching, defining temporal logic, etc. Equally important, the language provides a flexible means to process events in parallel. For this we provide a context model and a new high performance scheduler. Contexts can be thought of as silos of execution, in which CEP applications run in parallel. The scheduler's role is to manage the runtime execution in an intelligent, high-performance way and to leverage the underlying operating system threading model. It’s via the context architecture that the Apama Correlator squeezes the most out of operating system threads to achieve maximum use of multi-core processors for massive vertical scalability. For IT developers, this is an effective and efficient means to build high performance, low latency CEP applications without the pitfalls of thread-based programming, such as deadlocks and race conditions.
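This is not Apama EPL, but as a loose Python analogy of the context idea, imagine each instrument's events routed to its own single-threaded silo that owns all of its state, so application code never juggles locks:

    import queue
    import threading

    class Context:
        """A silo of execution: owns its state and processes its own event queue."""
        def __init__(self, name, handler):
            self.name = name
            self.handler = handler                  # per-context state lives here
            self.events = queue.Queue()
            threading.Thread(target=self._run, daemon=True).start()

        def send(self, event):
            self.events.put(event)

        def _run(self):
            while True:
                self.handler(self.events.get())     # one event at a time, no locks

    def make_router(handler_factory):
        """Route each symbol's events to its own context so they run in parallel."""
        contexts = {}
        def route(event):
            symbol = event["symbol"]
            if symbol not in contexts:
                contexts[symbol] = Context(symbol, handler_factory(symbol))
            contexts[symbol].send(event)
        return route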

High Performance Parallelism for Business Analysts.  Not to be left out of the race, we've also ensured the scalable parallelism provided in the Apama CEP engine is available through our graphical modeling tool, the Event Modeler. We've had this graphical modeling capability since the very first release of Apama. This tool, designed for analysts, quantitative researchers and of course developers, allows you to design and build complete CEP applications in a graphical model. Parallelism is as easy as an automatic transmission: simply select P for parallel.

  • Productivity

Real men do use Debuggers (and Profilers too). The Apama Studio now sports major new functionality for development: a source-level debugger and a production profiler. Building applications for an event-driven world presents new programming challenges, and having state-of-the-art development tools for this paradigm is a mandate. The Apama EPL is the right language for building event-driven applications - now we have a source-level debugger designed for this event paradigm. Available in the Eclipse-based Apama Studio, it provides breakpoints to suspend applications at specific points, inspection of program variables and single stepping. It works in concert with our parallelism as well. Profiling is a means to examine deployed Apama applications to identify possible bottlenecks in CPU usage.

Jamming with Java. We've enhanced our support for Java for building CEP applications. The Apama Studio includes a complete set of wizards for creating monitors, listeners and events to improve the development process when building Java-based CEP applications in Apama.

  • Integration

The (relational) world plays the event game. While we have provided connectivity to relational databases for many years, we've made a significant re-design in the architecture of how we do it with the new Apama Database Connector (ADBC). The ADBC provides a universal interface to any database and includes standard connectors for ODBC and JDBC. Through the ADBC, Apama applications can store and retrieve data in standard database formats using general database queries, effectively turning these relational engines into timeseries databases. The data can be used for application enrichment and playback purposes. To manage playback, the Apama Studio includes a new Data Player that enables back-testing and event playback from a range of data sources via the ADBC. One can replay event data, and time itself, at varying speeds. The CEP application under test behaves in a temporally consistent manner even as data is replayed at lightning speed.
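As a rough Python sketch of that style of playback (not the ADBC or Data Player API), where downstream logic keys off the recorded timestamps rather than the wall clock:

    import time

    def play_back(rows, emit, speed=10.0):
        """Replay time-ordered (timestamp, event) rows from a database query,
        compressing the recorded inter-event gaps by `speed`; pass
        float('inf') for as-fast-as-possible replay."""
        prev_ts = None
        for ts, event in rows:
            if prev_ts is not None and speed != float("inf"):
                time.sleep(max(0.0, (ts - prev_ts) / speed))
            emit(ts, event)      # consumers see recorded time, not wall-clock time
            prev_ts = ts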

Cruising at memory speed with MemoryStore. The MemoryStore is a massively scalable in-memory caching facility with built-in navigation, persistence and visualization functionality. This allows CEP applications, which typically scan, correlate and discard data very quickly, to retain selected portions in memory for later access at extreme speed. This could be for managing a financial Order Book, Payments or other data elements that the application needs to access quickly on a user's request. Furthermore, if required, the in-memory image can be persisted to a relational database for recovery or other retrieval purposes, and lastly the MemoryStore allows selected portions of the in-memory cache to be automatically mapped to dashboards.
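A toy Python sketch of the pattern (not the MemoryStore API), with SQLite standing in for the relational store:

    import sqlite3

    class MiniStore:
        """In-memory key/value cache whose image can be persisted for recovery."""
        def __init__(self, db_path=":memory:"):
            self.cache = {}                         # fast in-memory image
            self.db = sqlite3.connect(db_path)
            self.db.execute(
                "CREATE TABLE IF NOT EXISTS store (k TEXT PRIMARY KEY, v TEXT)")

        def put(self, key, value):
            self.cache[key] = value                 # e.g. an order book level

        def get(self, key):
            return self.cache.get(key)

        def persist(self):
            # snapshot the in-memory image to the relational store
            self.db.executemany(
                "INSERT OR REPLACE INTO store (k, v) VALUES (?, ?)",
                list(self.cache.items()))
            self.db.commit()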

Well, those are the highlights. There are also about a dozen other features within each of these three themes, too numerous to mention here.

We are committed to improving the Apama product by listening to our many customers, paying close attention to the ever-changing competitive landscape and researching new opportunities.

Again thanks for reading, you can also follow me at twitter, here.
Louie



Thursday, October 08, 2009

If You Build It They Will Come, An Apama Algo Webinar

Posted by Louis Lovas

If You Build It They Will Come
My colleague Dan Hubscher and I just finished the first of a two-part webinar entitled "Build Quickly, Run Fast". In this webinar we explained and demonstrated Apama as an algo platform for high frequency and order execution algorithms.

As I've blogged in the recent past, it is an arms race in high frequency trading. The need to build quickly is a demanding requirement to keep ahead in the race, and being armed with the right tools is paramount. Rapid development and customization of strategies using graphical modeling tools provides the leverage necessary to keep pace with fast moving markets.

To that point, in this webinar I demonstrated a couple of algo examples. The first was a complete strategy that incorporates an alpha element with multiple order execution options. In designing and building strategies, trading signal detection is just the first part of the problem. This typically involves an analytic calculation over the incoming market data within some segment or window of time; for example, a moving average calculation smooths out the peaks and valleys, or the volatility, of an instrument's price. Once the signal is detected, it's time to trade and manage the order's executions. This is a key distinction between other CEP products and the Apama platform for building trading strategies. While it's possible to define an Event Flow in most or all CEP products for data enrichment and data analysis (i.e. the signal detection), with most other CEP products you have to switch out to some other environment and language to build the rules that manage the executions. The Apama platform is about building complete event-driven applications. So trade signal detection and order execution, whether it's a simple iceberg execution or something much more complex, can easily be designed, built and backtested in the same Apama graphical modeling environment. (Of course, for those more inclined to traditional development tools and methodologies, Apama offers a full suite of developer tools: an EPL, debugger, profiler and Java support.)
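For illustration, a minimal Python sketch of the two halves; the window sizes and clip size are arbitrary assumptions, and this is a sketch of the concept rather than the Apama strategy shown in the webinar:

    from collections import deque

    class MovingAverageCross:
        """Signal side: fast/slow moving-average crossover over incoming prices."""
        def __init__(self, fast=20, slow=100):
            self.fast, self.slow = deque(maxlen=fast), deque(maxlen=slow)
            self.above = None

        def on_price(self, price):
            self.fast.append(price)
            self.slow.append(price)
            if len(self.slow) < self.slow.maxlen:
                return None                          # not enough history yet
            fast_ma = sum(self.fast) / len(self.fast)
            slow_ma = sum(self.slow) / len(self.slow)
            above = fast_ma > slow_ma
            signal = None
            if self.above is not None and above != self.above:
                signal = "BUY" if above else "SELL"  # crossover detected
            self.above = above
            return signal

    def iceberg_slices(total_qty, clip=100):
        """Execution side: slice a parent order into small visible clips."""
        while total_qty > 0:
            yield min(clip, total_qty)
            total_qty -= clip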

[Image: moving average crossover strategy]


The second example in the webinar demonstration was to build a small but working strategy from scratch. I did this live, in full view of the attendees. For this I did a basic price momentum strategy, which tracked the velocity of price movements. The trading signal was a parameterized threshold which indicated when the price moved up (or down) a specific amount within a specific duration.
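A minimal Python sketch of that idea, with illustrative threshold and duration parameters:

    from collections import deque

    class PriceMomentum:
        """Signal when the price moves by at least `threshold` within `duration` seconds."""
        def __init__(self, threshold=0.50, duration=30.0):
            self.threshold, self.duration = threshold, duration
            self.window = deque()                    # (timestamp, price) pairs

        def on_tick(self, ts, price):
            self.window.append((ts, price))
            while self.window and ts - self.window[0][0] > self.duration:
                self.window.popleft()
            move = price - self.window[0][1]
            if move >= self.threshold:
                return "BUY"                         # upward momentum
            if move <= -self.threshold:
                return "SELL"                        # downward momentum
            return None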

This webinar is focused on highlighting the ever-present challenges investment firms face in high frequency trading:
  • Fears of the Black Box
  • The simple fact that markets are continually evolving
  • First Mover Advantage
  • Customization is king
In the Build Quickly webinar, my colleague Dan Hubscher and I describe how the Apama platform delivers solutions to the Capital Markets industry to meet these needs and challenges.

Stay tuned for a link to the recording and don't forget to dial in to part II where we focus on performance requirements and characteristics. Again thanks for reading (plus watching the webinar), you can also follow me at twitter, here.

A follow-up note: here's the link to the recordings for both part I and part II of Build Quickly Run Fast.

Louie



Wednesday, September 30, 2009

EPTS, the Symposium of Trento

Posted by Louis Lovas

EPTS, the Symposium of Trento
How many angels can dance on the head of a pin? I suppose that was a question debated at the Council of Trent that took place in Trento, Italy back in the 16th century. However, the Event Processing Technical Society's (EPTS) annual symposium just last week took up residence in Trento to discuss and debate a host of lofty topics on event processing.

  • CEP's role and relationship to BPM (or more appropriately event-driven BPM)
  • Event Processing in IT Systems management
  • Event-based systems for Robotics
  • EPTS Working Groups ...
While the sessions and discussions on event processing did not have the global significance of angels on pinheads or the Counter-Reformation, they did provide a clear indication of just how broadly and deeply event-based systems can reach. Whether it's a business application monitoring mortgage applications, IT management systems in a Network Operation Center, bedside monitoring systems in a hospital or a robot packing pancakes into boxes, they all have a common underpinning: consuming and correlating streaming event data.

Granted, not everyone approaches it with the same viewpoint. IT Systems Management people don't think about processing and correlating events; they think about device management, KPIs, alerts and the like. Someone building or managing a business process is likely concerned with managing orders - validating them, allocating stock, warehouses and shipments. Nonetheless, a common framework model behind these systems is event processing.

Two of my favorite sessions at the EPTS Symposium were a panel session on the EPTS Mission and an open forum on Grand Challenges, a brainstorming session focused on identifying barriers to the adoption of CEP.

EPTS Mission

Four panelists, myself included, presented their expectations of the EPTS and its role as an industry consortium, its goals and what improvements can be made. As a baseline, the EPTS does have an existing mission statement, defined as ...

To promote understanding and advancement of Event Processing technologies, to assist in the development of Standards to ensure long-term growth, and to provide a cooperative and inclusive environment for communication and learning.


Given this mission statement and my own expectations, there are a number of basic things the EPTS should provide to those uninitiated in event processing:

Awareness: Provide commercial business and industry the necessary knowledge of event processing as a technology supported by numerous vendors, with continuing research in academia.
Definition: Provide a concise and definitive meaning of event processing, a taxonomy of event processing so to speak. This is both from the horizontal technology perspective and from a vertical focus for a handful of specific industries. It's often difficult for business people to understand technology without the context of a business or application focus.
Differentiation: Provide a clear distinction that defines event processing and distinguishes it from other technologies. Event processing is available in many forms; this symposium provided evidence of that. Much of it is available in specialized form, as in IT Systems management. There are also pure-play event processing (CEP) vendors, such as Progress/Apama. But there are also rules engines, Business Intelligence platforms, analytic platforms, etc. This easily presents a bewildering world filled with choice and with conflicting and overlapping marketing messages. The EPTS is in the perfect position to provide the clarity behind defining what is CEP and what isn't.
Cooperation: Event Processing rarely operates in a vacuum. There are many synergistic technologies that closely pair with CEP. Often this has a specific vertical business flavor, but often it's other platform technology such as BPM and temporal databases.


The EPTS has four working groups that have been active for the last year: Use-cases, Reference Architecture, Language Analysis and Glossary. To a large extent the working groups have provided, and are still working towards, a clear definition of CEP. However, there is still a need to highlight the salient value of event processing. For specific vertical domains, the value of CEP is clear-cut simply because the fit and function is tailor-made. In Capital Markets, for example, algo trading has all the hallmarks of a CEP application - high performance, low latency, temporal analytics and a streaming data paradigm fit for purpose. However, there are other application domains where CEP is equally viable but much more subtle. I believe the EPTS can provide a vendor-neutral taxonomy of event processing - from the basics to the advanced - and explain why it's unique and different, why language is important and how it is synergistic with a host of other technologies. To this end, the group has decided to form two new working groups to focus on many of these areas. Clearly a forward-thinking move.

The Event Processing Technical Society is an organization made up of both vendors and academics. We're held together by a common thread: a belief that the whole is greater than the sum of the parts and that our collective effort will benefit all, even as many of us are undeniably competitors.

Once again thanks for reading,  you can also follow me at twitter, here.
Louie



Sunday, August 09, 2009

Riding the Crest of the Wave... the Forrester Wave

Posted by Louis Lovas


In just a few short days since its announcement, news of the Forrester CEP Wave has spread to all corners of the globe. From trade magazines, online journals and blogs to Facebook and Twitter, the headlines are everywhere. A Google search yields thousands of hits: "Independent Research names Progress® Apama® as a Standout Leader in CEP ..."

The Forrester team of Mike Gualtieri and John Rymer state 'The Fledgling CEP Platform Market Is Vibrant, Competitive, And Dynamic'. Of course, those of us who have been immersed in event processing for the past few years already knew that; it was our job to convince Mike and John. On behalf of Progress Apama and the CEP community, I would like to extend a word of thanks and appreciation to both of them for their efforts, diligence and patience in putting this Wave together. It was an enormous task, given that they reviewed nine CEP products and vendor strategies in depth. Considering this was the first CEP Wave, they also had to define an initial blueprint for CEP by which to evaluate vendors, and they did a commendable job. Well done. You can get a complimentary copy of the pdf version from us here.

It was quite a few months ago when I and a few of my esteemed colleagues began the CEP Wave process. In the abstract it was not too different from responding to the questions in a prospect's RFP/RFI, for which I and my colleagues have much practice. However, one difference that I found unique was the format. A client proposal is generally a Word document where one can provide plenty of written detail and diagrams to depict product architecture and function. Forrester Waves are MS Excel spreadsheets, and vendors' responses to the Wave's questions have to fit into an Excel cell. Being a long-winded person, I found it a challenge to achieve the succinctness dictated by the confines of a cell. My colleagues were quite helpful to this end.

In short order, the benefits of the spreadsheet format became clear. While many documents - proposals, reviews, evaluations and others - become static paper the moment they're published, that is not the case with Forrester's Wave. There is a clear intent behind Forrester's use of the spreadsheet format; it creates a living, dynamic document for their clients. Spreadsheets by their very nature can be interactive: spreadsheet formulas can accept user input and recalculate. This capability is exactly what Forrester leverages in the CEP Wave.

The Forrester CEP Wave is divided into three categories:
  • Current Offering: A platform feature breakdown, development and deployment tools and performance characteristics.
    • Strategy: The vendor's investment in the future.
  • Market Presence: Customer base.
Within each of these categories is an entire litany of subcategories containing features and criteria by which the product and vendor are measured. Each is assigned a weight as deemed appropriate by Forrester in reviewing the CEP industry at large. Each vendor is then judged on its merits and scored. The most important aspect of this is the weighting; it is the key that gives the Wave its dynamic nature. From a client perspective, the weighting can be adjusted to suit your specific requirements. If, for example, your shop is Windows-only, you don't need a high weight on multi-platform support and can lower that value. Likewise, if you have a strong need for high availability and disaster recovery, you can increase that weighting. Making these adjustments will tune the Wave for your specific requirements, and you will then see how vendors stack up against each other with your customized weights. By doing so, what you will find is that the Apama platform pops to the top of the list all too often.
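A minimal Python sketch of that recalculation; the criteria, scores and weights below are made-up illustrations, not Forrester's actual data:

    def wave_score(scores, weights):
        """Weighted average of criterion scores, normalized by total weight."""
        total_weight = sum(weights.values())
        return sum(scores[c] * weights[c] for c in weights) / total_weight

    scores  = {"multi_platform": 5, "high_availability": 4, "dev_tools": 5}
    weights = {"multi_platform": 0.05,    # Windows-only shop: lower this weight
               "high_availability": 0.50, # strong HA/DR need: raise this one
               "dev_tools": 0.45}

    print(round(wave_score(scores, weights), 2))     # recomputed overall score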

Once again thanks for reading, you can follow me at twitter, here.
Louie