Chris Martins

Wednesday, November 12, 2008

Apama & CQG - Partnering for Low Latency Futures Execution

Posted by Chris Martins

The 24th Annual FIA Futures & Options Expo took place in Chicago this week, and we used that venue to announce an Apama partnership with CQG. A significant player in the futures trading market, CQG offers real-time quotation, charting and technical analysis tools. Via this new partnership, CQG users can now automatically execute trades that have been signaled from within CQG's technical analysis models, leveraging algorithms offered via the Apama CEP platform.

A key element in the partnership is the deployment model.  Apama is being hosted in the CQG infrastructure such that their users do not have to install Apama locally in order to access the benefits of the integrated offering.  They can select and parameterize the execution algorithms via Apama dashboards and gain very low latency execution through CQG’s co-located facilities at major global exchanges.

Deals like this signal the growing interest in Apama. Part of the appeal is the flexibility of the underlying Apama CEP platform (and its supporting tools).  That flexibility encompasses different asset classes, the different application use cases that are needed to support those asset classes and, as is evidenced here, different deployment models. 

Shown below is a CQG screen, with the Apama dashboard component highlighted.

[Image: CQG screen with the Apama dashboard component highlighted]

Monday, November 10, 2008

Apama Capital Markets Framework

Posted by Chris Martins

There can be a tendency to presume that CEP’s early adoption in Capital Markets has been predicated solely on the performance characteristics of CEP engines. Though performance is clearly a key attribute for these systems, it is far from the only determinant of success. Apama’s success has been equally attributable to a strong understanding of how to apply CEP to specific Capital Markets requirements. That is backed up by a reservoir of experience – within both the Apama engineering team and the field organization – in applying the technology to real-world deployments.

The Apama commitment to this approach is exemplified by our recent announcement of the Apama Capital Markets Framework. Over the past year, we have announced a series of Solution Accelerators that focus on specific Capital Markets application areas and extend Apama with features that “accelerate” the speed with which our clients can deploy applications in those areas. This approach counters the “feeds and speeds” myopia that often dominates discussions about CEP, both within Capital Markets and without. Yes, Apama is fast – certainly the equal, if not the superior, of other products in the market. But all the speed in the world is of little use if your application is still under development. The speed of developing the application is equally important, and it is a mistake to dismiss the effort to bridge the gap between the power of the underlying engine and a solution for the customer as a “simple matter of programming”. Thus, Apama devotes considerable time and energy to both development tools and our Solution Accelerators, to facilitate the speed with which our customers reach that end solution.

And while we note that others are beginning to follow our path with their own "solutions", the new Capital Markets Framework takes this approach further. Apama is effectively providing a solution-focused framework for the ongoing enhancement of our existing Accelerators and the creation of new ones. This is more than just marketing lip-service - not that we ever do that anyway :-)  There is a substantive engineering commitment to the CMF and its Accelerators, one that truly capitalizes on Apama's experience in applying our technology to real-world use cases.

---------------------------------------------------------------------

As a bit of further evidence that Apama’s presence in Capital Markets is substantial, let this blog posting also briefly acknowledge that Apama co-founder, Dr. John Bates, was recently recognized by Institutional Investor as one of thirty financial technology providers that are critical to the success of the financial markets. The publication describes CEP as a "critical component in algorithmic trading systems for analyzing and acting upon vast quantities of market information." While that statement is true, the potential for CEP in Capital Markets extends far beyond algorithmic trading. And that is why the Capital Markets Framework will be a key component of Apama's efforts going forward.

Friday, August 22, 2008

CEP - Some Applications within Capital Markets

Posted by Chris Martins

There is occasional discussion in the blogosphere around the role of CEP within Capital Markets. Beyond the online chatter, we also hear via offline discussions that Apama comes up in that context, given our strong (arguably dominant) presence in that vertical. I'd rather not get embroiled in a debate about what is or is not CEP, or what the historical antecedents of CEP are. Actually I would, but I won't here. Let's just say that there have been suggestions that Apama is really not CEP, because it is an algorithmic trading platform. Or, on occasion, there is the corollary assertion that algorithmic trading is not CEP, and since Apama does algorithmic trading, it is therefore not a CEP product – an argument that is false both in its facts and in its logical structure. And it goes on - and on.

John Bates recently conducted a series of "audio interviews" that talk about some of the different usages of Apama and CEP within Capital Markets.  They might be illustrative to those who see Apama and CEP solely in terms of algorithmic trading or don't really understand algorithmic trading.  The information is not intended to be deeply technical from either a CEP or Cap Markets perspective, but hopefully provides some introductory context for understanding the real potential for CEP in delivering value within that market - and beyond.

Download Rogue Trading >

http://apama.typepad.com.nyud.net:8090/podcast/1_rogue_trading.mp3

Download Early Adoption of Complex Event Processing >

http://apama.typepad.com.nyud.net:8090/podcast/2_cep_early_adoptions.mp3 

Download Risk Management and Market Surveillance >

http://apama.typepad.com.nyud.net:8090/podcast/3_cep_and_risk_management_V2.mp3

Thursday, July 17, 2008

Rendering Unto Caesar - The Role of the Business User in CEP

Posted by Chris Martins

"Render unto Caesar the things which are Caesar's
and unto God the things that are God's"

A recent posting in the Enterprise Decision Management Blog entitled "Can we trust business users" addresses a topic that seems equally pertinent to the CEP market. I think there is a tendency to become so enamored with the technical virtuosity of new technology that we lose sight of who the real users are. In terms of the development of CEP applications, the understanding of the prospective roles of business users vs. IT developers is still evolving. Apama has long been a proponent of business-user participation in the process of building CEP-driven applications. In Capital Markets, the notion of "empowering the trader" has been a key element of our strategy, and the Apama product offers a graphical development tool, Event Modeler, that focuses on that constituency. We also offer a RAD tool intended for developers, who can create applications in our event processing language.

We are also beginning to see third-party validation of the value of this approach from outside of Capital Markets. For example, a report from Forrester Research published earlier this year indicated that early adopters of CEP have tended to come from the line of business rather than IT because "developers and architects often know painfully little about these [non-traditional, non-IT] events and how they are used to run a business." That Forrester quote is certainly not intended to diss IT. It simply recognizes that there are lots of different kinds of events that are important to a business, and not all of them are traditional IT-aware or IT-generated events. In order to make sense of and respond to such events, it seems quite logical that providing tools amenable to a more "business"-oriented audience is important.

But I would argue that it is not just the nature of events - and their varied sources - that suggests a strong correlation between CEP and business users. It is also the nature of the CEP-driven applications themselves. CEP applications are not "one and done"; they tend to be iterative and evolving, because they are crafted to respond to what is happening, and what is happening is often a frequently changing set of conditions. Or, if the conditions are not changing, how you choose to respond to them may be. So you need to continually calibrate and revise.

In another Forrester report published earlier this year, this characteristic was noted within the context of a review of Business Activity Monitoring best practices. Event-driven BAM is a particularly strong use case for CEP, and the report stated that BAM efforts "are typically never-ending projects" with a "high degree of change." That makes sense, since BAM monitors 'business activities' and the nature of most businesses will change over time. So to support BAM applications, it seems perfectly logical to provide tools for business users, who can take on some role in the initial development and ongoing operational calibration of these applications. There is clearly an important role for developers in building these applications – no one would suggest otherwise – but we had best not forget what the “B” in BAM refers to.

What seems to be emerging is the notion that we should not look at CEP and/or BAM deployments as discrete, finite projects with clearly prescribed end dates. They are continuously iterative projects that must evolve to remain effective. That’s the environment in which they operate. Given that, perhaps we should not see the roles of business users and IT as fitting within well-prescribed boundaries. The development and ongoing management of these applications will involve evolving roles for the line of business and for IT over time. We might expect IT-centric development to have a more dominant role in the initial deployment, but over time the goal might be to have the line of business assume a greater and greater role - because the business will be the dominant user and best positioned to react to changing circumstances.

Perhaps the EDM blog posting says it best, though it expresses it within a "business rules" context. “Too many rules specialists focus on rules that will get executed and not enough on the people that will maintain those rules, although this is where the success of the project resides.” That is quite the same for CEP and BAM. There is a role for business users, driven by the nature of events and the continuously evolving nature of the applications that are “event-driven.” And it is incumbent on the technology provider to offer tools that will facilitate that evolution. All the CEP performance in the world will be of little use, unless that performance is well-aligned with the needs of the business.

So we might debate who is the metaphorical Caesar and who is God in CEP development, but the success may well rest on giving each their due.

Wednesday, June 11, 2008

SIFMA 2008 - Hot and Busy

Posted by Chris Martins

This week is the SIFMA (Securities Industry and Financial Markets Association) technology management conference in NYC.  The city has been brutally hot and the NYC Hilton exhibit hall, at least where Apama is, has not been measurably cooler. Those $5.00 bottles of water that they sell in the exhibit hall might represent opportunism - or capitalism - at its best. Or maybe it’s just NYC. :-)

First day attendance has been quite good and I’d encourage any attending to stop by the Apama exhibit (Grand Ballroom, Level 2, #2415). You’ll see a range of demos, including our new 4.0 release, and see some of the “magic” of CEP in action (you’ll have to come by to understand the magic reference). A lot has been happening with Apama and we’ve used this show to make some significant announcements. 

  • A partnership with NYSE Euronext Advanced Trading Solutions that enables NYSE Euronext ATS to offer its customers Apama-based CEP capabilities for algorithmic trading, real-time risk management, and smart order routing. The announcement has received significant press pickup, including a nice story in the Financial Times.
  • A technology development collaboration with Sun Microsystems involving support of the Apama platform on Sun x64 systems running Solaris 10, as well as the launch of a new Apama “Accelerator”, the Apama Real-time Pricing Accelerator. Apama continues to focus on application templates that enhance the speed with which solutions can be deployed, and this Accelerator extends that theme into the area of market-making for bonds and other instruments.
  • Announcement of the upcoming availability of above-mentioned Apama 4.0, which introduces a new development environment, Apama Studio. Studio brings all the Apama development tools together within a single, integrated environment, complete with tutorials, sample applications and other aids. You might consider Apama 4.0 as, at its heart, a CEP Accelerator. 4.0 also includes significant enhancements to Apama’s execution performance, partially achieved via a new event messaging transport.

So a lot is happening both in NYC and at Apama.

Now if the Celtics could only close out the Lakers.

Thursday, May 01, 2008

An Apama Hat Trick

Posted by Chris Martins

Last week proved to be a busy one for Apama on the marketing front as we issued three separate announcements in conjunction with our presence at the TradeTech show in Paris.  Two of the announcements focused on customers, while the third focused on work that a partner is jointly doing with Apama in the area of market surveillance.

  • ING Wholesale Banking announced that it is expanding its use of Apama, previously focused on algorithms for Benelux Small and Mid-Caps.  ING has re-engineered those algorithms to address markets in Hungary and Poland with a variety of features that include hybrid cross asset algorithms that leverage ING’s direct access within those emerging markets.
  • SEB, the Scandinavian financial group, announced it will expand its use of Apama to deliver advanced order flow monitoring services within a compliance application.  This was a second SEB announcement, following one last year regarding the SEB deployment of Apama to support client trading in Exchange-traded equities and futures.
  • And lastly, but certainly not least, together with Detica, a key partner, we jointly announced a Market Surveillance Accelerator.  Accelerators are extensions to the core Apama platform that help our customers jumpstart their deployments, incorporating business logic and other components like sample dashboards and adapters for connectivity.  In this instance, we are combining the technical know-how and experience of Detica and Apama – both of which are now supporting projects at the FSA and Turquoise - to address the growing demand for real-time market surveillance capabilities.  We’ve previously announced Accelerators for Market Aggregation and Smart Order Routing.   And there'll be more to come.

These three announcements collectively illustrate that a key part of the value of the Apama CEP platform is its versatility.  Apama may initially be deployed in support of a specific application like algorithmic trading or a specific asset class, but upon experience with the product, many of our customers expand their use to different asset classes, different geographic markets and/or entirely different applications – like compliance or risk or market surveillance.

Friday, March 21, 2008

CEP and Real-Time Risk – “The Dog Whisperer”

Posted by Chris Martins

This week’s Financial Times published an interesting article by Ross Tieman on technology’s role as a “scapegoat” (his term) for some of the problems in financial markets. From its evocative and provocative title, “Algo Trading: the dog that bit its master”, you can get a sense of the article's theme, though a complete reading of the piece is worthwhile.

One of the challenges noted by the author is in the area of risk management. Too often, existing risk processes - and their supporting technology - focus on performing end-of-day assessments. But when much of the trading activity is quantitative, that can be much too late to detect that positions are careening out of control and risk thresholds have been breached. There seems to be growing recognition of the need for continuous calibration of positions – what amounts to real-time risk management – in order to keep pace. Perhaps the notion of a “daily VAR” may well become an artifact of the 1990s.
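To make the contrast concrete, here is a minimal Python sketch (not Apama code; the class name and parameters are hypothetical) of the shift from a once-a-day VaR report to a figure recomputed over a sliding window on every P&L observation:

```python
from collections import deque
import math

class RollingVaR:
    """Hypothetical sketch: recompute a historical VaR on every P&L tick,
    rather than once at end of day, by keeping a sliding window of the
    most recent P&L changes."""

    def __init__(self, window=250, confidence=0.95):
        self.window = deque(maxlen=window)   # oldest observations fall off
        self.confidence = confidence

    def update(self, pnl_change):
        """Fold in one new P&L observation and return the current VaR."""
        self.window.append(pnl_change)
        return self.var()

    def var(self):
        if not self.window:
            return 0.0
        # Express observations as losses (positive numbers), sorted ascending.
        losses = sorted(-x for x in self.window)
        # Pick the loss at the requested confidence quantile.
        idx = min(len(losses) - 1,
                  int(math.ceil(self.confidence * len(losses))) - 1)
        return max(0.0, losses[idx])
```

Each call to `update` gives an up-to-the-moment risk figure, so a breach can be detected on the tick that causes it rather than in tomorrow morning's report.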

Just as it powers a number of real-time trading deployments, CEP can be an equally rich technology foundation for building the kind of real-time visibility that is needed by modern risk management systems. The same technology that drives quantitative trading can equally be applied to the task of monitoring that trading and keeping it in check, if necessary.


Now risk management is an extremely complex endeavor, so I would not argue that CEP alone is the answer. But rightly implemented, CEP clearly offers the low latency infrastructure that can help drive the real-time calculations that are needed. Market regulators (e.g. FSA) and trading exchanges (e.g. Turquoise) have begun to recognize the potential of CEP to monitor market behavior. It is likely only a matter of time before trading firms awaken to the possibilities of CEP driving real-time risk systems that monitor behavior within the firms themselves.
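As a rough illustration of the monitoring idea (a sketch only; the event shape, names, and limits are invented for this example, and real risk systems are far richer), an event-driven monitor can update exposure on every fill and raise an alert the instant a limit is breached:

```python
class RiskMonitor:
    """Hypothetical sketch of event-driven position monitoring: every fill
    event updates a running exposure, and an alert fires the moment a
    limit is breached, instead of waiting for an end-of-day report."""

    def __init__(self, limits):
        self.limits = limits      # e.g. {"trader-7": 1_000_000}
        self.exposure = {}        # running absolute exposure per trader
        self.alerts = []          # (trader, exposure, limit) breaches

    def on_fill(self, trader, quantity, price):
        # Accumulate the notional value of this fill.
        self.exposure[trader] = self.exposure.get(trader, 0) + abs(quantity * price)
        limit = self.limits.get(trader)
        if limit is not None and self.exposure[trader] > limit:
            self.alerts.append((trader, self.exposure[trader], limit))
```

The point is the shape of the solution: the same stream of trade events that drives the algorithms also drives the check that keeps them honest.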

So, if you’re concerned about the technology “dog” biting its master, perhaps it’s time to consider CEP as a prospective “dog whisperer” that can help manage the risk. 

Wednesday, March 05, 2008

SmartBlocks

Posted by Chris Martins

In the world of CEP, attention to rapid application development has tended to take a back seat to the focus on latency and performance.  However, the ability to develop and deploy CEP solutions relatively quickly is important if the applications are to be able to adjust to changing circumstances.  And rapid application development and rapid application execution are certainly not mutually exclusive. For CEP to become more widely adopted – and arguably to fulfill its potential to become a pervasive mode of computing – there must be increased focus on the process of building CEP applications. 

Apama is a longstanding proponent of tools that 1) foster rapid development and 2) make the development of CEP applications more accessible to business users. This week we announced some enhancements in this area, in the context of Apama's SmartBlocks. As a bit of serendipity, this announcement follows some recent data gathered by Forrester Research that speaks to the role of business in the adoption of CEP - "CEP Adoption is Broader, Deeper, and More Business-Driven than IT May Expect", January 31, 2008. That report is gated, but a summary excerpt is available.

Friday, February 29, 2008

Taking CEP to The Next Step

Posted by Chris Martins

Standards are an area of growing interest in the complex event processing market. What will be the role of standards? In what aspects of CEP will they emerge? When?

With the premise that “in the future, it will be necessary for heterogeneous CEP products to interoperate across organizational boundaries” the Object Management Group is hosting a Technical Meeting in Washington DC to address some of the issues. As part of the agenda, fellow Apama colleague and blog contributor, Matt Rothera, will be speaking on How CEP Takes the Next Step. So if you are going to be in the DC area on March 13, and interested in this topic, you definitely should check it out.

Monday, February 11, 2008

Apama CEP Code Snippet

Posted by Chris Martins

We've posted examples of coding in alternative CEP languages in the past to illustrate how concise or verbose those approaches might be in expressing an event processing function. An example of Apama's language has made an appearance in a blog posting by Lab49. In the example, the code very crisply defines an operation in which the system responds to incoming price events, skipping intermediate events so that it always processes the latest one.
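The Apama EPL itself is not reproduced here, but the conflation pattern it expresses can be sketched in Python (the function and handler names are hypothetical): drain the queue of pending price events and hand only the newest one to the handler.

```python
import queue

def process_latest(q, handler):
    """Hypothetical sketch of the 'skip intermediate events' pattern:
    block for at least one price event, then discard any backlog so the
    handler only ever sees the most recent event."""
    latest = q.get()                      # wait for at least one event
    while True:
        try:
            latest = q.get_nowait()       # replace with any newer event
        except queue.Empty:
            break
    handler(latest)                       # process only the latest price
```

A slow consumer that uses this pattern never falls behind a fast feed; it simply trades intermediate prices for the current one, which is exactly what many pricing calculations want.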