« May 2008 | Main | July 2008 »

June 2008

Sunday, June 22, 2008

Apama's Fit in the Future of FX

Posted by John Doherty

It seems like only yesterday that Atriax died (2002), leaving FXall and FXConnect as the only multibank portals offering the Buy Side “competitive” FX prices via RFQ pricing from the banks.


In the past few years we’ve seen exponential growth in the number of FX trading venues and the services they offer. No longer are venues that provide streaming prices and anonymous matching exclusive to the banks; even the electronic FX trading duopoly of the ’80s and ’90s has been forced by the competition to allow Buy Side participants a seat at the table (via Prime Brokerage).


The changes have blurred the roles of the Buy & Sell Sides, primarily benefiting the Buy Side through greater transparency, enhanced prices and real-time market access. To the Sell Side, ECN may no longer be a term that evokes hatred, but it’s difficult for them not to remember the “good ole days” before paper-thin spreads and anonymity.


In today’s ever-fragmenting environment, Complex Event Processing (CEP) has already solved many problems for both the Buy & Sell Side. We have customers who’ve automated a variety of tasks, produced both Trading and Execution Algorithms, and who claim they’re doing twenty times the number of trades with the same number of dealers.


In the past year FX Aggregation has been “the flavour of the month.” I particularly liked it when a senior executive saw our system and simply said, “There is no fragmentation. With this system my dealers can access the whole market through just one screen.”  (learn more)


The May edition of Profit & Loss (P&L) contained an article called “Shifting Landscape: DMA, BANKS AND ECNS” which noted that “many refer to today as the Golden Age of ECNs.” It pointed out how the “Credit Crisis” and the events of last August forced many on the Buy Side to rethink the importance of maintaining relationships with banks, to ensure access to the banks’ liquidity when other pools start drying up.


Although P&L doesn’t foresee the death of ECNs, the article made a number of observations and predictions on how the banks will reassert themselves in the ever-changing landscape of FX. I’m writing to comment on CEP’s role in what’s being predicted.


The cornerstone of the future P&L describes lies in banks offering Direct Market Access (DMA) to their customers to guarantee transparency and best execution. DMA in FX will of course have its own intricacies, but this is an evolution, not a revolution, as DMA has been part of the Equity, Futures and Options markets for years.

Through FX Aggregators, banks are already merging the fragmented FX market to guarantee their own access to the best prices and deeper liquidity. Extending this to their customers for DMA is relatively straightforward, in much the same way that the Sell Side currently offers DMA via Smart Order Routers in the Equity space.
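To make the aggregation idea a little more concrete, here is a minimal Python sketch (purely illustrative: it isn’t Apama code or any bank’s implementation, and the venue names and Quote fields are my own assumptions) that consolidates streaming quotes from several venues into a single best bid/offer view:

    from dataclasses import dataclass

    @dataclass
    class Quote:
        venue: str      # e.g. "VenueA", "VenueB" (illustrative names)
        bid: float
        ask: float

    class Aggregator:
        """Keeps the latest quote per venue and derives one best bid/offer."""
        def __init__(self):
            self.latest = {}                      # venue -> most recent Quote

        def on_quote(self, q):
            self.latest[q.venue] = q              # replace that venue's quote
            return self.best()

        def best(self):
            best_bid = max(q.bid for q in self.latest.values())
            best_ask = min(q.ask for q in self.latest.values())
            return best_bid, best_ask

    agg = Aggregator()
    agg.on_quote(Quote("VenueA", bid=1.5601, ask=1.5604))
    bid, ask = agg.on_quote(Quote("VenueB", bid=1.5602, ask=1.5603))
    print(f"best bid/offer: {bid}/{ask}")         # 1.5602/1.5603

A DMA or Smart Order Routing layer would then sit on top of a consolidated view like this, sending each child order to whichever venue is currently showing the best price.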


The next prediction is that banks will add their own liquidity to the aggregated pool and, once enough customers are connected, allow them to transact with each other. This sounds very much like a Crossing Network, and a perfect home for a very fast CEP engine that can evaluate streams of orders, looking for matches.
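As a rough picture of the matching such a crossing engine performs, here is another small Python sketch (again illustrative only, with invented clients, prices and sizes; it is not Apama’s engine) that rests client orders and crosses buys against sells whenever their prices overlap:

    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class Order:
        client: str
        side: str       # "buy" or "sell"
        price: float
        qty: int

    class CrossingBook:
        """Rests client orders and crosses them whenever prices overlap."""
        def __init__(self):
            self.buys, self.sells = deque(), deque()

        def on_order(self, o):
            book, other = (self.buys, self.sells) if o.side == "buy" else (self.sells, self.buys)
            while o.qty > 0 and other:
                resting = other[0]
                crossable = o.price >= resting.price if o.side == "buy" else o.price <= resting.price
                if not crossable:
                    break
                traded = min(o.qty, resting.qty)
                mid = (o.price + resting.price) / 2
                print(f"crossed {traded} between {o.client} and {resting.client} at {mid:.5f}")
                o.qty -= traded
                resting.qty -= traded
                if resting.qty == 0:
                    other.popleft()               # fully filled, remove it
            if o.qty > 0:
                book.append(o)                    # rest whatever is left

    book = CrossingBook()
    book.on_order(Order("FundA", "buy", 1.5605, 5_000_000))
    book.on_order(Order("FundB", "sell", 1.5603, 3_000_000))   # crosses 3,000,000

A production crossing engine would of course also handle credit limits, time priority across many instruments and the routing of unmatched flow out to the aggregated pool, which is exactly the kind of continuous evaluation a fast CEP engine is built for.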


Finally, P&L sees the banks interconnecting their systems to extend the liquidity pool, thus becoming ECNs in their own right. Will this lead to the Buy Side then aggregating the Aggregators?

Sprinkle in a few trading, execution and liquidity-seeking algorithms and what P&L is predicting makes a lot of sense.


So what does this mean and what problems does it pose? It means that trading banks must be able to quickly produce and deploy new services, and be proficient at adapting them to address ever-evolving market requirements (i.e. be swift & agile). The problem is that many organisations aren’t particularly good at rapid development and deployment. Those that are will hold a real competitive advantage going forward.


Complex Event Processing fits the infrastructural needs of DMA, Aggregation & Crossing Networks well, but it’s not enough to simply have a very fast engine. The engine must readily connect to a disparate world, provide tools for rapid development, and offer back-testing facilities to ensure successful deployments.

Apama fits all these requirements: it is not only a very, very fast engine but also provides graphical tools that enable business and IT staff to collaborate and rapidly produce solutions, back-testing facilities to mitigate deployment risk, and an open integration framework that includes many existing “off the shelf” adapters.


It’ll be five years before my 20/20 hindsight lets me recall what our future turned out to be. For now, the only thing I can say with certainty is that Complex Event Processing and Apama will be playing a major role in whatever the future of FX holds.

Monday, June 16, 2008

SIFMA Retrospective

Posted by John Bates

[Image: Img00040]

As a follow-up to my colleague Louis' report on last week's SIFMA show, I thought I'd add some thoughts of my own. My conclusion is that it was the most exciting SIFMA show I have experienced. While I think attendance was down from previous years, I also think the quality of attendees was up. And for me personally, the excitement of being involved in some industry-moving announcements, as well as meeting up with many of my colleagues from capital markets firms, vendors, press and analysts, was highly invigorating.

So what were the highlights and take-aways for me?

1. CEP is clearly a theme that is getting a lot of mindshare. So many people said that CEP was a key theme of the show – which is great to hear after many years of working to help define the market. It’s also great to see this add to the momentum so soon after the Event Processing Technical Society was launched. The use cases of CEP are many and varied – and there was a lot of interest and questions around this at SIFMA. At our SIFMA booth we demonstrated five different CEP use cases on five different pods: algorithmic trading, smart order routing, managing FX market fragmentation, market surveillance and real-time bond pricing. CEP applications also continue to place new demands on the technology, and we were thrilled to demonstrate Apama 4.0 – which extends the performance and user experience of CEP to new levels. Another sign of the maturing of CEP is that very senior people in Capital Markets firms are starting to focus on CEP as an enabling technology. Marc Adler from Citigroup is a key example. He’s active in the community and on the STAC CEP committee, helping to define benchmarks. It was great to meet Marc at SIFMA and also to catch up with many other esteemed colleagues from the CEP space.

2. The liquidity wars are hotting up. It was our pleasure to be involved in a press release with NYSE Euronext which was certainly one of the big releases of the conference. Progress Apama will be hosted within NYSE Euronext as part of the exchange's Advanced Trading Solutions offering. Traders will be able to download custom logic for algorithmic trading, risk management and smart order routing into the NYSE itself, with low latency connectivity to other trading venues via Wombat and Transact Tools. This arrangement turns NYSE into a technology provider as well as a one-stop-shop liquidity provider. The announcement was picked up by major press in Europe, America and Asia, including the Financial Times – see the article here.

3. Hardware is important – and so is “green”. The increase in capital markets data volumes requires completely new software architectures – like CEP. But software is not always enough to support the low latency transport, processing and storage requirements. Many firms are turning to specialized hardware, combined with software, to create high performance solutions. Vhayu, for example, launched Squeezer – which combines hardware and software to supercharge their tick data offering. Progress Apama were also pleased to put out a joint announcement with Sun on a collaboration for end-to-end CEP solutions – combining Sun hardware and operating systems with Apama’s CEP platform and solutions. We demonstrated an end-to-end bond pricing application using the whole stack. Sun was one of the vendors with a “green” aspect to their hardware – for example, on a major CEP deployment the hardware can be scoped for peak throughput but can selectively shut down capacity to save power when event throughput is reduced. In this era of high energy costs and global warming there seems to be a lot of interest in this approach.

4. I love partying on the trading floor. Progress Apama were honored to be invited to a party at the NYSE to celebrate the latest developments at NYSE-Euronext (see picture at the top). It was a great pleasure to speak with our friends at NYSE-Euronext and to meet many of our old friends from the capital markets industry there – while sipping some delicious wine in that amazing place. In a way it is a shame that electronic trading is making the traditional trading floor a thing of the past – but there is something amazing about that place and I hope it stays just the way it is – even if it becomes a cool venue for other purposes. Thanks to NYSE-Euronext for inviting us – we had a great time.

I’m sure I’ve neglected a load of other trends and themes – but there’s my brain dump for the day. I’m interested to hear if you all agree.

John

Saturday, June 14, 2008

Successful languages - show me the code please

Posted by Louis Lovas

The SIFMA show has just ended. It was a great success: we announced a few new partnerships and our latest version, Apama 4.0. We had a constant flow of people coming by our booth, which of course increased dramatically at 3:00 pm each day when we opened the bar. We also had a magician who mesmerized the crowd with his sleight-of-hand exploits. While our fearless leader John Bates had a seemingly endless stream of journalist and analyst interviews, my colleagues and I did a three-day tour of (booth) duty. In addition to showing off the new Apama 4.0 Studio, we had demonstrations of our various Accelerators in Pricing, Smart Order Routing, Market Surveillance, FX Aggregation and good ol' Algo Trading.

SIFMA is a true technology show; vendors large and small had dazzling displays of their wares, hard and soft. For the average attendee it was the quintessential kid-in-a-candy-store experience. It was truly geek heaven. My compatriots and I fielded a wide range of questions about the Apama platform, from basic explanations of CEP in Capital Markets to how Apama is deployed across a wide range of asset classes. A consistent theme I heard from many of the attendees coming by our booth was "show me some code". The challenge of course was explaining a programming language in five minutes. It made me think of a recent blog by Mark Tsimelzon of Coral8 on what makes a programming language successful.

One of the most prominent characteristics of CEP, yet one of the most contentious, is language. Mark's reference to Slashdot links to a well-written tutorial by Daniel Pietraru on the success of general-purpose programming languages and the likelihood of newcomers (i.e. Ruby, Python) unseating the mainstays (i.e. Java, C++). In a nutshell, the answer to that question is an emphatic no. Interestingly, numerous aspects of this research on language are applicable to CEP.

Event Processing Languages come in many shapes and sizes. They build upon prior art, deriving their base syntax and semantic logic from a variety of older languages, whether that's SQL or general-purpose languages like Java or C++. The marketing departments of all vendors trumpet the merits of their chosen course. In the past I certainly have not been too quiet about this particular topic myself (although I have softened a bit lately). The very fact that the language of CEP is so fractured among vendors is a clear sign of CEP's lack of maturity. Yet it's what makes us unique, it's our special sauce, and quite frankly it gives this blog community the level of interest it enjoys. Standardization and commoditization have that Borg sense of sameness that I find a bit dull and boring (oh, I've probably just incited a riot by making that statement).

There are a number of attributes that make for a successful programming language. The mainstays of C, C++, Java and (now) C# hold a commanding 49.915% popularity rating. Daniel Pietraru ascribes this success to a number of factors, the first being similar syntax. A recognizable syntax is what gives languages that sense of familiarity, as Mark Tsimelzon so thoughtfully pointed out. But I believe there is one aspect of a programming language that is perhaps the most important of all: readability. Readability plays a huge role in the long-term survival not only of a language but of the software projects built in it. The Java language, as shown in the popularity chart of this tutorial, holds a commanding lead (20.176%) over all other general-purpose languages. This success is arguably due to a natural evolution, a Darwinian survival of the fittest so to speak (or most popular, as the case may be). Its authors wisely pruned the obtuse and wildly unreadable aspects of C++. It's interesting to note where SQL (or PL/SQL to be specific) ranks; you'll have to check that for yourself.

How does all this relate to the language of CEP? The same set of characteristics still rings true. A familiar, recognizable syntax is important, yet readability is vital. This is the approach we at Apama have taken. Our EPL, MonitorScript, is purpose-built for the event processing paradigm yet has the familiar readability of mainstay languages like Java. MonitorScript is predominately an imperative programming language with declarative constructs purposed for the event paradigm. Yet even the declarative on all Tick(symbol="IBM") is an easily understood concept. In a short five-minute session a Java, C++ or C# programmer will see not only a familiar syntax but also readable semantics, even in a reasonably complex MonitorScript application. This makes the first impression of Apama and our EPL a good one.
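For readers who have never seen the paradigm, the following Python sketch approximates what a declarative listener such as on all Tick(symbol="IBM") expresses inside otherwise imperative code. It is only an analogy: it is not MonitorScript, and the EventEngine class is an invented stand-in rather than part of Apama.

    from dataclasses import dataclass

    @dataclass
    class Tick:
        symbol: str
        price: float

    class EventEngine:
        """Registers (filter, action) pairs and fires actions for matching events."""
        def __init__(self):
            self.listeners = []                   # list of (matches, action) pairs

        def on_all(self, matches, action):
            self.listeners.append((matches, action))

        def send(self, tick):
            for matches, action in self.listeners:
                if matches(tick):
                    action(tick)

    engine = EventEngine()
    # roughly the intent of: on all Tick(symbol="IBM") as t { ... }
    engine.on_all(lambda t: t.symbol == "IBM",
                  lambda t: print(f"IBM tick at {t.price}"))
    engine.send(Tick("IBM", 121.45))              # fires the listener
    engine.send(Tick("MSFT", 27.80))              # ignored

The point of the EPL, of course, is that the filter and the listener body are written declaratively in a single readable line rather than wired together by hand as above.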

 

Thursday, June 12, 2008

Not Trading but Pricing

Posted by Richard Bentley

(with apologies to Stevie Smith)

Wednesday's announcement at the SIFMA conference of our collaboration with Sun Microsystems highlighted two aspects of our joint endeavor: our support for the Sun Solaris 10 platform on x64 and SPARC in the latest Apama 4.0 release, and the development of a new Accelerator in the area of real-time pricing. The latter is a very interesting application of CEP technologies which I want to highlight further.

If there was still a need to justify a broader remit for CEP in Capital Markets beyond Algorithmic Trading, the area of real-time pricing fits the bill nicely. It bears all the hallmarks of an ideal CEP use case: an over-arching need to respond in real time to high volumes of streaming events from a variety of sources. Real-time response is critical, as a bad price shown to the market, even for a few seconds, can result in huge exposure for the price maker as clients gleefully hit the price as fast as they can punch the keys - or, more likely nowadays, as algo engines react to the sudden opportunity.

Over the last few years we have deployed the Apama platform in support of real-time pricing use cases with many clients across several asset classes. Besides the real-time imperative, these use cases share other commonalities (a rough sketch of how they combine follows the list):

  • a multitude of (real-time) inputs: prices are derived as a function of multiple sources, for example real-time price of underlying (derivatives) or related instruments, market volatility, direction and size of deals by "indicative" clients, sector indices, news headlines and more;
  • a desire for price differentiation: the ability to make prices specific to individual clients, or "tiers" of clients, based on client relationships, agreed deal sizes, previous and desired success at winning client business (the "hit rate") etc;
  • risk management: real-time tracking of the current position or inventory accumulated through dealing on the published prices; the greater the inventory, the higher the potential exposure - so make prices to drive client trade flows in a certain direction to reduce the magnitude of the inventory and therefore the risk.
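To show how these three concerns combine, here is a minimal Python sketch of a price maker that derives a client-tier-specific two-way price from an underlying mid and skews it against the current inventory. The tier spreads, skew factor and sizes are invented for illustration; this is not the Real-time Pricing Accelerator.

    # Tier spreads, skew factor and sizes are invented for illustration.
    TIER_SPREAD = {"tier1": 0.0002, "tier2": 0.0005, "tier3": 0.0010}
    MAX_INVENTORY = 50_000_000    # position at which the skew is strongest
    SKEW_FACTOR = 0.0003          # how far the mid is shifted at that point

    def make_price(mid, tier, inventory):
        """Return (bid, ask) for one client tier given the current inventory.

        A long position (inventory > 0) shifts both sides down, encouraging
        clients to buy from us and flatten the position; a short position
        does the opposite.
        """
        skew = -SKEW_FACTOR * max(-1.0, min(1.0, inventory / MAX_INVENTORY))
        half_spread = TIER_SPREAD[tier] / 2
        skewed_mid = mid + skew
        return round(skewed_mid - half_spread, 5), round(skewed_mid + half_spread, 5)

    # Long 30M: both tiers see prices nudged lower so clients lift our offer.
    print(make_price(mid=1.5600, tier="tier1", inventory=30_000_000))
    print(make_price(mid=1.5600, tier="tier3", inventory=30_000_000))

In production the mid and the inventory would themselves be driven by the streaming inputs listed above, and the prices re-published fast enough that a stale quote is never left showing.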

Of course, the Banks don't stream prices to their clients out of charity - clients pay a premium to deal on instruments they would otherwise need their own direct market connections, memberships and trading software to access. In addition, some Banks - the "Market Makers" - are contractually obliged to provide liquidity by streaming prices for specific instruments to electronic markets (before you start to fret, they get well rewarded for this seeming altruism). Market Makers are usually obliged to provide prices, within certain spreads, for an agreed number of hours of the trading day; monitoring adherence to ensure a Bank is fulfilling its market making obligations is yet another facet of the real-time pricing space where CEP can - and does - play a part.
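To round the picture out, here is a minimal Python sketch of that monitoring side: it accumulates, from the bank's own published quotes, the time spent quoting within an agreed maximum spread and reports the coverage achieved. The spread limit, coverage target and quote fields are illustrative assumptions, not any exchange's actual rules.

    from dataclasses import dataclass

    MAX_SPREAD = 0.0005           # illustrative contractual maximum spread
    REQUIRED_COVERAGE = 0.90      # e.g. quote within spread 90% of the session

    @dataclass
    class QuoteUpdate:
        timestamp: float          # seconds since the session opened
        bid: float
        ask: float

    class ObligationMonitor:
        """Accumulates the time the published quote stayed within the allowed spread."""
        def __init__(self):
            self.last_time = 0.0
            self.last_compliant = False
            self.compliant_seconds = 0.0

        def on_quote(self, q):
            if self.last_compliant:               # credit the interval just ended
                self.compliant_seconds += q.timestamp - self.last_time
            self.last_time = q.timestamp
            self.last_compliant = (q.ask - q.bid) <= MAX_SPREAD

        def coverage(self, session_seconds):
            return self.compliant_seconds / session_seconds

    mon = ObligationMonitor()
    mon.on_quote(QuoteUpdate(0, 1.5600, 1.5604))      # within spread
    mon.on_quote(QuoteUpdate(3600, 1.5600, 1.5610))   # too wide from here on
    mon.on_quote(QuoteUpdate(7200, 1.5601, 1.5605))
    ok = mon.coverage(7200) >= REQUIRED_COVERAGE
    print(f"coverage so far: {mon.coverage(7200):.0%}, obligation met: {ok}")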

Wednesday, June 11, 2008

SIFMA 2008 - Hot and Busy

Posted by Chris Martins

This week is the SIFMA (Securities Industry and Financial Markets Association) technology management conference in NYC.  The city has been brutally hot and the NYC Hilton exhibit hall, at least where Apama is, has not been measurably cooler. Those $5.00 bottles of water that they sell in the exhibit hall might represent opportunism - or capitalism - at its best. Or maybe it’s just NYC. :-)

First day attendance has been quite good and I’d encourage anyone attending to stop by the Apama exhibit (Grand Ballroom, Level 2, #2415). You’ll see a range of demos, including our new 4.0 release, and see some of the “magic” of CEP in action (you’ll have to come by to understand the magic reference). A lot has been happening with Apama and we’ve used this show to make some significant announcements.

  • A partnership with NYSE Euronext Advanced Trading Solutions that enables NYSE Euronext ATS to offer its customers Apama-based CEP capabilities for algorithmic trading, real-time risk management, and smart order routing. The announcement has received significant press pickup, including a nice story in the Financial Times.
  • A technology development collaboration with Sun Microsystems involving support of the Apama platform on Sun x64 systems running Solaris 10, as well as the launch of a new Apama “Accelerator”, the Apama Real-time Pricing Accelerator. Apama continues to focus on application templates that enhance the speed with which solutions can be deployed, and this Accelerator extends that theme into the area of market-making for bonds and other instruments.
  • Announcement of the upcoming availability of the above-mentioned Apama 4.0, which introduces a new development environment, Apama Studio. Studio brings all the Apama development tools together within a single, integrated environment, complete with tutorials, sample applications and other aids. You might consider Apama 4.0 as, at its heart, a CEP Accelerator. 4.0 also includes significant enhancements to Apama’s execution performance, partially achieved via a new event messaging transport.

So a lot is happening both in NYC and at Apama.

Now if the Celtics could only close out the Lakers.

Friday, June 06, 2008

Apama Wins The Banker Award for Third Time

Posted by Chris Martins

As a bookend to a posting last week about an award from Profit & Loss, Apama has won another award, this time from The Banker magazine. Apama's Market Aggregation Accelerator was selected for FX in the Wholesale & Capital Markets category. The publication indicates that its evaluation criteria span a broad range of factors, which I won't attempt to list here. Details should be in the next issue of the magazine.

We like to call out these awards because they illustrate the real-world power of CEP and show how a well-architected platform can address a broad set of requirements, spanning asset classes and different use cases. This particular award also validates the Apama Accelerator strategy of extending the core platform with capabilities that focus on specific market requirements. We suspect that some other vendors in the CEP space might follow our lead in this. Raw feeds and speeds have typified the marketing focus of others. But the real power of CEP is in addressing business requirements, which has long been the Apama focus. Otherwise all the performance in the world will be for naught.

Of particular note is that this is the third time in five years that Apama capabilities have been cited by The Banker, as Apama also won in 2004 and 2006. So even years seem to be particularly favourable for us (I'll insert the "u" in favourable in deference to the fact that The Banker is a British publication). :-)

[Image: The Banker 2008 FX award]

Wednesday, June 04, 2008

Event Processing Technical Society

Posted by Chris Martins

Another milestone in the evolution of the CEP market (I'll purposely not use the word "maturity", given the consternation that term seems to generate) was yesterday's announcement of the Event Processing Technical Society. This consortium of vendors, academia, and other industry participants has formed to "facilitate the adoption and effective use of event processing." As one of the founding members, Progress is happy to see this effort get going in a more public manner, after several years of meetings and online discussion.

Not a standards body, the EPTS will instead work to advance an understanding of what event processing is and how it can be used, via promotion of use cases, best practices, academic research and other initiatives. One factoid in the announcement was a Gartner citation that a typical large company deals with “100,000 to 1 million business events per second.” The value of harnessing that information suggests a vibrant market opportunity that hopefully all market participants can get behind.

Monday, June 02, 2008

CEP Maturity Models

Posted by Louis Lovas



With the contentious debate on CEP maturity brewing, Tim Bass is using the Gartner Hype Cycle to indicate that CEP is in the Technology Trigger phase - certainly an arguable point. Considering that the next two phases in the Gartner Hype Cycle are the "Peak of Inflated Expectations" and the "Trough of Disillusionment", I'm not so sure we are at such an early phase. Those two terms give the impression that CEP is very nascent, and I personally don't think it is. One's view of the maturity of a software platform is predicated on past experiences and use cases. In my most recent blog on high availability, the idea of a mandate on maturity was a point I was attempting to convey in the notion of Lost Opportunity vs. Loss-Less. A Loss-Less use case clearly requires a mature platform to support such a business-critical function.

As with most software infrastructure platforms, CEP being no exception, one can describe two complementary maturity models: what a platform has and what a platform does.

A CEP platform has (or should have) development tools, deployment & management tools, connectivity adapters, database adapters, dashboards, and a robust architecture for reliability, scalability and high availability. All of these infrastructure capabilities have a maturity life cycle within a CEP platform and are paramount to the customer IT organizations responsible for the care and feeding of applications.
What a CEP platform does refers to the sort of applications one can build with the technology. Can one do just simple pattern detection, or infinitely more complex analytics?

These two categories have independent maturity life cycles but are inter-related. To illustrate, consider the following fictitious example. Say we have an application whose sole purpose is to count. What this application does is count an integer number continually for each connected client. It simply needs to count upwards, 7x24, without missing a beat; failure to count means catastrophic business failure with financial and legal ramifications. It starts out counting for just a few clients but over time grows to support thousands or tens of thousands of clients. In order to support the deployment of our application, we need a platform with the reliability demanded of a 7x24 operation. A high availability architecture is mandated in the event of a failure, and the platform must be able to scale as the business grows. This simple example illustrates the point that even the simplest application, if critical to the business, needs a mature platform. Just substitute the counting application for any more complex use case - Smart Order Routing, Dark Pool trading, Fraud detection, etc.

What a CEP platform has tracks independently of what it is capable of doing. The maturity of what a CEP platform has is much more quantifiable: we just need to look at other platforms, such as enterprise-class Application Servers, Messaging Systems and Database systems, and for the most part follow suit. What CEP does is likely what Tim is referring to when he states we're in the Technology Trigger phase. This maturation process will move much more slowly and will expand as new use cases are discovered.