
October 2009

Friday, October 30, 2009

Progress Apama and Revenue Management Podcast

Posted by Apama Audio

Listen to this podcast to learn how organizations are using Progress Apama Business Event Processing for Revenue Management.

Wednesday, October 21, 2009

Business Event Processing Podcast Overview

Posted by Apama Audio

Listen to this podcast to get an overview of how Progress Apama Business Event Processing helps organizations to monitor, analyze and act on information in real time.

Monday, October 19, 2009

Progress Apama Announces Latest Release 4.2

Posted by Apama Audio

As a follow-up to the Louie Lovas blog posting of October 16th, this podcast captures a discussion between David Olson and Giles Nelson on Apama 4.2 features.


Friday, October 16, 2009

Apama 4.2 release - Cruising in the fast lane

Posted by Louis Lovas

The Apama engineering team has done it once again. True to our record of releasing significant new features in the Apama product every six months, the v4.2 release is hot off the presses with major new functionality. The Apama roadmap is driven by a keen sense of our customers' requirements, the competitive landscape and an opportunistic zeal. Our dedicated R&D team is driven to excellence and quality, and committed to delivering value to our customers. A consistent comment we've heard from analysts and customers alike concerns the maturity of the Apama product.

The current v4.2 release, the third in the v4.x family, adds significant enhancements along three concurrent themes - Performance, Productivity and Integration. This consistent thematic model is one we've held to for a number of years. Below I've touched upon the highlights of the current release along these themes:


  • Performance
High Performance Parallelism for Developers.  The Apama Event Processing Language (EPL) provides a set of features uniquely suited to building scalable event-driven applications. The language natively offers capabilities for event handling, correlating event streams, pattern matching, defining temporal logic and so on. Equally important, the language provides a flexible means to process events in parallel. For this we provide a context model and a new high-performance scheduler. Contexts can be thought of as silos of execution in which CEP applications run in parallel. The scheduler's role is to manage the runtime execution in an intelligent, high-performance way, and to leverage the underlying operating system threading model. It's via the context architecture that the Apama Correlator squeezes the most out of operating system threads to achieve maximum use of multi-core processors for massive vertical scalability. For IT developers, this is an effective and efficient means to build high-performance, low-latency CEP applications without the pitfalls of thread-based programming, such as deadlocks and race conditions (a hedged EPL sketch of the context model follows at the end of this section).

High Performance Parallelism for Business Analysts.  Not to be left out of the race, we've also ensured the scalable parallelism provided in the Apama CEP engine is available through our graphical modeling tool, the Event Modeler. We've had this graphical modeling capability since the very first release of Apama. This tool, designed for analysts, quantitative researchers and, of course, developers, allows you to design and build complete CEP applications in a graphical model. Parallelism is as easy as an automatic transmission: simply select P for parallel.
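To make the context model just described concrete, here is a minimal sketch in Apama EPL of fanning per-symbol work out across contexts. The Tick event, the monitor and action names, and the two-argument context constructor (a name plus a receives-input flag) are illustrative assumptions for this post, not excerpts from the 4.2 documentation:

    // Illustrative tick event; the field names are assumptions.
    event Tick {
        string symbol;
        float price;
    }

    monitor TickFanOut {
        action onload() {
            sequence<string> symbols := ["IBM", "MSFT", "ORCL"];
            string s;
            for s in symbols {
                // One context per symbol; the correlator's scheduler maps
                // these onto operating system threads on our behalf.
                spawn watch(s) to context(s, true);
            }
        }

        action watch(string sym) {
            // Runs inside its own context, in parallel with its siblings.
            on all Tick(symbol=sym) as t {
                log "Tick for " + sym + " @ " + t.price.toString() at INFO;
            }
        }
    }

Each spawned monitor instance owns its own state and the correlator guarantees that code within a single context executes serially, which is why no locking appears anywhere in the sketch.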

  • Productivity

Real men do use Debuggers (and Profilers too). The Apama Studio now sports major new functionality for development: a source-level debugger and a production profiler. Building applications for an event-driven world presents new programming challenges, and having state-of-the-art development tools for this paradigm is a mandate. The Apama EPL is the right language for building event-driven applications - and now we have a source-level debugger designed for this event paradigm. Available in the Eclipse-based Apama Studio, it provides breakpoints to suspend applications at specific points, inspection of the contents of program variables, and single stepping. It works in concert with our parallelism as well. Profiling is a means to examine deployed Apama applications to identify possible bottlenecks in CPU usage.

Jamming with Java. We've enhanced our support for building CEP applications in Java. The Apama Studio includes a complete set of wizards for creating monitors, listeners and events to improve the development process when building Java-based CEP applications in Apama.

  • Integration

The (relational) world plays the event game. While we have provided connectivity to relational databases for many years, we've significantly redesigned the architecture behind it with the new Apama Database Connector (ADBC). The ADBC provides a universal interface to any database and includes standard connectors for ODBC and JDBC. Through the ADBC, Apama applications can store and retrieve data in standard database formats using general database queries, effectively turning these relational engines into timeseries databases. The data can be used for application enrichment and playback purposes. To manage playback, the Apama Studio includes a new Data Player that enables back-testing and event playback from a range of data sources via the ADBC. One can replay event data, and time itself, at varying speeds. The CEP application under test behaves in a temporally consistent way even as data is replayed at lightning speed.

Cruising at memory speed with MemoryStore. The MemoryStore is a massively scalable in-memory caching facility with built-in navigation, persistence and visualization functionality. It allows CEP applications, which typically scan, correlate and discard data very quickly, to retain selected portions in memory for later access at extreme speed. This could be for managing a financial order book, payments or other data elements that the application needs to access quickly on a user's request. Furthermore, if required, the in-memory image can be persisted to a relational database for recovery or other retrieval purposes, and lastly the MemoryStore allows selected portions of the in-memory cache to be automatically mapped to dashboards.

Well, those are the highlights. There are also about a dozen other features within each of these three themes, just too numerous to mention.

We are committed to improving the Apama product by listening to our many customers, paying close attention to the ever-changing competitive landscape and researching new opportunities.

Again, thanks for reading. You can also follow me on Twitter, here.
Louie



Friday, October 09, 2009

Developing Event Processing Applications

Posted by Apama Audio

Listen to this podcast to hear Chris Martins and Giles Nelson discuss development of event processing applications.


Thursday, October 08, 2009

If You Build It They Will Come, An Apama Algo Webinar

Posted by Louis Lovas

My colleague Dan Hubscher and I just finished the first of a two-part webinar entitled "Build Quickly, Run Fast". In this webinar we explained and demonstrated Apama as an algo platform for high-frequency and order-execution algorithms.

As I've blogged in the recent past, it is an arms race in high-frequency trading. The need to build quickly is a demanding requirement to keep ahead in the race, and being armed with the right tools is paramount. Rapid development and customization of strategies using graphical modeling tools provides the leverage necessary to keep pace with fast-moving markets.

To that point, in this webinar I demonstrated a couple of algo examples. The first was a complete strategy that incorporates an alpha element with multiple order-execution options. In designing and building strategies, trading-signal detection is just the first part of the problem. It typically involves an analytic calculation over the incoming market data within some segment or window of time; for example, a moving average calculation smooths out the peaks and valleys - the volatility - of an instrument's price. Once the signal is detected, it's time to trade and manage the order's executions. This is a key distinction between the Apama platform and other CEP products for building trading strategies. While it's possible to define an event flow in most or all CEP products for data enrichment and data analysis (i.e. the signal detection), in most other CEP products you have to switch out to some other environment and language to build the rules that manage the executions. The Apama platform is about building complete event-driven applications, so trade-signal detection and order executions, whether a simple iceberg execution or something much more complex, can easily be designed, built and backtested in the same Apama graphical modeling environment. (Of course, for those more inclined to traditional development tools and methodologies, Apama offers a full suite of developer tools: an EPL, debugger, profiler and Java support.) A sketch of such a crossover signal in EPL follows the image below.

[Image: MovingCrossover strategy]
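For readers who prefer code to pictures, the following is a hedged EPL sketch of the alpha leg alone - a fast/slow moving average crossover over incoming ticks. The Tick event, the symbol, the window lengths and the log line standing in for the execution hand-off are all illustrative assumptions, not the strategy shown in the webinar:

    // Illustrative tick event; in the webinar this came from live market data.
    event Tick {
        string symbol;
        float price;
    }

    monitor MovingAverageCross {
        constant integer FAST := 5;    // fast window length (assumed)
        constant integer SLOW := 20;   // slow window length (assumed)
        sequence<float> prices;        // starts empty; a production version
                                       // would bound its growth
        boolean fastWasAbove := false;

        action onload() {
            on all Tick(symbol="IBM") as t {
                prices.append(t.price);
                if prices.size() >= SLOW then {
                    float fast := avgOfLast(FAST);
                    float slow := avgOfLast(SLOW);
                    boolean fastIsAbove := fast > slow;
                    if fastIsAbove and not fastWasAbove then {
                        // Upward crossover: in a complete strategy this is
                        // where the execution leg (e.g. an iceberg) takes over.
                        log "BUY signal at " + t.price.toString() at INFO;
                    }
                    fastWasAbove := fastIsAbove;
                }
            }
        }

        // Average of the last n prices seen so far.
        action avgOfLast(integer n) returns float {
            float sum := 0.0;
            integer i := prices.size() - n;
            while i < prices.size() {
                sum := sum + prices[i];
                i := i + 1;
            }
            return sum / n.toFloat();
        }
    }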


The second example in the webinar demonstration was to build a small but working strategy from scratch, live in full view of the attendees. For this I chose a basic price-momentum strategy, which tracked the velocity of price movements. The trading signal was a parameterized threshold indicating when the price moved up (or down) by a specific amount within a specific duration.
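Here is one hedged reading of that momentum signal in EPL, using nested listeners and a within clause. The Tick event is the same illustrative one as in the sketch above, and the THRESHOLD and HORIZON defaults are made up for this sketch - they stand in for the parameterization mentioned in the text:

    monitor PriceMomentum {
        constant float THRESHOLD := 0.50;  // required price move (assumed units)
        constant float HORIZON := 10.0;    // seconds in which it must happen

        action onload() {
            on all Tick(symbol="IBM") as first {
                // Fire if the price rises by THRESHOLD within HORIZON seconds;
                // a complete strategy would watch the downside symmetrically.
                on Tick(symbol="IBM", price >= first.price + THRESHOLD)
                        within(HORIZON) {
                    log "Upward momentum from " + first.price.toString() at INFO;
                }
            }
        }
    }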

This webinar focused on highlighting the ever-present challenges investment firms face in high-frequency trading:
  • Fears of the Black Box
  • The simple fact that markets are continually evolving
  • First Mover Advantage
  • Customization is king
Together with my colleague Dan Hubscher, I described in the Build Quickly webinar how the Apama platform delivers solutions to the Capital Markets industry to meet these needs and challenges.

Stay tuned for a link to the recording, and don't forget to dial in to part II, where we focus on performance requirements and characteristics. Again, thanks for reading (plus watching the webinar); you can also follow me on Twitter, here.

A follow-up note: here's the link to the recordings of both part I and part II of Build Quickly, Run Fast.

Louie



Wednesday, October 07, 2009

Business Events and Operational Responsiveness - our research

Posted by Giles Nelson

Yesterday, we published a press release on some research that we commissioned from an independent research firm. I wanted to give a bit more background to the research and how we intend to use it.

Our intent in doing this research was twofold:

(a) To discover something new about the markets that Progress operates in and validate some of our own beliefs about the market (or dispel them).

(b) To gather some interesting and relevant information to act as talking points around the things we think are important for our customers and prospective customers, as well, of course, as being commercially relevant to us.

We commissioned the research company Vanson Bourne to do this research and, whilst we worked with them on the scoping of it, it was left entirely to them to execute on that scope.

We wanted to hear from end-users, so a range of questions was posed to 400 organisations in Europe and the US in three industries - telecommunications, energy generation and logistics. No vendors, analysts or systems integrators were approached.

The questions were all around the theme of "operational responsiveness" - how good are firms at monitoring their operations, identifying issues with process execution, interacting with their customers, extracting and integrating information, and so on? In particular, how good are firms at dealing with the business events flowing around, both internally and externally, and how good are they at acting on them in a timely fashion?

Why did we pick these three verticals? Firstly, we couldn't cover everybody and we wanted to go to more companies in a few verticals rather than go very broad. Secondly, we believe that these three verticals are the most interesting when it comes to the demands being placed upon them to cope with business events (Financial services is another obvious one but we know quite a lot about the demands in that industry already). Telecommunications firms are very dependent upon IT to differentiate their services; logistics companies are using more and more technology to track goods, trucks, ships etc. to streamline and automate their operations; energy producers are having to rapidly plan for the introduction of smart metering.

We're still digesting the results. But a few are worth highlighting here. Social networking is creating a significant challenge for all organisations in dealing with customer feedback - consumers expect instant feedback to their interactions. Organisations aspire to more dynamic and real-time pricing of goods and services to increase their competitiveness and maintain margins. And companies struggle with achieving a holistic view of how their processes are operating, both to operationally identify issues before they become expensive to fix or affect customer services, and to identify ways in which processes can be shortened.

We'll be talking more about the research results soon, both qualitatively and quantitatively.


Monday, October 05, 2009

Progress Apama Capital Markets in Brazil

Posted by Apama Audio

In recent years, the Brazilian market has grown stronger and become very aggressive with algorithmic trading. Listen to this podcast, in which Dan Hubscher, just back from a conference in Brazil, shares insight into the current state of Brazil's market and what the people down there are buzzing about.


Friday, October 02, 2009

High Frequency Trading on Jon Stewart's Daily Show

Posted by Chris Martins

Well, you know that High Frequency Trading has gone mainstream when it makes Jon Stewart's The Daily Show. Check out this segment with Samantha Bee. We don't see many folks in cow costumes at our trade shows, but maybe that's another sign of HFT going mainstream?