February 2010

Monday, February 22, 2010

Peas and Carrots

Posted by Louis Lovas

In the words of the inimitable Forrest Gump, some things go together like peas and carrots. Truer words were never spoken. Some things just do go together well, sometimes by design, often by accident. I don't think anyone actually planned milk and cookies or popcorn at the movies, but nonetheless these things are made for each other. When it comes to technology, the same harmonious relationships exist.

In the recent Aite report on High Performance Databases (HPDBs), the market for specialized databases is surveyed along with a handful of vendors in this space. This is a cottage industry where the big database vendors don't play. It's hard to imagine, in an age when database technology is so standardized and mature and choices abound from commercial products to open source, that any other database technology and a band of vendors would stand a chance. Yet it is happening, and it's thriving.

I believe it has to do with a synergistic relationship with event processing. If CEP is the "peas", then HPDBs are the "carrots". These two technologies share two fundamental precepts:

  •  A focus on Extreme Performance
  •  Temporal Awareness

I. Extreme Performance, Speeds and Feeds
These HPDBs, often referred to as tick databases, are found in the same playground as event processing technologies. In the Capital Markets industry they connect to the same market sources and consume the same data feeds. Both technologies are designed to leverage modern multi-core hardware to consume the ever-increasing firehose of data. By the same token, once that data is stored on disk, database query performance is equally important. The massive amount of data collected is only as good as the database's ability to query it efficiently, thus creating another (historical) firehose of data for which an event processing engine is the consummate consumer.

II. Temporal Awareness, when is the data
Time is a basic principle in event processing technology; applications typically take as their premise the analysis of data-in-motion within a window of time. The HPDB design center is to store and query time series data. Some of the database vendors even raise time to a higher-level business function: they understand the notion of a business calendar, knowing business hours, the business week, holidays, trading hours, and so on. Imagine the simplicity of a query where you ask for 'business hours, Mon-Fri, for the month of February' and the database itself knows the third Monday was Presidents Day and skips over it, preventing analytic calculations from being skewed.
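As a rough, vendor-neutral illustration of that idea, the sketch below uses Python and pandas to build a holiday-aware business calendar for February 2010 and filter a hypothetical in-memory tick DataFrame down to trading hours. It is not any particular HPDB's query language, just the shape of the query.

import pandas as pd
from pandas.tseries.holiday import USFederalHolidayCalendar
from pandas.tseries.offsets import CustomBusinessDay

# Business days for February 2010; US federal holidays (Presidents Day, Feb 15) drop out automatically
us_business_day = CustomBusinessDay(calendar=USFederalHolidayCalendar())
trading_days = pd.date_range("2010-02-01", "2010-02-28", freq=us_business_day)

def business_hours(ticks: pd.DataFrame) -> pd.DataFrame:
    """Keep only ticks that fall on a trading day, between 09:30 and 16:00 (DatetimeIndex assumed)."""
    in_session = ticks.between_time("09:30", "16:00")
    return in_session[in_session.index.normalize().isin(trading_days)]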

Leveraging the Synergy
These two fundamental shared principles provide the basis for a unique set of business cases that can only be realized by leveraging Event Processing platforms and High Performance Databases together:

  • Back-testing algorithms across massive volumes of historical data, compressing time
What if you could test new trading algorithms against the last 6 months, or 1-2 years, of historical market data but run that test in a matter of minutes? What if you could be assured that the temporal conditions of the strategies (e.g. timed limit orders) behaved correctly and deterministically, matching the movement of time in complete synchronicity with the historical data? These are just a few of the characteristics that define the harmony between event processing and high performance (tick) databases; a generic sketch of this event-time replay idea appears after this list.
  • Blending live and historical data in real-time
Querying historical data in flight to obtain volume curves, moving averages, the latest VWAP and other analytics is possible with these high performance databases. Leading-edge trading algorithms are blending historical context with the live market and even news. The winners will be those that can build these complex algos and still maintain ultra-low latency.
  • Pre-Trade Risk Management
Managing positions, order limits and exposure is necessary; doing it in real time to manage market risk is a mandate. In addition to market data, these high performance databases can store pre- and post-trade activity to complement event-based trading systems and become the basis for trade reporting systems.
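To make the back-testing point above more concrete, here is a minimal, generic sketch of event-time replay in Python. It is not the Apama engine's implementation, and the tick shape and callback names are hypothetical; the point is simply that the clock is driven by the data's own timestamps, so timed behaviour fires deterministically no matter how fast the test runs.

import heapq
import itertools
from typing import Callable, Iterable, Tuple

Tick = Tuple[float, dict]   # (event timestamp in seconds, tick fields) - a hypothetical shape

class EventTimeReplayer:
    """Drives a strategy from the data's own timestamps instead of the wall clock."""

    def __init__(self) -> None:
        self._timers = []              # heap of (due_time, seq, callback)
        self._seq = itertools.count()  # tie-breaker so callbacks are never compared

    def schedule(self, due_time: float, callback: Callable[[], None]) -> None:
        """Ask for `callback` to fire once simulated (event) time reaches `due_time`."""
        heapq.heappush(self._timers, (due_time, next(self._seq), callback))

    def run(self, history: Iterable[Tick], on_tick: Callable[[float, dict], None]) -> None:
        for ts, fields in history:
            # Fire any timers that came due in event time before this tick arrives
            while self._timers and self._timers[0][0] <= ts:
                _, _, callback = heapq.heappop(self._timers)
                callback()
            on_tick(ts, fields)

A strategy placing a 30-second limit order would call schedule(ts + 30, cancel_order); the cancellation then fires exactly 30 simulated seconds later on every run, whether the six-month test takes minutes or hours.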

In the trading lifecycle, Event Processing and High Performance Databases are partner technologies, harmoniously bound together to form a union where the whole is greater than the sum of the parts. They are the peas and carrots that together create a host of real-world use cases that would not be possible as individual technologies.

My colleague Dan Hubscher and I are doing a 3-part webinar series entitled "Concept to Profit". The focus is on event processing in the trade lifecycle, but we include cases that touch upon high performance databases. You can still register for part 2, Building Trading Strategies in the Apama Workbench, where I will focus on the tools for strategy development aimed at the IT developer.

Once again, thanks for reading. You can follow me on Twitter, here.
Louie

Friday, February 19, 2010

The Debate on Dark Pools

Posted by Chris Martins

Dr. Giles Nelson, Chief Technology Strategist with Progress Software and a co-founder of Apama, has weighed in on the continuing debate about the possible need to regulate dark pools (Why The Outlawing Of "Dark Liquidity Pools" Debate Rumbles On) in FreshBusinessThinking.com.  Beyond the thoughtful summary of the current issues, one might want to read any article that includes a reference to "the devil's spawn."   

Tuesday, February 09, 2010

Brazil Embraces High Frequency Trading - Do You?

Posted by Dan Hubscher

The trading business feels like a fight, now as ever, with the threat of sweeping regulations as the most pressing concern of the moment. This business even sounds like a war - with algorithm names like "Raider" and "Sniper," and with terms like "dark pools" and "low latency arms race" drawing focus from regulators and media alike. But the war-like aspect remains because trading is a highly competitive business.

The imperative to increase market share will remain a top priority this year, along with risk management and regulatory compliance, and the technology required to compete is available to everyone. As Apama has expressed before here, the markets are still driven by those with the flexibility to quickly adapt to new regulations, the insight to understand new market behaviors, and the imagination to conceive a trading strategy that can capitalize on the opportunity. It's no wonder that the relatively small number of firms using high frequency trading strategies accounts for over 70% of US equity trading volume. These pressures push the rest of the capital markets in the same direction, and the trend is unlikely to reverse.

On February 8th 2010, Apama announced that Banco Fator Corretora, a Brazilian bank and brokerage firm, has deployed the Progress® Apama® Algorithmic Trading Accelerator. Apama plays a critical role in Banco Fator’s new electronic trading strategy, enabling it to more effectively develop high frequency, proprietary trading tactics, achieve rapid customization, and perform low latency execution of trades on behalf of its buy-side clients.  Banco Fator is also working with its clients to design customized algorithmic trading strategies that provide them significant competitive advantage, and the bank explicitly emphasized the importance of providing its clients with a fast method to enter the high frequency trading business.

Why so much emphasis on high frequency trading (HFT)? Reasons will differ among traders and regions, but a short primer on HFT and some ideas are here. The Brazilian market has expressed a strong opinion on the matter: over 15 customers have deployed Apama internally to automate execution and/or alpha-seeking strategies in the past year and a half, and many are further rolling out the platform to downstream clients.

So, do you have an opinion on HFT as well?  What characteristics should a platform for HFT have to enable you to be more competitive?  Let us know - leave a comment, or take our poll.

-Dan

Friday, February 05, 2010

CEP consolidation continues

Posted by John Bates

It’s been another remarkable week. I told my wife there would be a lot of traveling for me in the first part of the year and I was right. Last week was New Jersey and New York. This week was Dallas Fort Worth and Silicon Valley. I’ve been visiting key customers, journalists and analysts.

This week has also seen further consolidation in the CEP market. I have been predicting that there could not be a stand-alone CEP market and that CEP will either find a home in applications, databases, stacks or business application platforms. In this case Sybase has snapped up Aleri to extend its database business into the CEP domain, as well as into solutions in the risk space. Aleri are a good company with good people and good products. They come from the "in-memory database" perspective but developed a high-performing CEP engine and learned from real customers that a SQL approach alone is not adequate for real applications, and that embedded action statements are needed within a CEP language. They also learned that the best way to sell CEP is not as a technology but as solutions.

I think Sybase have made a smart move – probably at a bargain price, judging by the release, which says they are acquiring the assets only. I wish my friends at Aleri all the best for the future.

Monday, February 01, 2010

From Concept to Profit in No Time Flat – High Frequency Trading

Posted by Chris Martins

Colleagues Dan Hubscher and Louie Lovas have begun a great webinar series that outlines the “lifecycle” of an algorithmic trading strategy. Using Apama’s Event Modeler, they walk through a commodity futures trading strategy to illustrate how trading firms can accelerate the delivery of trading strategies with a development tool that is accessible to the trading desk. This can help cut development times significantly, allowing firms to capitalize more quickly on opportunities. Future sessions will explore some of the other aspects of the platform that target developers, recognizing that firms have different business models and development styles.

[Lifecycle image]

We’ll be posting the recorded version shortly for on-demand viewing, but if you have not registered, I’d encourage you to click here and get on board for parts 2 and 3.