The Event Processing Market

Monday, February 22, 2010

Peas and Carrots

Posted by Louis Lovas

In the words of the inimitable Forrest Gump, some things go together like peas and carrots. Truer words were never spoken. Some things just do go together well, sometimes by design, often by accident. I don't think anyone actually planned milk and cookies or popcorn at the movies, but nonetheless these things are made for each other. When it comes to technology, the same harmonious relationships exist.

In the recent Aite report on High Performance Databases (HPDBs), the market for specialized databases is surveyed along with a handful of vendors in this space. This is a cottage industry where the big database vendors don't play. It's hard to imagine, in a day and age where database technology is so standardized and mature and a multitude of choices abounds from commercial products to open source, that any other database technology and a gang of vendors would have a chance. Yet it is happening, and it's thriving.

I believe it has to do with a synergistic relationship to event processing. If CEP is the "peas" then HPDBs are the "carrots". These two technologies share two fundamental precepts:

  •  A focus on Extreme Performance
  •  Temporal Awareness

I. Extreme Performance, Speeds and Feeds
These HPDBs, often referred to as tick databases, are found in the same playground as event processing technologies. In the Capital Markets industry they connect to the same market sources and consume the same data feeds. Both technologies are designed to leverage modern multi-core hardware to consume the ever-increasing firehose of data. By the same token, once that data is stored on disk, database query performance is equally important. The massive amount of data collected is only as good as the database's ability to query it efficiently, which creates another (historical) firehose of data for which an event processing engine is the consummate consumer.

II. Temporal Awareness, When is the Data
Time is a basic principle in event processing technology; applications are typically premised on analyzing data in motion within a window of time. HPDBs' design center is storing and querying time-series data. Some of the database vendors even raise time to a higher-level business function. They understand the notion of a business calendar: business hours, the business week, holidays, trading hours and so on. Imagine the simplicity of a query for 'business hours, Mon-Fri, for the month of February' where the database itself knows the third Monday is Presidents Day and skips over it, preventing analytic calculations from skewing erroneously.
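To make the idea concrete, here is a minimal sketch of a calendar-aware filter. It is illustrative only, not any vendor's query API: plain Python with pandas, using its built-in US federal holiday calendar, and assuming a tick DataFrame indexed by timestamp.

    import pandas as pd
    from pandas.tseries.holiday import USFederalHolidayCalendar

    def business_hours(ticks: pd.DataFrame, start: str, end: str) -> pd.DataFrame:
        """Keep only ticks on business days (holidays excluded) that fall
        within regular trading hours, 09:30-16:00."""
        holidays = USFederalHolidayCalendar().holidays(start=start, end=end)
        days = pd.bdate_range(start, end, freq="C", holidays=holidays)
        on_business_day = ticks.index.normalize().isin(days)
        return ticks[on_business_day].between_time("09:30", "16:00")

Run over February 2010, the business-day mask drops Monday, February 15 (Presidents Day) automatically, so averages computed over the result are not diluted by a silent market.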

Leveraging the Synergy
These two fundamental shared principles provide the basis for a unique set of business cases that are only realized by leveraging Event Processing platforms and High Performance Databases together:

  • Back-testing algorithms across massive volumes of historical data, compressing time
What if you could test new trading algorithms against the last 6 months, or even 1-2 years, of historical market data, but run that test in a matter of minutes? What if you could be assured that the temporal conditions of the strategies (e.g. timed limit orders) behaved correctly and deterministically, matching the movement of time in complete synchronicity with the historical data? (A short sketch of this idea follows the list below.) These are just a few of the characteristics that define the harmony between event processing and high performance (tick) databases.
  • Blending live and historical data in real-time
Querying historical data in-flight to obtain volume curves, moving averages, the latest VWAP and other analytic calculations is possible with these high performance databases. Leading-edge trading algorithms are blending a historical context with the live market and even news. The winners will be those that can build these complex algos while maintaining ultra-low latency.
  • Pre-Trade Risk Management
Managing positions, order limits and exposure is necessary; doing it in real-time to manage market risk is a mandate. In addition to market data, these high performance databases can store pre- and post-trade activity to complement event-based trading systems and become the basis for trade reporting systems.
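As promised above, here is a minimal sketch of the time-compressed, deterministic replay idea from the first bullet. It is hypothetical Python, not Apama's actual replay machinery: the historical events drive a simulated clock, so timed conditions fire at the same points in data time on every run, while months of history replay in minutes of wall time.

    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Event:
        time: float                   # timestamp taken from the historical feed
        payload: dict = field(compare=False, default_factory=dict)

    def replay(events: list, strategy) -> None:
        """Drive the strategy from historical events as fast as they can be
        read; 'now' is always the event's timestamp, never the wall clock."""
        heapq.heapify(events)         # strict timestamp order
        while events:
            event = heapq.heappop(events)
            strategy.on_event(event, now=event.time)   # deterministic time

Because 'now' comes from the data rather than the machine, a 30-second order timeout expires at exactly the same tick on every run, which is what makes back-test results repeatable.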

In the trading lifecycle, event processing and high performance databases are partner technologies, harmoniously bound together to form a union where the whole is greater than the sum of the parts. They are the peas and carrots that together create a host of real-world use cases that would not be possible as individual technologies.

My colleague Dan Hubsher and I are doing a 3-part webinar series entitled "Concept to Profit". The focus is on event processing in the trade lifecycle, but we include cases that touch upon high performance databases. You can still register for part 2, Building Trading Strategies in the Apama WorkBench, where I will focus on the tools for strategy development aimed at the IT developer.

Once again, thanks for reading. You can follow me on Twitter, here.
Louie

Friday, February 05, 2010

CEP consolidation continues

Posted by John Bates

It’s been another remarkable week. I told my wife there would be a lot of traveling for me in the first part of the year and I was right. Last week was New Jersey and New York. This week was Dallas Fort Worth and Silicon Valley. I’ve been visiting key customers, journalists and analysts.

This week has also seen further consolidation in the CEP market. I have been predicting that there could not be a stand-alone CEP market and that CEP will either find a home in applications, databases, stacks or business application platforms. In this case Sybase has snapped up Aleri to extend its database business into the CEP domain, as well as into solutions in the risk space. Aleri are a good company with good people and good products. They come from the "in-memory database" perspective but developed a high-performing CEP engine, and they learned from real customers that a SQL approach is not adequate for real applications and that embedded action statements are needed within a CEP language. They also learned that the best way to sell CEP is not as a technology but as solutions.

I think Sybase have made a smart move – probably for a bargain price, judging by the release that says they acquired the assets only. I wish my friends at Aleri all the best for the future.

Monday, January 11, 2010

Why businesses must evolve their business processes to be highly responsive, dynamic and predictive – or they will cease to be competitive

Posted by John Bates

Today Progress Software announced the acquisition of Savvion http://web.progress.com/inthenews/progress-software-co-01112010.html. I believe this heralds the beginning of a very exciting phase for Progress Software. Now Progress has become a leader in Business Process Management (BPM). But more than that, combined with our other solutions, Progress is now uniquely able to empower businesses to be operationally responsive – through responsive, dynamic and predictive business processes. And this is critical to keep modern businesses competitive.

You might wonder about the journey Progress went through to realize what the market needed. It was all about understanding the emerging needs of our customers and where they needed their businesses to go. The part of my job I enjoy the most is spending time with customers and understanding what pain points they have - with the ultimate goal of working with them to address the pain and making them highly competitive.

Over the last couple of years I have been hearing more and more from customers about the need to be operationally responsive. For example, many customers have expressed their desire to proactively – and often in real-time - address the needs of their customers and respond to the behavior of their competitors. The goals are to win new business, increase customer satisfaction and triumph over their competitors. These findings hold true whether the customer be in banking, insurance, communications, travel, transport, logistics, energy, gaming or many other industries. It could be British Airways ensuring their high value customers are looked after first in the event of a flight delay, or wireless carrier 3Italia pushing real-time offers to their customers based on their profile, activity and location, or maritime logistics provider Royal Dirkzwager dynamically adjusting the course and speed of a container ship to optimize fuel usage, based on weather conditions and port berth availability.

Operational responsiveness is thus about being highly responsive to opportunities and threats – and even anticipating such scenarios. Market research supports what I’ve been hearing, such as the recent survey by Vanson Bourne http://web.progress.com/en/inthenews/companies-stuck-in-o-10062009.html – suggesting Operational Responsiveness has moved from a nice-to-have to a must-have.

There are a number of business-facing solutions that have shown great promise in addressing operational responsiveness. One of those is Business Transaction Assurance (BTA). This enables businesses to discover their business processes and gain visibility into the effectiveness of those processes – even if they are built on a wide variety of heterogeneous technologies and span legacy applications. BTA non-disruptively discovers business processes – without any modification to existing applications – and monitors them to ensure they run to completion. BTA also discovers bottlenecks and hotspots in the processes – enabling businesses to understand just how efficiently they run.

Another important solution is Business or Complex Event Processing (BEP or CEP). This enables business users to model the detection of, and reaction to, patterns indicating business opportunities and threats in real-time. Examples could be an opportunity to up-sell to a customer on the website right now (an opportunity) or a risk measure exceeding a key level (a threat).
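As a toy illustration of that detect-and-react idea (a sketch in Python, not Apama EPL or any Progress API), consider a watcher that reacts the moment a monitored value in an event stream crosses a key level:

    from typing import Callable, Iterable

    def watch_threshold(events: Iterable[dict], key: str, limit: float,
                        on_breach: Callable[[dict], None]) -> None:
        """Fire on_breach the moment an event carries `key` above `limit`;
        re-arm only once the value falls back under the limit."""
        armed = True
        for event in events:
            value = event.get(key)
            if value is None:
                continue
            if armed and value > limit:
                on_breach(event)      # react in-stream, as it happens
                armed = False
            elif value <= limit:
                armed = True

    # Example: alert the first time exposure exceeds 1,000,000.
    stream = [{"exposure": 800_000.0}, {"exposure": 1_200_000.0}]
    watch_threshold(stream, "exposure", 1_000_000.0,
                    lambda e: print("THREAT:", e))

A real event processing engine layers temporal windows, stream correlation and a declarative pattern language on top of this basic sense-and-respond loop.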

And then of course there’s Business Process Management (BPM). This enables business users to model and execute a business process flow. BPM is also widely used for Business Process Improvement (BPI) – the re-engineering of (parts of) existing processes to improve their effectiveness.

The really cool thing we realized in talking with our customers is what happens when you use BTA, BEP/CEP and BPM together. Suddenly businesses are empowered to discover how effectively they run, to detect opportunities and threats dynamically and to invoke business processes in response. The business becomes dynamic and responsive. Business users can take control and model the behavior they want their business to exhibit under certain circumstances, and through dashboards they can track the effectiveness of the business. Over time, the areas of the business processes that should be improved can also be detected.

Progress already has leading products in BTA and BEP/CEP with Actional and Apama. Progress chose Savvion to complete the story for a number of reasons. Savvion has a history of innovation and is a leading pure-play BPM provider. But Savvion also has a very rich platform, which includes not just BPM modeling and execution, but also an event engine, a business rules engine, a document management system and an analytics engine. The fact that Savvion enables business processes that respond to events means it immediately works well with Actional and Apama. And with high performance, scalability and availability, Savvion fits perfectly into Progress – where we pride ourselves that all of our products exhibit these characteristics.

In summary, Progress is now a best-of-breed BPM vendor – and not just at the departmental level – but at the enterprise level. But we’re also more than that. Our goal is to enable operational responsiveness and ensure our customers gain competitive advantage through the power of responsive, dynamic and predictive business processes.

10 Reasons Why Progress Chose Savvion

Posted by John Bates

Today Progress announced the acquisition of Savvion http://web.progress.com/inthenews/progress-software-co-01112010.html

The reason that Progress chose to enter the BPM market is clear. Businesses are increasingly turning to BPM to implement and improve their business processes. Why? Firstly because no other solution can help enterprises achieve real-time visibility, agility, efficiency and business empowerment the way BPM does. Secondly BPM enables this to be achieved with low Total Cost of Ownership (TCO) and ease of use.

But why did Progress choose Savvion? Here are 10 reasons to start off with…

  1. Savvion is a trailblazer and industry leader – Savvion is a pioneer in BPM but is also still at the cutting edge. We wanted the best BPM thinkers at Progress. 
  2. Savvion has been proven to work at the enterprise level – Some BPM systems only work at the departmental level, but Savvion works at either the departmental or the enterprise level.
  3. Savvion offers System-centric and Human-centric BPM – Savvion can orchestrate processes but can also involve human users in workflow.
  4. Savvion is event-enabled – so business processes can respond to events. Progress has a lot of momentum behind event-driven business systems through our Actional and Apama solutions – and Savvion will work seamlessly in event-driven business solutions.
  5. Savvion offers vertical industry solutions – Analogous to Progress’ Solution Accelerators, Savvion offers out-of-the-box vertical solutions in industries including Financial Services and Telecommunications.
  6. Savvion offers an integrated Business Rules Management System – Expressing logic in terms of rules can often be very important. Savvion has developed a rules engine, integrated with its BPM system, enabling decision-oriented BPM – modifying the process flow based on rule conditions. This is a powerful capability.
  7. Savvion offers an integrated Analytics Engine – Business Intelligence has proved its worth, but it is a “rear view mirror” technology – analyzing facts that have already happened. Savvion’s analytics engine enables continuous analytics to augment business processes and human users with advanced real-time analytics, enabling better decision-making.
  8. Savvion offers an integrated Document Management System (DMS) – Savvion’s integrated DMS enables rich document handling and empowers document-centric BPM.
  9. The Savvion BPM suite is highly scalable, high performance and highly available – At Progress we pride ourselves on the strength of our underlying technology. We want to offer our customers a complete solution that embodies scalability, performance and availability. Thus selecting a BPM vendor in keeping with this philosophy was key – and Savvion is just such a vendor.
  10. Savvion is a great cultural fit with Progress – An often-overlooked point is that cultural fit is key to acquisition and integration success. The Savvion team pride themselves on being innovative, customer-focused and fun - just like the Progress team. We’re looking forward to working together. 

Tuesday, December 22, 2009

My Baby Has Grown Up

Posted by John Bates

I was proud to recently be appointed CTO and head of Corporate Development here at Progress Software http://web.progress.com/en/inthenews/progress-software-ap-12102009.html. But I don’t want anyone to take that as an indication that I won’t still be involved with event processing – au contraire. Event processing (whether you call it CEP or BEP) is now a critical part of enterprise software systems – I couldn’t avoid it if I tried!!

But taking a broader role does give me cause to reflect upon the last few years and look back at the growth of event processing and the Progress Apama business. Here are some observations:

  • It’s incredibly rare to have the pioneer in a space also be the leader when the space matures. I’m really proud that Progress Apama achieved that. Our former CEO Joe Alsop has a saying that “you don’t want to be a pioneer; they’re the ones with the arrows in their backs!” Usually he’s right on that one – but in the case of Progress Apama, the first is still the best! Independent analysts, including Forrester and IDC, all agree on it. Our customers agree on it too.
  • It’s tough at the top! I had no idea that when you are the leader in a space, many other firms’ technology and marketing strategies are based completely around you. I have met ex-employees of major software companies that have told me that there are Apama screenshots posted on the walls of their ex firms’ development centers – the goal being to try to replicate them or even improve on them. Other firms’ marketing has often been based on trying to criticize Apama and say why they are better – so their company name gets picked up by search engines when people search for Apama.
  • Event processing has matured and evolved. Yes it is certainly used to power the world’s trading systems. But it’s also used to intelligently track and respond to millions of moving objects, like trucks, ships, planes, packages and people. It’s used to detect fraud in casinos and insider trading. It’s used to detect revenue leakage in telecommunications and continually respond to opportunities and threats in supply chain, logistics, power generation and manufacturing. It enables firms to optimize their businesses for what’s happening now and is about to happen – instead of running solely in the rear view mirror.
  • Despite all the new application areas, Capital Markets remains a very important area for event processing. Critical trading operations in London, New York and around the world are architected on event processing platforms. The world’s economy is continually becoming more real-time, needs to support rapid change and now needs to support real-time views of risk and compliance. We recognize the importance of Capital Markets. My congratulations to Richard Bentley, who takes on the mantle of General Manager of Capital Markets to carry on Progress Apama’s industry-leading work in this space. With his deep knowledge and experience with both Apama and Capital Markets, Richard is uniquely placed to carry on the solutions-oriented focus that has been the foundation of Progress Apama’s success.
  • Even in a terrible economy, the value of event processing has been proven – to manage costs, prevent revenue leakage and increase revenue. Progress announced our fourth quarter results today http://web.progress.com/en/inthenews/progress-software-an-12222009.html which saw a double-digit increase for Apama and a triple-digit increase for Actional. Apama and Actional are used, increasingly together, to gain visibility of business processes without modifying applications, to turn business process activity into events and to respond to opportunities and threats represented by event patterns – enabling the dynamic optimization of business performance.
  • But one thing I do believe: that soon there will be no such thing as a pure-play CEP vendor. CEP is part of something bigger. We’ve achieved the first mission, which is to raise the profile of event processing as a new technique that can solve hitherto unsolvable problems. Now the follow on mission is to ensure event processing finds its way into every solution and business empowerment platform. It is one of a set of key technologies that together will change the world.

I wish everyone Happy Holidays and a successful and profitable 2010 !!!

Monday, December 14, 2009

Predictions for 2010

Posted by Giles Nelson

Last week we published some predictions for how capital markets will evolve in 2010. I’d like to say a bit more about them.

Firstly, we predict there will be a big uptake in the use of technology for regulatory compliance and enforcement. Whilst the turmoil of the last 18 months was primarily caused by the over-the-counter credit derivatives market, the effects of increased regulatory scrutiny are being felt throughout the trading industry. The power and authority of regulators has been bolstered, exchanges and alternative trading venues understand there is a greater need to monitor trading activity, and brokers and buy-side firms want to monitor and control their own and their clients’ trading activities better. There has been a significant debate in the media in the last few months on the merits of high frequency trading and its variants. This started in the specialist trade media, then made it to mainstream news outlets such as the BBC, and it has been a topic deemed sufficiently important to be discussed by members of the US Congress and the British government. This has resulted in pressure on market participants to really up their game as far as trade monitoring is concerned.

So how will technology be used to better enforce regulation and control over trading activity? Let’s start with the liquidity venues – the exchanges, the MTFs, the ECNs and the dark pools. Regulated exchanges in advanced markets generally do real-time monitoring of trading activity already, to spot patterns of market-abusive behaviour. They will need to continue to invest to ensure that they have technology that can scale and is flexible enough to evolve with changing patterns of trading behaviour. In contrast, exchanges in emerging markets often do not have adequate monitoring systems. This will change – we’re seeing substantial interest in Apama for exchange surveillance in less developed markets. At the other end of the liquidity spectrum, the regulation around dark pools will change. It is likely that there will be limits imposed on the proportion of stock that can be traded through dark pools, and operators will need to disclose more information. Furthermore, regulators will insist that dark pool operators prove they have adequate monitoring systems in place. It won’t be just a paper exercise – they’ll have to prove it. Brokers will be in a similar position. Each participant in the trading cycle in the UK, for example, has a responsibility to ensure that the market is working fairly. The UK regulator, the FSA, is putting pressure on brokers to show that they have proper trade monitoring technology in place so customer and internal flow can be understood better.
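As a hypothetical illustration of the kind of real-time monitoring described above (a sketch only, not the FSA’s actual rules or Apama’s implementation), here is a simple streaming check for potential wash trades, where the same participant ends up on both sides of an execution:

    from typing import Callable, Iterable

    def flag_wash_trades(executions: Iterable[dict],
                         on_alert: Callable[[dict], None]) -> None:
        """Scan executions as they arrive and raise an alert whenever
        buyer and seller resolve to the same participant."""
        for execution in executions:
            if execution["buyer"] == execution["seller"]:
                on_alert(execution)   # surface for investigation

    # Example: the second execution is suspicious.
    trades = [
        {"symbol": "VOD.L", "buyer": "ACCT-1", "seller": "ACCT-9", "qty": 500},
        {"symbol": "VOD.L", "buyer": "ACCT-7", "seller": "ACCT-7", "qty": 500},
    ]
    flag_wash_trades(trades, lambda ex: print("ALERT: possible wash trade", ex))

Production surveillance correlates across accounts, venues and time windows, but the shape is the same: evaluate each event as it arrives rather than in an end-of-day batch.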

Let’s move on to hosted services. “Cloud computing” is certainly a term du jour. What does it mean for capital markets? The first thing to say is that cloud computing, as in the provision of hosted software and computing resources, is a very familiar concept in capital markets, even though until recently people may not have used the term “cloud computing”. Anyone using a Reuters market data feed, or passing orders over FIX to a broker, or accessing a single-bank foreign exchange portal, is accessing services in the cloud. In fact, electronic trading relies to a great extent upon cloud services. In 2010, however, more hosted services of a richer functional nature are going to become available. Instead of just building blocks – the market data, the DMA access, etc. – more services will become available for algorithmic trading and risk management. Brokers do offer hosted algo services already, but they are broker-specific. An example of a hosted algo service is one we launched with CQG recently. These will mature and broaden in scope. These types of services are invaluable to mid-sized trading organisations who can’t, or don’t want to, build a whole range of systems themselves.

Lastly, our prediction about emerging markets. We’ve seen significant growth this year in demand for Apama in Brazil, India and China. Brazil particularly, because of continued economic growth and market liberalisation, has led the way (for example, Progress now has 15 customers using Apama in Brazil). India and China are getting there. They have further to go in market liberalisation to fuel the demand for algorithmic trading, but to attract extra investment and liquidity to their domestic markets they’ll be left with little choice. Hong Kong is an exception – algorithmic trading is used extensively both by global and regional players, and it provides a window onto developed markets that mainland China can learn from.

Capital markets will evolve quickly in 2010, as in every year. That's what makes it such an interesting area to work in.

Friday, November 20, 2009

Exploration of Apama 4.2 Feature Set Podcast

Posted by Apama Audio

Louis Lovas, Chief Architect of Progress Apama, discusses aspects of the Apama 4.2 release that focus on application developer productivity and how Apama enhances an organization’s ability to build event-driven applications.

Wednesday, November 11, 2009

Putting the Smart in Smart Grid

Posted by Apama Audio

Listen to learn about the critical role of Event Processing in reshaping the delivery of energy and services to your customers.

Tuesday, November 10, 2009

Event Processing in Location-based Services

Posted by David Olson

Business is event-driven. No. Wait. Life is event-driven, and if it weren’t, we’d be walking into walls and every sentence would start with “Oops.” Life would be a string of missed opportunities. We’ve done a masterful job of using technology to transform our business processes into software, but one tenet has been missing: business should imitate life. Sense and respond is what’s been missing.

We recently announced that match2blue (http://web.progress.com/inthenews/match2blue-stands-ou-11092009.html) will be using the event processing capabilities of Apama to provide location-based services in social networking. Sense and respond is crucial for their ability to enable like-minded people to connect in real-time. Traditional data processing technology and its normal rhythm of “capture, store, analyze” can’t, well, keep up. And in a world where latency leads to missed opportunities, match2blue is proving that through the right technology business can imitate life.

Responding to business events as they happen is what will define your competitive advantage.

Business is event-driven, indeed.

Thursday, November 05, 2009

In defence of high frequency trading

Posted by Giles Nelson

The high frequency trading (HFT) debate seems to have entered a new and worrying phase in the UK. On Tuesday this week in an interview with the BBC, Lord Myners, the UK’s financial services minister, warned that high frequency trading had “gone too far” and that share ownership had “now lost its supporting function for the provision of capital to business”. (You can find the original interview here and reports of it in the Financial Times and The Independent yesterday).


Mary Schapiro, head of the SEC, signalled at the end of October that a number of electronic trading areas were going to be looked into – naked access (where a broker sponsors a firm to have direct electronic access to an exchange), dark pools and high frequency trading.


It does seem now that on both sides of the Atlantic, governments and regulators are steeling themselves to act and softening the markets up to accept that electronic trading might have some limits.


The concern is that governments and regulators are going to come down too hard on electronic trading, and that the benefits it gives investors will be damaged.


It all started with the flash order issue in the US a few months ago. Commentators were linking together various different, although related, issues in an inappropriate way. Flash orders sometimes seemed to be viewed as synonymous with HFT, and both were sometimes reported as forms of market abuse. All three topics are quite different. In my opinion, there are legitimate questions over the use of flash orders, and a proposal to ban them is now being considered.


Dark pools, where large blocks of stock are traded off-exchange to minimise market impact, have been the next targets. There are, again, legitimate issues. Dark pools, by their very nature, do not have good price transparency. Regulators have become concerned with their use because more and more trading is going through dark pools. Some estimates put this at between 10% and 30% in Europe and the US. This lack of knowledge about the exact proportion is part of the problem itself: no one really knows what share of trading dark pools are taking. If a significant proportion of the market has no price transparency then this undermines the notion of a fair market for all. Regulators are looking at this, and it’s likely that they will force dark pool operators to disclose far more information about what is being traded than they do currently. The SEC is considering limiting the proportion of a stock that can be traded through dark pools to a small percentage.


These legitimate issues, however, risk skewing the whole HFT debate towards a conclusion that “HFT is bad”.


What people are now describing as HFT – the very fast and frequent, computer-assisted trading of, usually, equities – is an evolution of something that has been happening in the marketplace for at least the last 10 years. In this time electronic trading has proliferated, not just in equities but in all asset classes, such as derivatives, bonds and foreign exchange. Far more venues for trading have been created. There are now many places where a company’s stock can be traded, both in the US and Europe. This has brought competition and choice. Prices have been lowered, improving access for retail investors. Spreads have narrowed. Arbitrage opportunities are harder to find, which means that market information is disseminating faster which, in turn, means that price transparency has improved. Because there is more trading going on, there is more liquidity available, which also means keener prices.


A key part of the HFT trend has been the use of algorithmic trading (the most prevalent use of complex event processing technology). Algo trading models fall broadly into one of two camps: alpha-seeking, where market prices are examined to find a trading opportunity that will make money, and execution, where orders are, usually, split up into smaller parts and then traded automatically in the market in an intelligent way to find good prices and to ensure those prices are not overly influenced by the trades being made themselves. For each type of model it can be very useful to react very quickly to market information, either to take advantage of a price discrepancy or to quickly pick up liquidity at a good price. Algorithmic trading is enormously beneficial for those who use it, and its use is not limited to specialist hedge funds. Most algorithmic trading uses execution models that find liquidity and good prices, help minimise market impact and, lastly, significantly increase a trader’s productivity. Instead of wasting time executing several simple orders in the market over the course of many minutes or hours, the trader can simply ask a machine to do it. The trader can then spend time either covering more of the market (useful in straitened economic times) or spend more time actually delivering real value to a client.
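To illustrate the execution camp, here is a minimal sketch of time-slicing in the spirit of a TWAP algorithm. It is illustrative Python only; submit_order is a hypothetical stand-in for a broker or FIX gateway, not any production interface.

    import time

    def twap_slice(symbol: str, side: str, total_qty: int,
                   duration_s: float, n_slices: int, submit_order) -> None:
        """Split a parent order into n_slices child orders spread evenly
        over duration_s seconds, softening the impact of one large order."""
        child_qty, remainder = divmod(total_qty, n_slices)
        interval = duration_s / n_slices
        for i in range(n_slices):
            qty = child_qty + (1 if i < remainder else 0)  # spread remainder
            submit_order(symbol=symbol, side=side, qty=qty)
            time.sleep(interval)      # wait before sending the next slice

    # Example: buy 10,000 shares over 10 minutes in 20 slices of ~500 each.
    # twap_slice("VOD.L", "BUY", 10_000, duration_s=600, n_slices=20,
    #            submit_order=lambda **order: print("child order:", order))

A live execution algo goes further, adjusting slice sizes to observed volume (as in VWAP) or holding back when spreads widen, which is exactly where the fast reaction to market data described above matters.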


Algorithmic trading and HFT have brought very significant benefits. It is these benefits that must not be threatened.


Trading has always involved cunning and guile, whether human or computer-based. Competition has always existed over who has the best traders and trading systems. Organisations investing in ultra-low-latency infrastructure to ensure orders arrive at an exchange in microseconds (not nanoseconds, as sometimes claimed, by the way – light travels 30cm in 1 nanosecond, which isn’t far enough to be very useful) are part of this competitive world. Competition leads to innovation and it is this innovation that has brought so many of the benefits described above. Computer-based models can sometimes be used abusively. There are many forms of market abuse that regulators and exchange operators look for. Some exchanges and regulators have been investing in real-time surveillance technology (Progress counts Turquoise and the UK Financial Services Authority as customers using Apama) to ensure that they can spot abusive patterns of behaviour quickly.


We can’t start slowing trading down. We can’t go backwards and put the electronic trading genie back in the bottle. We don’t want to lose all the benefits that have come. Rather, regulators and exchanges should concentrate on ensuring maximum transparency in how markets operate and ensure that those attempting to maliciously abuse the markets are dissuaded or caught.