Business Event Processing

Tuesday, April 27, 2010

Monitoring and surveillance: the route to market transparency

Posted by Giles Nelson

Again this week, capital markets are under the spotlight, with the standoff between the SEC and Goldman. Just a few weeks ago, the FSA and the UK Serious Organised Crime Agency were making multiple arrests for insider trading. Earlier this year Credit Suisse was fined by the New York Stock Exchange after one of its algorithmic trading strategies damaged the market. Meanwhile, electronic trading topics such as dark pools and high-frequency trading are being widely debated. The whole capital markets industry is under scrutiny like never before.

Technology can't solve all these problems, but one thing it can do is help give much more market transparency. We're of the view that, to restore confidence in capital markets, organisations involved in trading need a much more accurate, real-time view of what's going on. That way, issues can be prevented, or at least identified much more quickly. I talked about this recently to the Financial Times, here.

Last week at the TradeTech conference in London, Progress announced the release of the second generation of its Market Monitoring and Surveillance Solution Accelerator. It is aimed at trading organisations that want to monitor trading behaviour, whether to ensure compliance with risk limits, for example, or to spot abusive patterns of trading. Brokers, exchanges and regulators are the most obvious users, but buy-side organisations can also benefit. Previously this solution accelerator used Apama alone. It has now been extended to use our Responsive Process Management (RPM) suite, which includes not only Apama but also Savvion Business Process Management, giving the accelerator powerful alert and case management capabilities. We know that monitoring and surveillance in capital markets is important now, and believe it will become more so, which is exactly why we've invested in building out the product. You can read the take on this from the financial services analyst Adam Honore here, and more from Progress about the accelerator and RPM. A video on the surveillance accelerator is here.

As all this is so topical, and TradeTech is the largest trading event of its kind in Europe (although very equity-focused), we thought we'd conduct some research with the participants. We got exactly 100 responses in one day (which made calculating the percentages a breeze) to a survey asking about attitudes to European regulation, high-frequency and algorithmic trading, and dark pools. Some of the responses relating to market monitoring and surveillance are worth repeating here. 75% of respondents agreed with the premise that creating more transparency with real-time trading monitoring systems was preferable to the introduction of new rules and regulations. 65% believe that European regulators should be sharing equity trading information in real time. And more than half believe that their own organisation would support regulators having open, real-time access to information about the firm's trading activity. To me, that's a pretty strong sign that the industry wants to open up rather than be subjected to draconian new rules.

There will be substantial changes to the European equity trading landscape in the coming year. The European Commission will make post-MiFID regulatory changes, acting on recommendations from the Committee of European Securities Regulators, which is taking industry evidence at the moment. Their mantra, as chanted last week, is "transparency, transparency, transparency". Let's hope that this transparency argument is expressed in opening markets up to more monitoring, rather than in taking the, perhaps politically expedient, route of outlawing certain practices and restricting others.

Tuesday, April 20, 2010

Predictions for increased transparency in Capital Markets

Posted by Giles Nelson

It is my view that one of the most significant causes of the global financial crisis was a lack of transparency in financial markets. Put simply, no one, neither regulators nor market participants, knew the size of certain derivatives markets (such as credit default swaps), who held what positions, or what the consequences of holding those positions could be. If financial reform brings nothing else, it should at least hold banks accountable for the business they conduct, and that means full disclosure and constant monitoring by responsible regulators.

This would help provide the basis for preventing future crises. No matter how inventive financial products become, if regulators have complete and detailed information about financial markets and banks’ activities in them, better assessments of risk can be made. If necessary, banks’ activities can then be reined in through higher capital requirements or similar measures. Simply limiting banks’ ability to conduct certain business is a blunt instrument that does not resolve the lack of transparency and will likely hamper economic growth.

Market transparency takes many forms. Particularly relevant is transparency around electronic trading. I therefore predict that regulators will require banks to implement stronger pre-trade risk mechanisms. Regulators such as the FSA and SEC will ultimately bring in new rules to mitigate, for example, the risk of algorithms ‘going mad’. This is exemplified by Credit Suisse, which was fined $150,000 by the NYSE earlier this year for “failing to adequately supervise development, deployment and operation of proprietary algorithms.”
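As an illustration of what such a pre-trade control might look like, here is a minimal sketch in Python. The limits, order fields and thresholds are invented for the example; they are not taken from any exchange's or regulator's actual rules, and a real system would check far more (position limits, price collars, kill switches, and so on).

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    quantity: int
    price: float

class PreTradeRiskCheck:
    """Reject orders that breach simple, illustrative pre-trade limits."""

    def __init__(self, max_order_value: float, max_quantity: int):
        self.max_order_value = max_order_value  # per-order notional cap
        self.max_quantity = max_quantity        # per-order size cap
        self.exposure = 0.0                     # running notional of accepted orders

    def accept(self, order: Order) -> bool:
        value = order.quantity * order.price
        if order.quantity > self.max_quantity:
            return False  # fat-finger / runaway-algorithm guard
        if value > self.max_order_value:
            return False  # single-order notional limit
        self.exposure += value
        return True

check = PreTradeRiskCheck(max_order_value=1_000_000, max_quantity=10_000)
print(check.accept(Order("VOD.L", 500, 150.0)))     # within limits -> True
print(check.accept(Order("VOD.L", 50_000, 150.0)))  # breaches quantity limit -> False
```

The point of the sketch is simply that every order is validated before it reaches the market, so an algorithm "going mad" is stopped at the gateway rather than discovered after the fact.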

Furthermore, volumes traded via high-frequency trading will increase, although at a much slower pace than last year, while the emotive debates about high-frequency trading creating a two-tier, unfair market will die down.

In addition, with regard to mid-market MiFID monitoring, greater responsibility for compliance will be extended from exchanges to the banks themselves. Banks and brokers will soon be mandated to implement more trade monitoring and surveillance technology. There will also be no leeway on dark pools: they will simply have to change, being mandated to show they have adequate surveillance processes and technology in place. They will also have to expose more pricing information to the market and regulators.

This year will see a definite shift to an increasingly transparent – and therefore improved – working environment within capital markets. The ongoing development of market surveillance technologies and changes in attitudes to compliance will drive this forward, creating a more open and fairer marketplace for all.

Monday, March 08, 2010

Rumblings in the Cloud

Posted by Louis Lovas

Cloud computing... it's on everyone's mind these days. Personally, I think the term has attained such aggrandized acclaim that vendors, analysts, bloggers and anyone with marketing muscle have pulled and stretched its definition to such an extent that it could mean just about anything hosted. Cloud Computing Journal polled twenty-one experts to define cloud computing. The fact that they had to ask twenty-one experts is rather telling in itself. Well, I read what the experts had to say.

So, armed with my newly minted (yet fully stretched, though not by me) Cloud definition, I happened upon this commentary about CEP in the Cloud, or the lack thereof. There's a great quote in the article: "I don’t care where a message is coming from and I don’t care where it’s going”. Rightly so; this in a sense defines a key aspect of CEP. Event-based applications should be agnostic to the origin and destination of messages (or of the events into which messages are transformed), beyond a logical or virtual name. However, unlike the author, Colin Clark, I do believe the current crop of vendor products, most notably Progress Apama, maintain this separation of the physical from the virtual.

The reasons behind the lack of CEP-based applications in the Cloud (OK, there's that word again) are found in other factors. To explain my reasoning, I'll start by dividing CEP-based applications into two categories. There are of course many ways to categorize CEP-based applications, but for the sake of this discussion I'll use these two:

CEP-based Application Categories
  1. Those that do things
  2. Those that observe other applications doing things
I'm not sure I could make a simpler, more layman-like description, but needless to say it warrants further explanation (or definition, in keeping with our theme).

CEP-based applications that do things
This category is best explained by example. Typical of event processing applications that do things are those in capital markets, such as algorithmic trading, pricing and market making. These applications perform some business function, often critical in nature, in their own right. Save for connectivity to data sources and destinations, they are the key ingredient, or the only ingredient, of a business process. In the algo world, CEP systems tap into the firehose of market data, and data rates in these markets (equities, futures & options, etc.) are increasing at a dizzying pace. CEP-based trading systems are focused on achieving the lowest latency possible. Investment banks, hedge funds and others in the arms race demand the very best in hardware and software platforms to shave microseconds off each trade. Anything that gets in the (latency) way is quickly shed.

In other verticals, an up-and-coming use of CEP is location-based services: leveraging smart mobile devices (i.e. "don't care where the message is going") to provide promotions and offers.
    • Algo Trading, Pricing, Market Aggregation
    • Location Based Services (providing promotional offers and alerts)
CEP-based applications that observe other applications doing things
Conversely, event-based applications that observe other applications doing things are classified as providing visibility, or greater insight, into some existing business function. These event-based applications overlay business processes in order to improve their effectiveness. As is often the case, critical business applications provide little visibility, or the information is siloed. There is a need to provide a broader operational semantic across a heterogeneous mix of business applications and processes. Here are a few typical examples of event-based visibility applications observing other business systems:
    • Telco Revenue Assurance
    • Click Stream Analysis
    • Fraud Detection
    • Surveillance
Of course, the demarcation line between these two classifications is not clear cut. Providing greater visibility is just a starting point; monitoring for opportunities to take action is just as important, such as kicking off a fraud watch when a suspected wash trade occurs (so in a sense these applications are doing things too).
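As a toy illustration of this surveillance category, the sketch below flags suspected wash trades in a stream of trade events: the same account appearing on both sides of the same symbol within a short time window. The event fields, window length and account names are all invented for the example; a real surveillance engine (Apama's included) would use far richer patterns than this.

```python
from collections import defaultdict

def find_suspected_wash_trades(trades, window=5.0):
    """Flag trades where one account both buys and sells the same symbol
    within `window` seconds -- a deliberately simplified wash-trade heuristic.

    Each trade event is (timestamp, symbol, account, side, qty, price).
    """
    recent = defaultdict(list)  # (symbol, account) -> [(timestamp, side), ...]
    alerts = []
    for ts, symbol, account, side, qty, price in trades:
        key = (symbol, account)
        # keep only events still inside the sliding window
        recent[key] = [(t, s) for t, s in recent[key] if ts - t <= window]
        # an opposite-side trade by the same account in the window is suspicious
        if any(s != side for _, s in recent[key]):
            alerts.append((ts, symbol, account))
        recent[key].append((ts, side))
    return alerts

trades = [
    (0.0, "XYZ", "acct1", "BUY", 100, 10.0),
    (1.0, "XYZ", "acct2", "SELL", 50, 10.1),
    (2.0, "XYZ", "acct1", "SELL", 100, 10.0),  # acct1 now on both sides
]
print(find_suspected_wash_trades(trades))  # [(2.0, 'XYZ', 'acct1')]
```

Note that the detector only observes the trade stream; it does not sit in the order path, which is exactly what distinguishes this category from the "applications that do things" above.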

Wherefore art thou, oh CEP
When considering the Cloud, an important point is dependency: for (observing) CEP to overlay applications and business processes, those applications and processes must themselves exist in the Cloud. I would offer that enterprise businesses have not yet migrated their key business processes to the Cloud on any widespread scale. Why not? What are the barriers? Security, regulatory compliance, disaster recovery, investment costs and limited skill sets are just a few of the challenges mentioned in this ITProPortal article. I suspect these barriers are far-reaching, keeping the pace of Cloud deployment in check to the point where it is not yet strategic for many.
 
One of the key things that makes the Cloud a reality is virtualization; it has clearly revolutionized Platform-as-a-Service (PaaS). But virtualization comes at a cost: there is a latency penalty for the convenience, and however small, for some use cases that cost is too great.

Make no mistake, I am certain the Cloud, with all its twenty-one definitions, is the future of computing. It is an imperative that will knock down the barriers and change the face of the enterprise, and when it reaches critical mass, CEP will be there.

Once again, thanks for reading. You can follow me on Twitter, here.
Louie




Monday, January 11, 2010

Why businesses must evolve their business processes to be highly responsive, dynamic and predictive – or they will cease to be competitive

Posted by John Bates

Today Progress Software announced the acquisition of Savvion http://web.progress.com/inthenews/progress-software-co-01112010.html. I believe this heralds the beginning of a very exciting phase for Progress Software. Now Progress has become a leader in Business Process Management (BPM). But more than that, combined with our other solutions, Progress is now uniquely able to empower businesses to be operationally responsive – through responsive, dynamic and predictive business processes. And this is critical to keep modern businesses competitive.

You might wonder about the journey Progress went through to realize what the market needed. It was all about understanding the emerging needs of our customers and where they needed their businesses to go. The part of my job I enjoy the most is spending time with customers and understanding what pain points they have - with the ultimate goal of working with them to address the pain and making them highly competitive.

Over the last couple of years I have been hearing more and more from customers about the need to be operationally responsive. For example, many customers have expressed their desire to proactively – and often in real-time - address the needs of their customers and respond to the behavior of their competitors. The goals are to win new business, increase customer satisfaction and triumph over their competitors. These findings hold true whether the customer be in banking, insurance, communications, travel, transport, logistics, energy, gaming or many other industries. It could be British Airways ensuring their high value customers are looked after first in the event of a flight delay, or wireless carrier 3Italia pushing real-time offers to their customers based on their profile, activity and location, or maritime logistics provider Royal Dirkzwager dynamically adjusting the course and speed of a container ship to optimize fuel usage, based on weather conditions and port berth availability.

Operational responsiveness is thus about being highly responsive to opportunities and threats – and even anticipating such scenarios. Market research supports what I’ve been hearing, such as the recent survey by Vanson Bourne http://web.progress.com/en/inthenews/companies-stuck-in-o-10062009.html – suggesting Operational Responsiveness has moved from a nice-to-have to a must-have.

There are a number of business-facing solutions that have shown great promise in addressing operational responsiveness. One of those is Business Transaction Assurance (BTA). This enables businesses to discover their business processes and gain visibility into the effectiveness of those processes, even when they are built from a wide variety of heterogeneous technologies and work across legacy applications. BTA discovers business processes non-disruptively, without any modification to existing applications, and monitors them to ensure they run to completion. BTA also discovers bottlenecks and hotspots in the processes, enabling businesses to understand just how efficiently they run.
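As a rough illustration of the completion-monitoring idea, the sketch below correlates observed events by a shared correlation id and flags process instances that have stalled short of a final step. All names, steps and timings are invented for the example; they do not reflect Actional's actual mechanics.

```python
def find_stalled(events, final_step, now, timeout=60.0):
    """Return correlation ids of process instances that look stalled.

    events: iterable of (timestamp, correlation_id, step) observations.
    An instance is stalled if its last observed step is not `final_step`
    and nothing has been seen for it in more than `timeout` seconds.
    """
    last_seen = {}
    for ts, cid, step in events:
        last_seen[cid] = (ts, step)  # later observations overwrite earlier ones
    return [cid for cid, (ts, step) in last_seen.items()
            if step != final_step and now - ts > timeout]

events = [
    (0.0, "order-1", "received"),
    (5.0, "order-1", "shipped"),
    (9.0, "order-1", "invoiced"),   # ran to completion
    (2.0, "order-2", "received"),   # nothing after the first step
]
print(find_stalled(events, final_step="invoiced", now=120.0))  # ['order-2']
```

The key property, as with BTA itself, is that this works purely from observed events: the applications emitting them need no modification.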

Another important solution is Business or Complex Event Processing (BEP or CEP). This enables business users to model the detection of, and reaction to, patterns indicating business opportunities and threats in real time. Examples include an opportunity to up-sell to a customer who is on the web site now (opportunity), or risk exceeding a key level (threat).
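A minimal event-condition-action sketch of this idea follows; each rule pairs a condition over incoming events with a reaction. The event fields, thresholds and customer segments are invented for illustration and bear no relation to any product's actual rule language.

```python
def make_rule(condition, action):
    """A rule fires its action on any event satisfying its condition."""
    return lambda event: action(event) if condition(event) else None

alerts = []

rules = [
    # Threat: a risk figure crosses a key level
    make_rule(lambda e: e["type"] == "risk" and e["exposure"] > 1_000_000,
              lambda e: alerts.append(f"risk breach: {e['desk']}")),
    # Opportunity: a high-value customer is active on the web site right now
    make_rule(lambda e: e["type"] == "pageview" and e["segment"] == "gold",
              lambda e: alerts.append(f"up-sell: {e['customer']}")),
]

events = [
    {"type": "risk", "desk": "fx", "exposure": 1_200_000},
    {"type": "pageview", "customer": "c42", "segment": "gold"},
    {"type": "risk", "desk": "rates", "exposure": 400_000},
]
for event in events:
    for rule in rules:
        rule(event)

print(alerts)  # ['risk breach: fx', 'up-sell: c42']
```

The detection happens as events arrive, not in a batch report afterwards, which is the essential difference between event processing and rear-view-mirror analytics.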

And then of course there’s Business Process Management (BPM). This enables business users to model and execute a business process flow. BPM is also widely used for Business Process Improvement (BPI) – the re-engineering of (parts of) existing processes to improve their effectiveness.

The really cool thing we realized in talking with our customers is what happens when you use BTA, BEP/CEP and BPM together. Suddenly businesses are empowered to discover how effectively they run, to detect opportunities and threats dynamically, and to invoke business processes in response. The business becomes dynamic and responsive. Business users can take control and model the behavior they want their business to exhibit under certain circumstances, and through dashboards they can track the effectiveness of the business. Over time, the areas of the business processes that should be improved can also be detected.

Progress already has leading products in BTA and BEP/CEP with Actional and Apama. Progress chose Savvion to complete the story for a number of reasons. Savvion has a history of innovation and is a leading pure-play BPM provider. But Savvion also has a very rich platform, which includes not just BPM modeling and execution, but also an event engine, a business rules engine, a document management system and an analytics engine. The fact that Savvion enables business processes that respond to events means it immediately works well with Actional and Apama. And with high performance, scalability and availability, Savvion fits perfectly into Progress – where we pride ourselves that all of our products exhibit these characteristics.

In summary, Progress is now a best-of-breed BPM vendor – and not just at the departmental level – but at the enterprise level. But we’re also more than that. Our goal is to enable operational responsiveness and ensure our customers gain competitive advantage through the power of responsive, dynamic and predictive business processes.

10 Reasons Why Progress Chose Savvion

Posted by John Bates

Today Progress announced the acquisition of Savvion http://web.progress.com/inthenews/progress-software-co-01112010.html

The reason that Progress chose to enter the BPM market is clear. Businesses are increasingly turning to BPM to implement and improve their business processes. Why? Firstly because no other solution can help enterprises achieve real-time visibility, agility, efficiency and business empowerment the way BPM does. Secondly BPM enables this to be achieved with low Total Cost of Ownership (TCO) and ease of use.

But why did Progress choose Savvion? Here are 10 reasons to start off with…

  1. Savvion is a trailblazer and industry leader – Savvion is a pioneer in BPM but is also still at the cutting edge. We wanted the best BPM thinkers at Progress. 
  2. Savvion has been proven to work at the enterprise level – Some BPM systems work only at the departmental level, but Savvion works at either the departmental or the enterprise level.
  3. Savvion offers System-centric and Human-centric BPM – Savvion can orchestrate processes but can also involve human users in workflow.
  4. Savvion is event-enabled – so business processes can respond to events. Progress has a lot of momentum behind event-driven business systems through our Actional and Apama solutions – and Savvion will work seamlessly in event-driven business solutions.
  5. Savvion offers vertical industry solutions – Analogous to Progress’ Solution Accelerators, Savvion offers out-of-the-box vertical solutions in industries including Financial Services and Telecommunications.
  6. Savvion offers an integrated Business Rules Management System – Expressing logic in terms of rules can often be very important. Savvion has developed a rules engine, integrated with its BPM system, enabling decision-oriented BPM – modifying the process flow based on rule conditions. This is a powerful capability.
  7. Savvion offers an integrated Analytics Engine – Business Intelligence has proved its worth, but it is a “rear view mirror” technology – analyzing facts that have already happened. Savvion’s analytics engine enables continuous analytics to augment business processes and human users with advanced real-time analytics, enabling better decision-making.
  8. Savvion offers an integrated Document Management System (DMS) – Savvion’s integrated DMS enables rich document handling and empowers document-centric BPM.
  9. The Savvion BPM suite is highly scalable, high performance and highly available – At Progress we pride ourselves on the strength of our underlying technology. We want to offer our customers a complete solution that embodies scalability, performance and availability. Thus selecting a BPM vendor in keeping with this philosophy was key – and Savvion is just such a vendor.
  10. Savvion is a great cultural fit with Progress – An often-overlooked point is that cultural fit is key to acquisition and integration success. The Savvion team pride themselves on being innovative, customer-focused and fun - just like the Progress team. We’re looking forward to working together. 

Tuesday, December 22, 2009

My Baby Has Grown Up

Posted by John Bates

I was proud to recently be appointed CTO and head of Corporate Development here at Progress Software http://web.progress.com/en/inthenews/progress-software-ap-12102009.html. But I don’t want anyone to take that as an indication that I won’t still be involved with event processing – au contraire. Event processing (whether you call it CEP or BEP) is now a critical part of enterprise software systems – I couldn’t avoid it if I tried!

But taking a broader role does give me cause to reflect upon the last few years and look back at the growth of event processing and the Progress Apama business. Here are some observations:

  • It’s incredibly rare to have the pioneer in a space also be the leader when the space matures. I’m really proud that Progress Apama achieved that. Our former CEO Joe Alsop has a saying that “you don’t want to be a pioneer; they’re the ones with the arrows in their backs!” Usually he’s right on that one – but in the case of Progress Apama, the first is still the best! Independent analysts, including Forrester and IDC, all agree on it. Our customers agree on it too.
  • It’s tough at the top! I had no idea that when you are the leader in a space, many other firms’ technology and marketing strategies are based completely around you. I have met ex-employees of major software companies that have told me that there are Apama screenshots posted on the walls of their ex firms’ development centers – the goal being to try to replicate them or even improve on them. Other firms’ marketing has often been based on trying to criticize Apama and say why they are better – so their company name gets picked up by search engines when people search for Apama.
  • Event processing has matured and evolved. Yes it is certainly used to power the world’s trading systems. But it’s also used to intelligently track and respond to millions of moving objects, like trucks, ships, planes, packages and people. It’s used to detect fraud in casinos and insider trading. It’s used to detect revenue leakage in telecommunications and continually respond to opportunities and threats in supply chain, logistics, power generation and manufacturing. It enables firms to optimize their businesses for what’s happening now and is about to happen – instead of running solely in the rear view mirror.
  • Despite all the new application areas, Capital Markets remains a very important area for event processing. Critical trading operations in London, New York and around the world are architected on event processing platforms. The world’s economy is continually becoming more real-time, needs to support rapid change and now needs to support real-time views of risk and compliance. We recognize the importance of Capital Markets. My congratulations to Richard Bentley, who takes on the mantle of General Manager of Capital Markets to carry on Progress Apama’s industry-leading work in this space. With his deep knowledge and experience of both Apama and Capital Markets, Richard is uniquely placed to carry on the solutions-oriented focus that has been the foundation of Progress Apama’s success.
  • Even in a terrible economy, the value of event processing has been proven – to manage costs, prevent revenue leakage and increase revenue. Progress announced our fourth quarter results today http://web.progress.com/en/inthenews/progress-software-an-12222009.html, which saw a double-digit increase for Apama and a triple-digit increase for Actional. Apama and Actional are used, increasingly together, to gain visibility of business processes without modifying applications, to turn business process activity into events, and to respond to opportunities and threats represented by event patterns – enabling the dynamic optimization of business performance.
  • But one thing I do believe: that soon there will be no such thing as a pure-play CEP vendor. CEP is part of something bigger. We’ve achieved the first mission, which is to raise the profile of event processing as a new technique that can solve hitherto unsolvable problems. Now the follow on mission is to ensure event processing finds its way into every solution and business empowerment platform. It is one of a set of key technologies that together will change the world.

I wish everyone Happy Holidays and a successful and profitable 2010!