Education / Events

Sunday, June 24, 2007

Apama SIFMA - II

Posted by Chris Martins

As promised in the last post, the enclosed picture shows the final mural, completed on the Thursday of SIFMA. All in all, it's a nice artistic tribute to what's been happening with Apama. And given the amount of interest spawned by our SIFMA presence, we'll be needing a bigger canvas next year.

Thursday, June 21, 2007

Apama at SIFMA

Posted by Chris Martins

This week has proven to be a big one for Progress Apama at the SIFMA (formerly SIA) show in NYC. As of the end of Wednesday, with one day remaining, we can already categorize the event as a big success, with lots of visitors to the Apama exhibit at booth #2117 in the "Grand Ballroom" of the New York Hilton. At times there's been little room to maneuver, as people have come to hear about and see demonstrations of our recent announcements, including market surveillance (FSA), the integration of news into trading strategies (the Dow Jones Elementized News Feed), and the Apama FX Market Aggregation Accelerator. With a number of other demonstrations showcasing the application of Apama's CEP platform to algorithmic trading, it has been a challenge to accommodate them all, while we also highlight the capabilities of Sonic, a fellow Progress product line, and of our partners ULLINK and Microsoft, who are demonstrating with us.


One of the areas we sought to illustrate at this year's event is the ever-expanding Apama ecosystem of customers and partners. We've done this via a mural being created live by a local NY artist during the show. What began as a blank wall is evolving into a colorful and eye-catching array of logos from many of Apama's customers and partners. The mural is still a work-in-progress, but we thought we'd provide an interim view, with the finished version to follow once it's complete.

While visually interesting, the mural has also proven enlightening, as people see the number of organizations that are using Apama, or partnering with us, to deliver sophisticated algorithmic trading solutions.




Wednesday, May 16, 2007

Complex Event Processing at CERN

Posted by Giles Nelson

This week I visited CERN in Switzerland, the European Organisation for Nuclear Research, which is a customer of Progress. It was an astonishing and inspiring visit. CERN is in the final stages of building the Large Hadron Collider (LHC), which is due to go into production late this year. The LHC consists of a 27km loop in which protons will be accelerated and collided at unprecedented power levels to give us new insights into the building blocks of matter. In particular, the search is on for the Higgs boson, originally predicted in a paper dating from the 1960s. Finding it would fill a gap in the Standard Model of elementary particles and forces, and would help in furthering a "theory of everything". A particular highlight was going down nearly 100m underground to look at the ATLAS experiment - a truly massive particle detector. It consists of a number of different elements which detect different types of particles - muons, gluons and many others. The huge magnets which form part of the detector are cooled with liquid helium down to -269 degrees C to make them superconducting (and therefore more powerful). Viewing all this brought home what a remarkable engineering effort it all is.

Anyway, what has all this got to do with events? Well, through a number of presentations that CERN staff were kind enough to give us throughout the day, it became apparent that their whole world revolves around events and the processing of them. The term "event" is one they used often, to describe the information gathered by the detectors which sit around the collider. Every time a set of protons collides, sets of events are created which need to be analysed and filtered to determine which are of real interest. For example, there are two ways in which a proton collision can produce two Z particles. One is predicted to involve a Higgs boson, so the set of events to look for is something like "proton collision followed by a Higgs boson followed by two Z particles". To identify such sets of temporally correlated events, the raw events are propagated up through three levels of filter before finally being sent to a central computing resource for further research and analysis. Up to 40 million collisions per second take place. These are first analysed in FPGA hardware, reducing the 40 million collisions to a few thousand of interest. These are further filtered in software to finally produce a few hundred, which are then sent to other computing systems for further analysis.
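The multi-level filtering idea can be sketched in a few lines of code. This is a purely illustrative toy (the event fields, thresholds and counts are invented, and it bears no relation to CERN's actual trigger implementations): a large stream of raw events passes through an ordered cascade of filters, each level applying a stricter predicate, so that only a small fraction survives to the final stage.

```python
def run_cascade(events, levels):
    """Pass events through an ordered list of filter predicates,
    recording how many survive each level."""
    surviving = list(events)
    counts = [len(surviving)]
    for keep in levels:
        surviving = [e for e in surviving if keep(e)]
        counts.append(len(surviving))
    return surviving, counts

# Toy events: each collision carries an energy and a particle signature.
raw = [{"energy": e % 997, "signature": ("ZZ" if e % 50 == 0 else "other")}
       for e in range(10_000)]

levels = [
    lambda e: e["energy"] > 500,       # level 1: coarse, hardware-style cut
    lambda e: e["signature"] == "ZZ",  # level 2: software pattern match
    lambda e: e["energy"] > 900,       # level 3: final, strictest selection
]

final, counts = run_cascade(raw, levels)
print(counts)  # event counts after each stage, shrinking level by level
```

The point of the cascade is that the cheap, coarse filters run on the full event rate, and only the survivors pay for the more expensive analysis downstream.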

It's not only collider events that CERN needs to handle. CERN also has a newly built central control centre, part of which is used to monitor CERN's technical infrastructure. About 35,000 separate sensors exist to monitor everything from fire, to electricity substations, to coolant plants. All these sensors currently produce about 1.6M events per day, all of which have to be propagated to a central point for analysis. In turn, these 1.6M are reduced to 600K events which are reviewed by human operators. Most are inconsequential (for example, the 18KeV power supply is still producing 18KeV) but some will require attention. By analysing these appropriately, CERN can ensure that the colliders are running as smoothly and as safely as possible. With billions of euros invested so far in the LHC, keeping the collider up and running as continually as possible is a top priority.
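The reduction from raw sensor readings to operator-worthy events is, at its simplest, a tolerance check. The sketch below is hypothetical (sensor names, nominal values and tolerances are invented, not CERN's): readings that stay near their nominal value are suppressed, and only out-of-tolerance readings are escalated.

```python
def escalate(readings, nominal, tolerance):
    """Yield only the readings that deviate from the nominal
    value by more than the allowed tolerance."""
    for sensor_id, value in readings:
        if abs(value - nominal) > tolerance:
            yield sensor_id, value

# Toy readings from four power-supply sensors (units arbitrary).
readings = [("psu-1", 18.0), ("psu-2", 18.01), ("psu-3", 17.2), ("psu-4", 18.9)]

alerts = list(escalate(readings, nominal=18.0, tolerance=0.5))
print(alerts)  # only the out-of-tolerance sensors reach the operators
```

Real monitoring systems layer deduplication, rate limiting and correlation on top of this, but the basic shape - discard the "still nominal" events, forward the deviations - is the same.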

The visit proved a fascinating insight into the world of particle physics and the data processing challenges it produces. It really showed event processing at its most extreme.

Sunday, April 01, 2007

Sentient CEP and the Rights of Algorithms

Posted by John Bates

I have just returned from the UK where, as part of my duties for the week, I spoke at a conference on Fixed Income and another on Foreign Exchange. At both these events, the delegates were interested to hear of the latest trends in the industry – which include aggregating multiple market data streams from disparate trading venues into a single view, and using rules-based trading techniques to rapidly detect complex patterns and make and execute trading decisions. In these areas and beyond, Complex Event Processing is being successfully used by investment banks and hedge funds to meet such complex, low-latency requirements.

While I was in the UK, one of the items in the news was Britain's marking of the 200th anniversary of the abolition of the slave trade, with a service in Westminster Abbey attended by the Queen. I hope none of my comments seem in any way to belittle this significant occasion, but it did rekindle some thoughts about "will algorithms ever have rights?". We look back on the past and find it inconceivable that any class of people could be considered less human than others. Society has adopted that principle, quite rightly, as part of our programming. However, we're quite happy to turn a blind eye to the suffering of animals in unnecessary cosmetic testing and in horrific factory farm conditions. In the same way that most people in the 18th Century were quite happy to receive cheap cotton and sugar, now we are quite happy to receive cheap cosmetics and food. History suggests, however, that this will change eventually.

So what of the rights of the algorithm? Now you may think this is totally mad – and you'd probably be right. However, consider for a moment the field of algorithmic trading. While speaking at both of the conferences this week, I illustrated the changing role of the trader. Rather than watching the markets for key trading opportunities and then manually executing, he/she can now deploy an army of algorithms to look for and respond to the key opportunities. The algorithms are the trader's slaves. As Apama customer Lou Morgan of HG Trading put it recently: "… They don't need a lunch-break and they don't need a bonus…!". Of course these algorithms are not sentient, and therefore they don't have any rights – but what if they were!?

Together with my colleague and mentor at Cambridge, Professor Andy Hopper, I ran a research group that looked into what we termed "Sentient Computing". This is a very grandiose title for what Gartner might describe in commercial terms as the "Enterprise Nervous System". It was all about how complex asynchronous stimuli in a wide-area computing environment could be detected and used to automate responses. There was lots of fun hardware in this environment, like "Active Badges" – which could detect your location to within 2 centimeters in 3 dimensions, "Active Surfaces" – which provided a high-bandwidth network to your wearable computers when in contact with the surface, and a variety of other ubiquitously deployed sensors, actuators, compute terminals and devices.

But the whole glue that made Sentient Computing possible was Complex Event Processing and Event-Driven Architectures – as they are called today. Sensors generated events – that needed to be analyzed; actuators could be triggered – but needed something to do the triggering. Event-based rules provided the glue to enable complex circumstances to be modeled. For example: "When John Trigg and Chris Martins are within 10 feet of each other and there is a video terminal in their view, then play the message 'good morning' on the nearest video terminal". Some people described this kind of environment as an "Exploded Robot" – because rather than a single object having various sensors and actuators attached to it, the network cloud is the medium to which they are attached – and CEP and EDA are the mechanism through which the "neuron firing" is wired together. Nowadays, we are familiar with how CEP and EDA are enabling all sorts of "exploded robot" applications – such as logistics companies that monitor the wide-area movements of their fleets of trucks and optimize based on the perishability of their goods, weather, traffic conditions and so on.
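The proximity rule quoted above can be sketched as a simple condition-action check. This is a toy illustration, not the research group's actual rule language: the coordinates, terminal names and the 10-foot threshold are invented for the example, and a real system would evaluate the rule continuously as location events arrive.

```python
import math

def evaluate_rule(positions, terminals, person_a, person_b, radius=10.0):
    """If the two people are within `radius` feet of each other,
    return the terminal nearest to their midpoint; otherwise None."""
    pa, pb = positions[person_a], positions[person_b]
    if math.dist(pa, pb) > radius:
        return None  # rule condition not met, no action fires
    midpoint = tuple((x + y) / 2 for x, y in zip(pa, pb))
    return min(terminals, key=lambda t: math.dist(terminals[t], midpoint))

# Toy location events (feet) and candidate video terminals.
positions = {"John Trigg": (0.0, 0.0), "Chris Martins": (6.0, 0.0)}
terminals = {"lobby": (50.0, 0.0), "corridor": (4.0, 3.0)}

target = evaluate_rule(positions, terminals, "John Trigg", "Chris Martins")
if target:
    print(f"play 'good morning' on terminal: {target}")
```

The shape is the essence of an event-based rule: a condition over correlated sensor inputs (two locations, plus terminal positions) and an action routed to the appropriate actuator.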

Although we are at an early stage with Sentient Computing, the technology approaches of CEP and EDA definitely provide a great framework. An event is a nerve impulse from the body. CEP provides the brain – which can be changed with new rules (new thinking?) at any time – rather than being hardwired. The next stage, of course, is rules that are written and refined by the system itself. But the exciting thing is that there doesn't have to be one brain; the powerful thing about events is that they can flow on to another brain, where they can be processed and analyzed in a different way (a different perspective on the information; a different opinion?). And then events resulting from decisions can be routed to actuators/services/humans to ultimately cause things to happen – the resulting nerve impulse. All the while, EDA provides a framework to bring these various nerve inputs, brains and nerve outputs together. Clearly we're not there yet with EDA – and CEP is mainly being used in individual, unconnected systems – but it is coming to offer an exciting "event cloud", enabling various CEP systems to interoperate.

So should algorithms have rights? Well, of course not – and it is April 1st as I'm writing this. But I do ask you to think about what you would have been like if you'd been alive in the 18th Century and enjoyed cheap tobacco, sugar and cotton. Would you have "thought differently"? I remember an episode of "Star Trek – The Next Generation" in which a scientist wanted to take Mr Data apart to see how he worked. Captain Picard successfully proved to Star Fleet that Mr Data was sentient and thus an individual, rather than property. I don't think you necessarily have to be sentient to have rights – there's no reason to pull the wings off a butterfly just because it isn't familiar with Descartes. I used to comment to one of my friends – a Maths Professor – that his computer programs were so badly written that they were like strange mutated creatures that probably suffered. Of course this is very silly!

Anyway, I leave you with the thought that perhaps Event-Driven Architectures offer us the ability to have a truly global "information nervous system". We are asynchronous; the world is asynchronous – so computing paradigms should support this. Of course, realistically this is just a set of inputs, rules and outputs. However, as we add more intelligence to the "brains", we must be careful that this system doesn't become too autonomous. After all – you know what happened with "Skynet" in Terminator!

Thursday, February 22, 2007

TIBCO and Apama featured at GMAC in Japan

Posted by Progress Apama

This news was forwarded to me from our good friend Tim Bass at TIBCO, and reflects continued global interest in CEP, this time in Japan. Tim will be speaking at GMAC in Japan next week – the GMAC Japan International Banking & Securities System Forum. It's wonderful to see Tim featured here, talking about customer interaction management and fraud detection. If you're in Japan, definitely check him out; he is a great spokesperson for the industry in general. That said, you might be torn because, as you can see, the Apama session about complex event processing (CEP) in algorithmic trading is running in a parallel track – and there are only two! Whichever session you attend, go learn more about event processing!

Tuesday, February 13, 2007

BAM presentation at BPMS Congress

Posted by Giles Nelson

I'll be delivering a presentation in Madrid in the coming week at the BPMS Congress on how event processing can underpin BAM applications. We've got some good interest in BAM in Spain at the moment so this is an opportune time to spread the word.

This conference will cover a whole range of issues in the broad integration and business process management space – BAM being one of them. One important thing that I'll be emphasising is that a BAM project doesn't need to come after a long and tortuous SOA adoption process. Some people appear to believe that BAM needs to be built upon process orchestration, BPM tooling and the like. If you need the real-time operational visibility that BAM can provide, you can do that now, pretty much whatever your underlying infrastructure. As long as you can source the information you're interested in from somewhere, then there's potential.

I'll be posting again once the conference is over.

Monday, February 12, 2007

Webinar: CEP - The Brains Behind BAM

Posted by Progress Apama

The Apama group will deliver a webinar series called "CEP - The Brains Behind BAM," beginning with the first session called "Putting Intelligence into Business Activity Monitoring" on February 27th (U.S. time zones) and March 1st (Europe time zones).

This session will cover how CEP, when coupled with BAM dashboards, provides an intelligent, action-aware BAM infrastructure for the enterprise, and will explore the architectures and the applications (algorithmic trading, fraud detection, and compliance) of some leading adopters of CEP and BAM.

The presenters will be Dr. Giles Nelson, co-founder of Apama, and Mark Palmer, general manager of Apama.

Register for North American-friendly time zones (Tuesday, February 27th, 11 AM Eastern Standard Time)

Register for European-friendly time zones (Thursday, March 1st, 11:30 AM Europe Standard Time (GMT +01:00, Berlin))

Saturday, February 10, 2007

Richard Bentley moderates panel at SIFMA E Trading Conference

Posted by Progress Apama

Richard Bentley moderated a panel discussion at the SIFMA E Trading conference this week in London, and it was really well received. The event was about fixed income, and we talked about the application of advanced trading techniques to FI. This is an interesting topic, and we have seen an increasing trend of applying CEP to the fixed income market. Look for announcements soon in this area.