The Gateway to Algorithmic and Automated Trading

Event processing in the world of electronic and algorithmic trading

Published in Automated Trader Magazine Issue 05 April 2007

Event processing is a set of concepts and accompanying technologies that have been building momentum in the world of finance over the past few years. These ideas and technologies are rapidly changing the way in which automated trading systems and related parts of electronic trading systems are built and run. As with any new concept or technology, a glut of confusing “market speak” often arises from vendors promising to save the world. Chris Donnan, who works in equity derivatives trading technology at a top Wall Street firm, provides a translation.

Chris Donnan

Deciphering this "market speak" isn't always easy, so participants are often left puzzling over a number of (in many cases quite basic) questions:

  • What exactly is event processing?

  • What is the set of problems it seeks to address?

  • Why is it relevant to algorithmic and electronic trading?

  • What is it being used for today in electronic trading systems?

  • Who are the key players in this space?

  • What can we do to capitalise on whatever benefits these concepts and technologies have to offer?

  • How do these ideas and technologies fit in with what we already have?

  • What will future generation automated trading solutions built on this set of technologies look like?

This article sets out to answer them.

So what is it - in English please?

The core concept of event processing is to take real time information and model it as what it really is: streams of events. The idea is to model information as a set of real time event streams and then to build an engine capable of processing those streams and detecting points of interest within them. An event can be just about anything: a tick, an execution, a fill, a unit of time passing, etc.

In itself this idea is hardly novel; these days any market data feed is already a stream of real time events. The "processing" part of event processing is where the power lies. Once the event streams are flowing, you can look at relationships between discrete streams of events, detect patterns across time, and join or correlate information between streams.

That processing capability is why these solutions are spreading everywhere. Once data from multiple sources can be normalised and synchronised into a homogeneous stream format, powerful and interesting analysis becomes possible. A processor capable of thoroughly examining these event streams should also be capable of highlighting particular combinations of events within them that could form part of an algorithmic or automated trading strategy.
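
To make this concrete, here is a minimal sketch in Python of the idea: information modelled as a stream of events that a processor scans for points of interest. The event shape and field names are illustrative assumptions, not any vendor's API.

    import time
    from dataclasses import dataclass, field

    @dataclass
    class Event:
        """A generic event: a tick, an execution, a fill, a timer, etc."""
        kind: str      # e.g. "tick", "fill", "timer"
        symbol: str    # instrument the event relates to
        payload: dict  # kind-specific fields (price, quantity, ...)
        ts: float = field(default_factory=time.time)

    def process(stream):
        """Scan a stream of events and flag a simple point of interest."""
        for event in stream:
            if event.kind == "tick" and event.payload.get("price", 0.0) > 100.0:
                print(f"{event.symbol}: traded above 100 at {event.ts:.0f}")

    # A toy stream; in practice these arrive in real time from feed handlers.
    process([
        Event("tick", "IBM", {"price": 99.5}),
        Event("fill", "IBM", {"quantity": 200, "price": 99.5}),
        Event("tick", "IBM", {"price": 101.2}),
    ])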

"... You are able to look at discrete relationships between streams of events."

Complex Events

Two terms that are often used interchangeably to describe event processing are event stream processing and complex event processing. Several articles draw the distinction between the two (a good example is at http://complexevents.com/?p=103), but for the purposes of this article I shall treat them as synonymous. However, the idea of a "complex event" is important for seeing how event processing is useful. It is when you have multiple streams of events and you wish to find significant relationships across them that the power of the concept becomes apparent. A complex event is one derived from some kind of aggregation of simpler events. Events can be primitive in nature - like the above examples of ticks, fills, etc. What the stream processing engines provide is the ability to search for combinations of events such as:

  • "when IBM's price moves up > 3% in < 1 hour"

  • "when the inside bid moves more than .5 points in 1 second on more than 10 instruments"

As an event processing engine can search for complex occurrences across multiple information streams and across time, the combinations of linked events (which may or may not be simultaneous) can involve far more events than the examples above.
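
To illustrate, the sketch below (in plain Python rather than a vendor's query language) implements the first pattern above with a one-hour sliding window of ticks, firing when the latest price is more than 3% above the lowest price seen in the window:

    from collections import deque

    WINDOW_SECS = 3600   # "< 1 hour"
    THRESHOLD = 0.03     # "> 3%"

    window = deque()     # (timestamp, price) pairs within the last hour

    def on_tick(ts, price):
        """Fire a complex event if price rose more than 3% within the hour."""
        window.append((ts, price))
        while window and ts - window[0][0] > WINDOW_SECS:
            window.popleft()          # evict ticks older than one hour
        low = min(p for _, p in window)
        if price > low * (1 + THRESHOLD):
            print(f"complex event: +{price / low - 1:.1%} within one hour")

    # Toy usage: timestamps in seconds, prices in dollars.
    for ts, price in [(0, 100.0), (600, 101.0), (1200, 103.5)]:
        on_tick(ts, price)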

"Continuous Query"

Continuous Queries

The other important aspect of event processing is the Hollywood principle: "Don't call us, we'll call you". With classical database and file storage technology, you need to look up, or poll for, information whenever you want some kind of aggregate or correlated view.

If you wanted "all of the option trades executed with IBM as the underlying security within the past 1 day", or "all of the fills for a particular trade book in the past 10 minutes", you would ask some kind of database - and it would give you the answer. If you want to know again, you ask the question again.

The difference with stream-processing-based solutions is that you do not keep calling and checking up on that information. Instead, you say: "get me this information now - and continue to update me when it changes". So, if you were to ask for "all of the fills on e-mini S&P contracts in some book in the past 10 minutes", the system would return the first set of results - your information for the last 10 minutes - and continue to call you as more information fits that description. This concept is sometimes called a "continuous query".
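
A minimal sketch of the pattern, assuming a hypothetical in-memory fill store rather than any particular product (and leaving out the 10-minute time bound for brevity): the subscriber gets everything that currently matches, then is called back as new matching fills arrive.

    class FillStore:
        """Toy continuous-query engine over a stream of fills."""
        def __init__(self):
            self.fills = []
            self.subscriptions = []   # (predicate, callback) pairs

        def continuous_query(self, predicate, callback):
            # First deliver everything that already matches ...
            for fill in self.fills:
                if predicate(fill):
                    callback(fill)
            # ... then keep calling back as matching fills arrive.
            self.subscriptions.append((predicate, callback))

        def on_fill(self, fill):
            self.fills.append(fill)
            for predicate, callback in self.subscriptions:
                if predicate(fill):
                    callback(fill)

    store = FillStore()
    store.on_fill({"book": "A", "symbol": "ES", "qty": 5})
    # "Get me this information now - and continue to update me":
    store.continuous_query(lambda f: f["book"] == "A",
                           lambda f: print("match:", f))
    store.on_fill({"book": "A", "symbol": "ES", "qty": 2})  # triggers callback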

To recap - the salient points about event processing are:

  • All incoming data is normalised into streams that can be understood by the processor

  • Events can be correlated across streams and across time

  • There is no need to poll - the engine continuously delivers updates for the information you care about.

Why does event processing matter to automated traders?

In the world of automated trading we have a great deal of information coming from many places in real time, often in disparate shapes, colours and sizes, and we are forced to coerce, combine and interpret it in several different ways. The paradigms and technologies of event processing give us a uniform way to take in all this information - and one that represents it intuitively, as it really is: streams of real time data.

The old way was to take streams of information in various formats, shapes and sizes, then look through them and try to find correlated information with brittle rules - typically an ad-hoc, customised, roll-your-own solution. Event processing solutions take these formerly varied approaches and unify them. Take an index arbitrage example. In essence, you are combining real time information from several sources - perhaps the S&P futures prices and data from some or all of the underlying constituents of the index - while also watching your fills, risk levels, and so on. You then apply some formulae and see what is "out of whack". This is the epitome of event processing: looking across several streams of information for a "complex event" - the point in time when the stars align and there is an arbitrage opportunity.

We also need our risk levels to be in an acceptable range, our last orders to be filled, and so on. In this example the "complex event" that means "something is out of whack" (there is an arbitrage opportunity) might fire several times per day or per minute. Other events you may care about are less frequent. Say, for example, you are buying "lottery ticket options" - options that are very unlikely to ever be exercised. That is an exercise in watching for a fringe event built from a potentially complex set of derived events. When one of these events is triggered - the lottery ticket is cash!
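
Under the same caveats (made-up names, drastically simplified arithmetic), the index arbitrage case can be sketched as a correlation across three streams - the futures price, a fair value derived from the underlying constituents, and a risk gate - firing only when all of them line up:

    # Latest values from three conceptual streams, updated by feed handlers.
    state = {"futures_px": None, "fair_value": None, "risk_ok": False}

    EDGE = 0.50   # minimum mispricing, in index points, worth acting on

    def on_update(stream, value):
        """Correlate the streams; fire when everything lines up."""
        state[stream] = value
        fut, fair = state["futures_px"], state["fair_value"]
        if (state["risk_ok"] and fut is not None and fair is not None
                and abs(fut - fair) > EDGE):
            side = ("sell futures / buy basket" if fut > fair
                    else "buy futures / sell basket")
            print(f"out of whack by {fut - fair:+.2f} points: {side}")

    on_update("risk_ok", True)
    on_update("fair_value", 1450.25)
    on_update("futures_px", 1451.00)   # 0.75 points rich -> event fires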

Event processing is a novel way of decomposing the problem of real time data, and nowhere is that problem more clearly manifest than in the world of automated trading. Any automated trading system - especially one designed to look at a large number of assets - needs a standardised way of looking at these streams of information.

Event processing solutions can act as a competitive advantage when your competitors are continually reinventing the wheel with older solutions. It is clear that this set of problems will continue to exist. It is also clear that automated traders have been building bespoke software for dealing with streaming data for years. If the cost of that bespoke software could be reduced, automated traders could deploy the capital elsewhere - perhaps as additional proprietary capital in the markets.

So, event processing is almost tailor-made for automated trading. I do not believe there is any question as to its applicability, only of how fast it is adopted and which implementations are used. It is already highly appropriate, due in large part to the way it maps so closely onto the real domain.

How is it being used today in the world of automated and electronic trading?

While stream processing is a relatively new set of ideas and technologies, it is not brand new, and event processing solutions have already begun to spread across many aspects of electronic trading, in areas such as:

  • Real time automated trading

  • Real time risk management

  • Regulatory monitoring

  • VWAP engines

There are several key players in the marketplace and several well publicised examples of stream processing technology hard at work in the electronic trading world. A few examples from recent news releases:

  • Kaskad has introduced a stream processing solution for Reg NMS surveillance.

  • Wombat and Vhayu deliver real-time VWAP and transaction cost analysis for Bear Stearns.

  • Aleri and Microsoft partner to offer a MiFID best execution solution.

  • Boston Stock Exchange goes live on Kaskad Technology's new Korrelera.


Looking elsewhere, a number of major investment banks are using the technology to create and run algorithmic trading solutions, both for their own proprietary trading and as a service for their buy-side clients.

Firms such as low latency data provider Wombat have begun to partner with event processing providers like Coral8. This sort of initiative means that it is now possible, "out of the box", to aggregate most available market data streams (taken direct from the relevant ticker plants) and feed them into event processing engines. Integration like this allows faster time to market and significantly less spend on bespoke software.

"Event processing solutions can act as a competitive advantage when your competitors are continually rebuilding the wheel with older solutions."

What can we do to capitalise upon event processing?

One of the nice things about event processing technologies is that they build upon many established parts of the automated trading infrastructure in new ways. As a result, we can retire redundant elements of that infrastructure rather than having to rebuild it all.

The fact of the matter is that you already have many, many streams of data, along with the various channels over which they are delivered - whether created in house or supplied as delivery platforms by infrastructure or market data vendors. These transports are already threaded through the infrastructure, and you already have the hardware, routers and so on to pump real time data throughout your electronic trading environments.

However, the first step in event stream processing is to standardise all of the stream data into a format that the stream processing engine can understand and interpret. Once that is done, the 'processing' promise of event processing can come into play.

Standardising stream data

Stream data comes in countless flavours. You have feeds from Reuters, Comstock, various direct exchange feeds, other aggregate providers, ECNs, block trade services - the list seems endless. You have news, market data, last prices, order books, internal executions, etc. The less homogeneous this information is, the harder it is to integrate. The ability to correlate information in real time across all of these streams can be a huge benefit, so standardising the actual data into a meaningful shape is step one. Event processing solutions will help in direct proportion to how much information they can actually correlate and aggregate; the sooner you can look at information joined from all of these streams, the sooner you can capitalise on events of interest.
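
Step one in practice is a set of adapters mapping each feed's native record into one common event shape. A minimal sketch follows, with invented field names standing in for the real Reuters and Comstock record layouts:

    def normalise_reuters(rec):
        """Map a (hypothetical) Reuters-style record to the common shape."""
        return {"symbol": rec["RIC"], "price": rec["TRDPRC_1"], "source": "reuters"}

    def normalise_comstock(rec):
        """Map a (hypothetical) Comstock-style record to the common shape."""
        return {"symbol": rec["sym"], "price": rec["last"], "source": "comstock"}

    ADAPTERS = {"reuters": normalise_reuters, "comstock": normalise_comstock}

    def to_stream(source, records):
        """Turn any feed into the one homogeneous stream format."""
        for rec in records:
            yield ADAPTERS[source](rec)

    # Two differently-shaped feeds become one joinable stream.
    for event in to_stream("reuters", [{"RIC": "IBM.N", "TRDPRC_1": 101.2}]):
        print(event)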


"…standardise all of the stream data into a
format that the stream processing engine can
understand and interpret."


The different event processing solutions offer different ways to pump real time data into their engines; the hardest part is simply choosing the first data to feed in. Once the data is in there you can start defining the custom events that you want to find - and the engine will tell you when it finds them. What the event processing vendors provide is a way to inspect this information in novel ways. Having the information normalised is all well and good, but we then need to be able to "select" or "query" this real time set of information in order to capitalise on it. Just like a classical database, if you do not put information in, you certainly cannot get it out. Once you start routing your real time streams into the stream processor, you can immediately begin looking at information in new and interesting ways.
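
As one example of those "new and interesting ways", here is a rough sketch of a continuously updated 10-minute VWAP computed over the normalised trade stream - again a toy in plain Python, not a vendor's query language:

    from collections import deque

    WINDOW_SECS = 600    # 10-minute rolling window
    trades = deque()     # (timestamp, price, quantity) within the window

    def on_trade(ts, price, qty):
        """Update and report the rolling VWAP on every trade event."""
        trades.append((ts, price, qty))
        while trades and ts - trades[0][0] > WINDOW_SECS:
            trades.popleft()          # drop trades older than 10 minutes
        notional = sum(p * q for _, p, q in trades)
        volume = sum(q for _, _, q in trades)
        print(f"10-minute VWAP: {notional / volume:.4f}")

    for ts, price, qty in [(0, 100.0, 300), (120, 100.5, 200), (900, 101.0, 100)]:
        on_trade(ts, price, qty)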

Availability and accessibility

Event processing is a new way of looking at real time data. While event processing techniques and technology extend into numerous industries and disciplines outside electronic trading, they are exceptionally well suited to this domain. Event processing solutions can analyse data at extremely high frequencies and volumes and expose information that was formerly very difficult to access.

There is already a stable base of vendor technology in place. This is no longer the bleeding edge but a maturing field. Adopting event processing solutions today is not a tremendous leap; it is relatively easy, and you can start small. It is clear that more and more solutions are being built with these tools and techniques, and that many more trading environments in the future will be built using them and their offspring.

The future

Today, in most cases, an algorithmic trading solution has incoming data reformatted into some common format, but existing tools are less than optimal for analysing real time streams of information. The ability to look back in time and across correlated streams is simply not there today without complex bespoke software or a commercial event processing solution. The more event processors make their way into the electronic trading landscape, the more homogeneous trading systems can begin to look.

With all of this data coming in and being pushed through an event processor, trading bots can attach to the streams, consuming events as well as re-introducing derived events back into the flow of information. An engine that can efficiently and powerfully analyse, monitor and process events has the capability to provide one of those rare turning points in trading system technology, and it is evident that such technology is a natural fit with automated trading systems.
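
That feedback loop might look like the following sketch (the event names are hypothetical): a bot subscribes to a derived event published by the processor, and the order events it emits flow back into the stream, where other consumers - risk monitors, other bots - can correlate them.

    class EventBus:
        """Toy event bus: bots both consume and re-introduce events."""
        def __init__(self):
            self.handlers = {}   # event kind -> list of callbacks

        def subscribe(self, kind, handler):
            self.handlers.setdefault(kind, []).append(handler)

        def publish(self, kind, payload):
            for handler in self.handlers.get(kind, []):
                handler(payload)

    bus = EventBus()

    def arb_bot(signal):
        # Consume a derived event, push an order event back into the stream.
        bus.publish("order", {"symbol": signal["symbol"], "side": "buy", "qty": 10})

    bus.subscribe("mispricing", arb_bot)
    bus.subscribe("order", lambda order: print("risk monitor saw:", order))

    bus.publish("mispricing", {"symbol": "ES", "edge": 0.75})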

Conclusion

So, if you think this may be for you, what next? Research the vendors. Consider taking a new project and prototyping it with an event processing solution. From personal experience with products such as Coral8, I have been astonished at how fast some of them can begin to add value; they already have the user tools and enterprise features, so getting up and running is genuinely easy. It seems apparent that as more and more firms adopt these technologies they will become the competitive norm, and those not using them will be at a disadvantage. Ultimately, event stream processing is a technology worth adopting because, in the arms race of automated trading, every edge counts.
