Why is it becoming more important that sellside providers educate the buyside to the benefits of algorithmic trading as well as try to overcome their concerns about perceived loss of control?
Banneville: I think market conditions are providing this education anyway, as events have effectively dictated a need for the buyside to make greater use of algorithms. The asset management industry is becoming increasingly competitive - especially with the rise of exchange traded funds that have low tracking errors and very low fees. Therefore asset managers have to compete with those vehicles by increasing their alpha and reducing costs. This drives the need for efficient tools such as algorithms that will allow them to control their execution while still being productive. Algorithms allow asset managers to have comfort that the bulk of their trades are being executed efficiently, thus freeing them to work on larger or more complex trades that require human finesse.
Sheridan: Regarding the perceived loss of control, I think that it is indeed just that - a perception and not a reality. With comprehensive pre-trade analytics, the buyside trader actually has more clarity on exactly how each child order will be executed (timing, cost, risk, etc) and can also then modify orders accordingly. In fact one could argue that traders now have more information on what was previously handled by the "black box" than they ever have. We think that clear and regular communication is essential, especially given the expansion of sellside algorithmic offerings (previously most sellside desks had four or five algorithms, now some have well over twenty).
Bourne: I don't think there is a need to provide education in the general sense, but there is definitely a need to provide the buyside with education on the nuances of the banks' individual algorithms. The buyside is typically extremely clear as to which benchmarks they are interested in. Therefore they know what they are looking for in terms of performance and use transaction cost analysis to check that they are getting it.
The sellside now provides a considerable degree of flexibility to the buyside in terms of how algorithms actually execute. Therefore the educational focus is not so much on algorithms in general and how they execute but more on the ways in which that flexibility can be exploited to ensure that the algorithm outperforms the benchmark.
François Banneville: "I think the sellside has to do more than just sell algorithms."
Celebuski: It is less about the benefits of algorithms in general and much more about the specifics of where to use them appropriately. If you look at where volume is trading these days, you can see that it is polarised at two extremes. At one end are very small trades where liquidity is a continuous function of time, at the other are very large trades requiring capital commitment. There is very little liquidity in the middle. The buyside needs to be able to interact efficiently with the continuous pool of liquidity, and that is where algorithms come into play. The perceived loss of control is in my view a fallacy. The buyside still retains control in terms of selecting providers, algorithms and scheduling.
Conner: A policy of "Connect & Forget" when supplying algorithms to the buyside just doesn't cut it. Providing a high level of anonymous proactive service tailored to help buyside traders properly use algorithms is critically important. When a buyside trader takes on the increased responsibility of working their own orders, this introduces new risks into their day-to-day world; thus the need for increased control. When properly utilised, algos improve control - they are a tool that follows order instructions explicitly.
The employment of the wrong algorithmic strategy can clearly be a very costly exercise. How can the sellside help guard against this and ensure that clients are offered the most appropriate choice of solution?
Banneville: I think the sellside has to do more than just sell algorithms. It must also be able to deliver on the concept of a service and also a service desk - i.e. there must be a human behind the machine. A good sales trader will initially provide clients with advice on which algorithm to use and why. In addition they will monitor the activity of the algorithm on a real-time basis - not just the basic connectivity, but also checking that the performance of the algorithm is in line with its anticipated performance and client expectations. Therefore, even when trading with algorithms, there remains a relationship between the service desk and the buyside client. But as the staff are not situated with the regular sales and trading personnel, the integrity and confidentiality of the orders is maintained.
Sheridan: The sellside definitely has a responsibility for helping clients to find and implement the best algorithmic solution, on both a regular and ad hoc basis. A key factor in guarding against the application of the wrong strategy to a trade is to have a consultative discussion with each client before they begin actively using the algorithms, to ensure they understand how the strategies work and when to use them, and that we also understand their objectives and trading styles.
On an ongoing basis, sellside monitoring is also necessary, such as with an assigned sellside sales trader watching all the client's algorithmic flow so any issues or orders that look as if they are not in sync with their strategy will get flagged.
Bourne: I think the key to this is to follow simple naming conventions. We have seen some fairly exotic names being applied to algorithms. While this may appeal from a marketing perspective, it unfortunately tends to cause confusion as to what these algorithms actually do! Naming an algorithm after the client's target benchmark makes considerably more sense and will assist the client in selecting the right algorithm for the right job.
I think over-engineering is another thing to be avoided. Again, claiming that you have a huge number of extra bells and whistles in your algorithm might look good from a marketing perspective. However, if those bells and whistles do not allow clients to achieve optimal performance versus their benchmarks then no value is added. In addition, the risk of algorithm misapplication increases.
Celebuski: What this ultimately comes down to is the buyside strategy benchmark. For example, a value fund focused on individual stock selection typically has a long time horizon to capture alpha. Holding periods can be years. Therefore immediate execution isn't a realistic goal as the fund will have to pay too much to do that - a much longer trading timeframe may be more appropriate. (Contrast that with high frequency stat arb hedge funds that have to trade immediately to capture a fleeting opportunity.) The bottom line is there is a relationship between the alpha, the expected cost, and the trading strategy - and the buyside has to work out the balance of that relationship. Nevertheless the sellside still has a role here in providing tools that assist the buyside in assessing the optimal trade execution strategy.
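The balance Celebuski describes between alpha, expected cost and trading horizon can be caricatured with a toy model (all parameter names and values here are hypothetical, purely for illustration): spreading an order over more days reduces market impact, but gives up more short-term alpha, so each fund's decay rate implies a different optimal speed.

```python
# Toy model (hypothetical parameters) of the alpha/cost/strategy balance:
# impact cost shrinks as the order is spread over more days, while the
# alpha given up grows with the horizon.

def total_cost_bps(horizon_days, impact_coeff=20.0, alpha_decay_bps_per_day=3.0):
    """Expected total cost in bps for a given trading horizon."""
    impact = impact_coeff / horizon_days              # slower trading -> less impact
    alpha_given_up = alpha_decay_bps_per_day * horizon_days
    return impact + alpha_given_up

# A long-horizon value fund (low alpha decay) can afford a slow schedule;
# a stat-arb fund (high decay) must trade fast despite the impact.
best = min(range(1, 11), key=lambda d: total_cost_bps(d))
print("cheapest horizon (days):", best)
```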
Peter Sheridan: "While execution is obviously important, quality of infrastructure is also essential."
Conner: There are several ways in which this can be accomplished. The first is through the provision of a dedicated anonymous electronic central execution desk which proactively monitors customer activity to ensure smooth operation. For example, if the desk sees an order for a particularly illiquid stock being traded with an algorithm for very liquid stocks, someone can proactively call the client to check that this is not an error and perhaps suggest an alternative strategy.
Technology can also play a role here through an automated alert service available over multiple channels, such as email, text or instant messenger. This supplements the support from the electronic central execution desk by providing detailed activity feedback often not available via the buyside OMS or EMS. For example, if an order is too big in proportion to the average daily volume of a stock, the client can be automatically alerted to this and other "critical defence" type messages. This medium is also suitable for providing less critical "colour commentary" market/order information that clients may also wish to receive.
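The "critical defence" alert Conner mentions - flagging an order that is too big relative to a stock's average daily volume - could be sketched roughly as follows (the field names and the 25% threshold are illustrative assumptions, not any firm's actual rule):

```python
# Hypothetical sketch of an ADV-based "critical defence" alert:
# flag any order whose size is a large fraction of average daily volume.

def adv_alerts(orders, adv_threshold=0.25):
    """Return alert messages for orders exceeding a fraction of ADV."""
    alerts = []
    for order in orders:
        participation = order["quantity"] / order["adv"]
        if participation > adv_threshold:
            alerts.append(
                f"{order['symbol']}: order is {participation:.0%} of ADV "
                f"(threshold {adv_threshold:.0%}) - consider a longer horizon"
            )
    return alerts

orders = [
    {"symbol": "ABC", "quantity": 50_000, "adv": 1_000_000},   # 5% of ADV
    {"symbol": "XYZ", "quantity": 400_000, "adv": 800_000},    # 50% of ADV
]
print(adv_alerts(orders))
```

In practice such a check would feed the multi-channel alerting (email, text, instant messenger) described above rather than a simple print.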
Another trend I think we'll see is an expansion in pre-trade analytics combined with tools that can help the buyside craft their own algos for equities and other asset classes as well. This will allow clients to improve their initial algorithm selection and will dovetail nicely with the human and electronic sellside support while trading is in progress.
Finally - keep it simple and use names for algorithms that actually mean something to people.
Can you illustrate ways that providers might improve the current methods they use for transferring knowledge about how their algorithmic trading products work?
Banneville: A crucial element is the quality of sellside staff. They need to be both computer literate as well as trading literate. In addition, they need a quantitative element, because they have to be able to understand and explain what lies behind the algorithm.
Sheridan: One of the most important elements in this knowledge transfer is the quality and experience of the sales traders. In addition to direct client contact, sales traders are also part of an "execution consultancy" model that includes traders and financial engineers. This consultancy team can then work closely with clients to help design, implement and monitor optimal custom algorithmic solutions. Ironically, this model is not consistent with the common perception that algorithms are part of a "low-touch" strategy on the part of the sellside.
Bourne: I think the key here lies in working closely with clients to help them understand how the flexibility within an algorithm can assist them in achieving their benchmark.
In addition, there are some fundamental commercial questions that clients may need help in addressing. For example, do they want to trade the benchmark and use the algorithm on a target basis so the slippage is theirs, or do they want to buy it on a guaranteed basis from their provider so the slippage is the provider's?
Celebuski: One of the most important points is to avoid information overload. Clients need specific, but relevant, information on how algorithms work. While a provider (such as Bear Stearns) may have a high powered research team working continuously on discovering the optimal way to interact with the flow in the marketplace, the full depth of their research would take a long time to explain to a client. More importantly, the effort required wouldn't add much value from the client's perspective in terms of improving order execution.
Conner: In order to understand how buyside traders and their portfolio managers think about their overall process of trading, there is no substitute for the sellside "getting on airplanes" to meet directly with their customers. Only then, after listening and learning as much as possible about how they do business and what's new in their world, will you have the ability to explain how your electronic solutions might help them do their jobs more effectively using these low touch tools.
There is a common misconception that when someone connects directly to a broker supplied algo or the market, the best way to service the flow is by providing a basic reactive service with only an occasional clarification as to how something works. While this works perfectly for some, we often find that by proactively using trade examples, be they real or hypothetical, customers get a better handle on how our solutions can help them rather than just talking theory. How might the client execute an order using this or that algo? Why should they use that particular algo? What outcome could they reasonably expect? Of course, all this is further enhanced by other media forms such as updated product guides and effective post trade analysis etc. But as I mentioned earlier, trade automation must not result in human detachment.
Kevin Bourne: "The buyside is typically extremely clear as to which benchmarks they are interested in."
What sort of innovative sales and support services might be useful to help improve and facilitate buyside adoption of algorithmic trading strategies?
Banneville: I think it comes back to having the sort of sales traders who can provide proactive assistance to clients that is based upon a solid grounding of trading, technological and quantitative disciplines. If, for example, they observe a client using an algorithm in a sub-optimal manner (or perhaps even using a totally inappropriate algorithm for a trade) they can interact with that client in real time to suggest a better alternative.
Bourne: I think quite a few of the order management systems used by the buyside do not represent information about how a trade is executing versus a benchmark very effectively. As a result, we have seen a flurry of execution management systems appearing. In many cases they simply add a further layer of technical complexity that the buyside could often do without.
Therefore the provision of Web tools that illustrate this performance data in a simple and intuitive manner would add value. For example, if a bank has been chosen to execute a trade, the buyside client's dealing room would be able to track a graphical representation of that trade versus the benchmark they have selected. If there is a market or stock event - such as a news story that causes a price spike - then this would also be represented.
Therefore when the sales trader and buyside dealer have a conversation about a trade they can both do so in a transparent and well-informed environment.
Celebuski: Unless the buyside client is using software from the specific broker they are trading with, they can't really see where they are in relation to their execution schedule. While we can provide that information directly to clients via a number of channels, in most cases their internal order or execution management systems will not have this functionality integrated.
Tighter integration between third-party products and broker algorithms would therefore add value. As a result, we have been working with a number of EMS providers so that clients are able to see this information as a web service on their screen, rather than as a FIX message.
Conner: The big picture is important. From a sellside perspective, it is easy to fall into the trap of just leading with product and pushing algorithms or a broker owned EMS. Obviously they are a crucial item in the trader's toolbox, but they are only one part of the equation within a much larger, holistic client relationship.
Therefore sellside sales and support has to reflect that bigger picture. Yes, providing top flight algorithms is essential, but the whole package also needs to integrate other areas such as commitment of capital, access to banking deals, the ability to provide soft dollar and/or broker consolidation services.
Given that there is an overlap in terms of the basic functionality of many current algorithms, what factors are likely to influence the choice by the buyside of one sellside provider over another?
Banneville: Performance - full stop. Perhaps as little as three years ago, the buyside tended to use machines/automation as a way to reduce brokerage costs. Now there is the realisation that higher upfront brokerage costs are justified if the service provided sufficiently reduces trading impact so as to improve net trading performance. That is why transaction cost analysis is becoming increasingly popular as a means of understanding the true level of service across the various providers and the performance of their algorithms as regards the bottom line.
It is very easy to cut high-level commission by 5 basis points, but that is an overly simplistic approach. The buyside now realises that the opportunity costs of market impact far outweigh simple commission measures in terms of overall fund performance. At that level, the discrepancy in performance could be 75 basis points, not 5.
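Applying the round figures Banneville quotes to a hypothetical trade makes the asymmetry concrete (the $10m notional is an assumed example, not from the discussion):

```python
# Illustrative arithmetic behind Banneville's point, using the 5 bps
# and 75 bps figures from the discussion on a hypothetical $10m order.

notional = 10_000_000  # hypothetical order size in dollars
bp = 1e-4              # one basis point as a fraction

commission_saving = notional * 5 * bp    # cutting headline commission by 5 bps
impact_discrepancy = notional * 75 * bp  # performance gap from market impact

print(f"Commission saving:  ${commission_saving:,.0f}")
print(f"Impact discrepancy: ${impact_discrepancy:,.0f}")
```

On this trade the impact discrepancy is fifteen times the commission saving, which is exactly why TCA focused only on headline commission misses the point.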
However, to prove that performance point, I think the sellside has to be able to offer some hard numbers to support its assertions about algorithm performance. If the algorithms offered to buyside clients have already been tried and tested on the provider's prop trading desk, then they are able to make their provider selection on the basis of solid information.
Sheridan: Anonymity is key. As a result, it is critical that the sellside provider's algorithmic/DMA business is entirely separate from any proprietary businesses. The same level of separation should also apply to the technology platforms.
There is often a tendency to focus purely on the execution element when discussing algorithmic trading. While execution is obviously important, quality of infrastructure is also essential. Clients need a platform that they can rely on in any market conditions - robustness, built in redundancy, and ease of use and integration are very important. Lastly, the service level that the sellside provides is absolutely critical to both winning and maintaining business. The more the sellside can do to help buyside clients manage all aspects of their algorithmic flow, the better their execution performance will be.
Bourne: The major factor is undoubtedly performance. However, to assist the buyside in assessing that performance in a consistent manner there needs to be similarly consistent understanding of the benchmark selected. Therefore there is a need for major sellside firms to agree among themselves on the attributes of the most common benchmarks. That would allow the buyside to benchmark the performance of the sellside on a like-for-like basis.
Another factor is cost - but the attitude to this tends to be dependent to some extent on the size of the buyside firm. Smaller firms tend to be more focused on the cost of execution because it has a bigger impact for them. On the other hand, larger firms tend to have a strong appreciation of the trade-off between cost and performance.
If they are buying a benchmark on a guaranteed basis then they understand that the guarantee has a price because they also understand that this represents a form of capital commitment.
Celebuski: The fit between the providers' algorithms and the buyside clients' benchmarks is obviously important. That requires the buyside to undertake a consistent evaluation of possible providers against their benchmarks - a task that some on the buyside have been doing very effectively for a long time.
Nevertheless, I think there is a role for an independent third-party to examine algorithmic flow from a number of standpoints and benchmarks to assist clients in assessing this fit in a number of respects. For example, at present we have a huge amount of overlap in terms of scheduling and very little overlap in terms of limit order pool management. Implementation of the trading schedule differentiates one algorithm from another.
Conner: The key here is avoiding the commodity trap. In recent years the buyside has been swamped with sellside firms offering them a huge variety of algorithms and services. As a result, the buyside has understandably started to think in commoditised terms when shopping for sellside providers.
This topic also comes back to my previous point about the bigger picture - if the buyside just regards you as a seller of commodity algos then you have a problem. Therefore an important influencing factor is how seamlessly you tie the algorithms into a much broader suite of products and services that will allow the buyside client to leverage the overall relationship with your firm.
In what ways might the ability of new more innovative algorithms to create additional value for the buyside become a differentiator amongst sellside offerings?
Banneville: It is increasingly apparent that buyside clients don't want new algorithms, but algorithms more closely suited to their needs - one size doesn't fit all. A very simple example of this is an implementation shortfall algorithm; when you go to sell it to clients you quickly realise that implementation shortfall doesn't mean the same thing from one client to the next. Therefore what is required is relatively simple algorithms that are closely tailored to client needs.
This represents a significant change from the early days of execution algorithms, when it was a case of the sellside trying to come up with things they thought the buyside would want. Now we are getting the reverse of that, clients appreciate the value of algorithms but want this or that additional functionality. That might result in an improvement in the overall algorithm that benefits all clients, or it may just result in a particular tweak for a specific client.
Matt Celebuski: "One of the most important points is to avoid information overload."
Sheridan: Clearly the ability to innovate on behalf of clients and create truly customised solutions is the key to adding value going forward. Sellside firms that have modular offerings are better able to combine the various components using a "building block methodology" to quickly and efficiently create new solutions for clients.
This modular approach makes a considerable difference when clients are contemplating whether to build or buy. Their understandable assumption when confronted with providers who can only offer "one size fits all" solutions is that they will be better off building their own platform themselves, as they will get exactly what they need. However, a provider able to offer a modular approach will be able to offer them the same degree of tailoring as a self-built solution - but at a fraction of the cost and inconvenience.
Bourne: I don't believe that the investment should be in new algorithms, but rather in enhancing the performance of existing algorithms. That is because the existing algorithms already tend to reflect the objectives of the client accurately. Therefore several sellside firms are looking at integrating newsfeeds into their tick data, which will enhance its information value and assist in imposing discipline on the execution decision process.
Celebuski: One differentiator is how effective a provider's algorithms are in terms of acting as an aggregator for liquidity pools. Whatever form those liquidity pools take, algorithms can run a consolidated limit order book in ways that humans cannot.
Another differentiator is the extent to which providers can incorporate external information (whether that is news or some very short-term alpha) into their algorithms. For example, a news feed might be used to determine the amount of "chatter" in the system and whether or not that is creating volatile conditions. This in turn might trigger an alteration in trading style to prevent execution costs being inflated by that volatility.
Conner: In the US today, new sources of liquidity/market centres where a single name can trade seem to be cropping up every day. In fact, we have been experiencing a re-fragmentation of liquidity like the one we saw 5-6 years ago during the ECN explosion. However, this new breed of re-fragmentation presents a unique challenge, because many of these sources of liquidity don't broadcast the contents of their markets and are thus "dark" sources of liquidity.
To support clients effectively in this dispersed dark environment, sellside providers have to be prepared to be agnostic and allow their algorithms to check all sources of liquidity, including private liquidity sources; even if they are offered by competitors.
The value-add for the buyside from this occurs if their provider offers them this multi-sourced liquidity in a consolidated and above all intelligent manner. Simply interacting with all visible liquidity & connecting up to the dark pools is not enough. There also has to be intelligence in the way hidden liquidity is identified and accessed.
Is demand now outstripping supply for new algorithmic trading strategies or are there any obstacles hindering further take-up of algorithmic trading amongst the buyside?
Banneville: I think that depends on both the investor and the country. Some countries and some clients are more advanced than others. There are lots of algorithms and banks providing them, resulting in a highly fragmented market, which can cause confusion. However, broker neutral platforms can assist here by making it simpler and less costly for the buyside to try different algorithms before they commit.
Sheridan: The main issues that are currently constraining further take-up of algorithmic trading are limited buyside IT resources and FIX capabilities. Buyside IT teams often have massive priority lists and simply cannot move implementation projects along any faster. Therefore sellside providers need to respond to this by offering alternative versions of their core solutions that allow clients to access their algorithms and analytics without incurring any IT overhead.
Bourne: I think one of the greatest challenges to the take-up of algorithmic trading on the buyside is the functionality of their order management systems. Some of these were developed well before the concept of algorithmic trading existed and are therefore not readily adapted to incorporate algorithms. Even where they can, the buyside client may not necessarily be able to justify the cost of upgrading.
Another factor is whether buyside clients actually want to take direct control of algorithms, because in doing so they may transfer the (best) execution risk to themselves. In a post-MiFID environment some of them may not wish to do this.
Celebuski: You only have to look at the algorithmic trading growth rates the various research groups are predicting to see that if there are obstacles they are likely to be overcome. Having said that, education (or rather the lack of it) is a possible obstacle for some, as is simple technophobia.
Dave Conner: "...sellside providers have to be prepared to be agnostic and allow their algorithms to check all sources of liquidity..."

Conner: Quite a few buyside firms feel rather maxed out on algorithms. There is certainly a buyside perception that there is too much supply in the generic algorithmic space. Complicating things, there is also a buyside perception that they lack support from the sellside tailored to assisting them in making best use of the algorithms available. There is still very much a servicing gap in the market that only a few providers are really addressing correctly. Buyside traders have the constant challenge of balancing commission management with the need to achieve best execution. We are seeing a trend by the buyside towards aligning themselves more with full service firms for their low touch activities where their commission dollars can be best leveraged.
What functionality is likely to characterise the next generation of algorithmic trading systems?
Banneville: I think being able to access liquidity and scan a variety of marketplaces, rather than just going to the default market. Clients don't really mind where they trade as long as they can get things done and they can have access to liquidity. Therefore algorithms able to detect this liquidity and use it effectively to assist execution performance will probably be at the forefront. MiFID will drive further emphasis on this through its best execution requirements that will make it vital that orders are routed in the most efficient way possible.
Sheridan: The key things that we see as part of the "next generation" are:
• The provision of a single point of access (via an algorithm) to new liquidity pools
• Portfolio trading strategies that focus on the risk element and allow traders to minimise transaction costs through advanced risk management
• An increase in applications that provide clients with both auto-execution and 'request-for-quote' functionality for risk pricing
Bourne: The integration of newsfeeds and additional real-time data from multiple sources seem strong contenders. They will assist all parties that are observing the performance of a trade in assessing the likely event impact on that trade while it is being executed.
Celebuski: Tighter news integration is an obvious example, but another area of potential advantage is the use of weak form hedge fund type models. Hedge fund returns from quantitative strategies have declined over the past five or so years. As they have approached the returns of cash it has become disadvantageous to invest using these strategies. However, if those (now) weak models are able to predict a return of five basis points and there is a six basis point sunk cost to put on trades, then those models can obviously add value by reducing overall execution costs.
Another likely example of next-generation functionality is a true consolidated limit order book algorithm, which can search for all available liquidity. At the most basic level, the creation of such an algorithm is a significant plumbing and connectivity challenge. However, more intelligent versions of these algorithms will take things a step further by segregating out every single price and value to predict hidden liquidity. This avoids the problem of simple sweeping of visible prices where you might actually take a market up above where there is hidden liquidity on an alternative venue. We have focussed significant resources in this area.
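The "plumbing" layer of the consolidated book Celebuski describes - merging displayed offers across venues and sweeping the cheapest first - might be sketched as below (venue names, prices and sizes are all hypothetical; the harder "intelligent" layer that predicts hidden liquidity is deliberately out of scope here):

```python
# Minimal sketch of the connectivity/plumbing layer of a consolidated
# limit order book: merge displayed offers from several venues and fill
# a buy order against the best prices first. All data is hypothetical.

def consolidated_sweep(books, quantity):
    """Fill `quantity` shares against merged offers, best price first.

    `books` maps venue name -> list of (price, displayed size) offers.
    Returns a list of (venue, price, shares taken) fills.
    """
    offers = sorted(
        (price, size, venue)
        for venue, levels in books.items()
        for price, size in levels
    )
    fills, remaining = [], quantity
    for price, size, venue in offers:
        if remaining <= 0:
            break
        take = min(size, remaining)
        fills.append((venue, price, take))
        remaining -= take
    return fills

books = {
    "VenueA": [(10.01, 500), (10.03, 1000)],
    "VenueB": [(10.02, 700)],
}
print(consolidated_sweep(books, 1000))
```

Note that a naive sweep like this can walk the visible price up past levels where hidden liquidity sits on an alternative venue, which is precisely the problem the more intelligent versions address.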
Conner: Analytics in a consolidated EMS that works with multiple algorithms from different brokers, multi-asset class algos, and enhanced, understandable interactive and post-trade analysis are all areas we feel will define next-generation algos and/or algo support tools.
Over the coming year, what sectors of the buyside would you expect to see showing the most interest in algorithmic trading?
Banneville: Hedge funds are obviously using direct access but many of them use their own algorithms. However, we are seeing increasing use of algorithms by traditional long only funds. Algorithms also have something to offer corporates for things such as buyback programmes where there are strict rules, such as only sitting on the bid and not being allowed to lift the offer; a situation well suited to algorithmic execution.
Our view is that long only funds were a little bit behind the curve in terms of algorithmic adoption, but their appetite has now increased. Therefore 2007 will probably be the year in which their activity starts to lift off - particularly among mid-sized managers.
Sheridan: Both hedge funds and buyside program desks are already for the most part established users of algorithmic trading. Going forward, we expect to see more interest from active long-only investors, as their FIX capabilities improve and the sellside continues to roll out more options for them to access algorithms without the need for large IT spend. Finally, we would also expect to see long-only activity increasing in response to the roll out of new strategies - particularly liquidity seeking strategies.
Bourne: I don't believe it will be a case of a particular buyside sector predominating. Instead, more depth of usage amongst existing groups of users seems likely.
Celebuski: I think we're going to see the long only managers really expanding their use of algorithms, because they will need to be able to access all the available liquidity. For example, they may have trades representing several days of volume in a security, so the ability to set an algorithm working on part of that in a continuous liquidity pool while they manually shop for larger blocks, clearly adds value because it opens up access to both ends of the volume spectrum.
Conner: I don't think you can really segment growth in that way. To determine who the big adopters of algos will be in 2007, you need to identify the actual people who are in the buyside trader/dealer seat, and figure out what value they place on using algorithms as tools to help them achieve best execution. It could be that they work for a hedge fund, mutual fund or bank… Clearly, anybody that sees the benefits of having a hands-on approach and a desire to do their own trading in the most efficient way will be a likely taker of this sort of technology.