The Gateway to Algorithmic and Automated Trading

Sentimental smarts

Published in Automated Trader Magazine Issue 38 Autumn 2015

The era of unstructured information flow has made quant shops keenly interested in news sentiment analysis. Automated Trader finds out what the latest developments are and what's getting practitioners excited about the future.


Jae Hong Kil, CEO and portfolio manager at Sentiment Alpha Capital Management

The sentiment analysis space, says Jae Hong Kil, CEO and portfolio manager at Sentiment Alpha Capital Management, keeps growing and evolving. There's more data, more connections to data, and a greater trust in the results of backtesting as that data builds up over the years.

"All the quant shops are interested in it," said Kil. "The social media space is really gaining ground. People will find a way to use sentiment data for trading and there are many funds and day trading shops using it now."

Getting an estimate for what portion of the wider buy side community is using it is a tricky business, however. There's no specific research available, though an informal survey among experts pegs the figure at some 10%. And for the buy side that is using it, vendors providing data feeds report that there is little to no communication on how the information is implemented.

Founded in 2011, Sentiment Alpha aims to incorporate theories relating to the wisdom of crowds, behavioural finance and the predictive power of social media. The fund launched in 2013 and has 'single-digit millions' in AuM, with family and friends as the main investors. Once it establishes a track record, Kil plans to approach institutional investors.

The fund follows a 'low return, low risk' model, which translates to between 5% and 10% annualised return with a Sharpe Ratio of about 1.5 to 2.0. There's a new strategy in the works targeting a return between 20% and 30%, with a Sharpe Ratio of 2.0.

The company was born out of a research project started in 2004 at Stony Brook University on Long Island, New York, where Kil was a research assistant. The project aggregated news and social media, generating analytics, including sentiment data, for named entities.

After stints as a quant trader for Natixis, Kil was recruited back to Sentiment Alpha, which was ultimately spun out of the research project.

With a proprietary natural language processing (NLP) engine, the firm collects data from hundreds of millions of online sources and has an archive of some 100 terabytes of raw data that it processes using many algorithms, machine learning among them.

"Each named entity has a sentiment score for a given day. It can be iPhone 6, or it can be Apple, or Microsoft, or a person's name. We keep track of every single named entity mentioned - that's like 2 billion named entities," said Kil.
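A minimal sketch of what such an entity-level daily score store could look like (the schema, names and score range here are invented for illustration; Sentiment Alpha's actual pipeline is proprietary):

```python
from collections import defaultdict
from datetime import date

# Hypothetical store: entity -> day -> list of mention-level sentiment scores
mentions = defaultdict(lambda: defaultdict(list))

def record_mention(entity: str, day: date, score: float) -> None:
    """Append one mention's sentiment score (e.g. in [-1, 1]) for an entity."""
    mentions[entity][day].append(score)

def daily_sentiment(entity: str, day: date) -> float:
    """Average mention scores into a single daily score for the entity."""
    scores = mentions[entity][day]
    return sum(scores) / len(scores) if scores else 0.0

record_mention("Apple", date(2015, 9, 28), 0.6)
record_mention("Apple", date(2015, 9, 28), -0.2)
print(round(daily_sentiment("Apple", date(2015, 9, 28)), 2))  # 0.2
```

At scale, the same keyed layout maps naturally onto a column store; the point is only that every named entity accumulates one score per day.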

Meanwhile, social media usage is exploding. When Stony Brook began the project, Twitter didn't even exist. Now, it's the darling of financial circles.

As usage grows and years tick by, Kil expects confidence among investors to grow.

"It's still hard to raise funds, but much better compared to three or four years ago. We have to fight to convince investors why this works, why we have competitive advantages. This is new, this is not traditional," said Kil.

Indexing sentiment

Richard Peterson, Managing Director, MarketPsych


MarketPsych's managing director, Richard Peterson, is all too familiar with capital raising troubles. Between September 2008 and the end of 2010, he put his experience - which includes a degree in electrical engineering, a doctorate in medicine, a residency in psychiatry, and postdoctoral research in neuroeconomics - to the test with a market neutral US equities fund. Net returns (minus fees) over the S&P 500 were 24% during that period.

But the financial crisis made it difficult to raise money. "Few trusted fund-raisers after Madoff," he said.

At the same time, financial institutions were calling him up for data, and he has pursued that through a partnership with Thomson Reuters to produce and distribute a sentiment index.

Peterson estimates that some 90% of clients are buy side, albeit a wide variety: high frequency traders, long only mutual funds, CTAs. There are also economic research departments.

But he does note a lack of adoption across the financial community.

"One of the barriers to adoption is the inappropriate use of linear statistical techniques during testing," Peterson said.

Linear regression models, for example, are not particularly well suited for non-linear sentiment events, like fear associated with market panic.

"Linear models can't account for VIX index spikes. And ignorance of how to model such spikes accounts for why 'black swan' funds do well over time," he said, referring to Universa, a hedge fund advised by Nassim Taleb that reportedly made a billion dollars during the August market plunge.

Preferred techniques, Peterson added, include decision trees, cross-sectional arbitrage of extreme values, regime specific models, or moving average crossovers of information.

"Many traders aren't mathematically literate in the way they need to be to model markets. Some apply linear techniques assuming a market with normal price change distribution," he said.
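As an illustration of the last of the techniques Peterson lists, here is a toy moving-average crossover applied to a daily sentiment series rather than to price. The window lengths and signal convention are invented, not Peterson's:

```python
def moving_average(series, window):
    """Simple trailing moving average; None until there is enough history."""
    return [
        sum(series[i - window + 1:i + 1]) / window if i >= window - 1 else None
        for i in range(len(series))
    ]

def crossover_signals(sentiment, fast=3, slow=7):
    """+1 when the fast MA crosses above the slow MA, -1 on a cross below."""
    fast_ma = moving_average(sentiment, fast)
    slow_ma = moving_average(sentiment, slow)
    signals = []
    for i in range(1, len(sentiment)):
        if None in (fast_ma[i], slow_ma[i], fast_ma[i - 1], slow_ma[i - 1]):
            signals.append(0)
        elif fast_ma[i - 1] <= slow_ma[i - 1] and fast_ma[i] > slow_ma[i]:
            signals.append(+1)   # sentiment momentum turning positive
        elif fast_ma[i - 1] >= slow_ma[i - 1] and fast_ma[i] < slow_ma[i]:
            signals.append(-1)   # sentiment momentum turning negative
        else:
            signals.append(0)

    return signals

sentiment = [-1.0] * 7 + [1.0] * 7   # sentiment flips from negative to positive
print(crossover_signals(sentiment))  # a single +1 as the fast MA crosses above
```

Unlike a linear regression, a rule like this reacts to regime shifts without assuming anything about the distribution of the underlying series.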

In general, Peterson identifies a swath of weak research and unreliable results. For example, many academics create their own sentiment analysis engines, which are necessarily 'primitive', he noted.

In the past, some academics used the Harvard General Inquirer, an open source dictionary based on 19th century British literature, with words identified as connoting negative sentiment including 'investor' and 'financier'.

"Many articles about investing default to negative sentiment based on such a dictionary, which is ridiculous," Peterson said.


Source: MarketPsych
Buzz around the Russian ruble, illustrating how media data can be nonlinear, with bursts and spikes.

Reactive predictions

Whether analytics is best used for its predictive or reactive features is the subject of a significant debate across the industry.

Charlotte Wall, Managing Director and Head of Sales and Marketing, OTAS


OTAS Technologies recently launched a natural language reporting product called Lingo. It produces reports containing updated analysis of standout and unusual activity in stocks. Clients include Franklin Templeton, Fidelity and Union Invest. The majority of users are buy side and hedge funds, though OTAS has been moving into sell side more recently, helped by a recent partnership with Fidessa.

The analytics service is not about predicting events, however; it's about providing decision support, said Charlotte Wall, managing director and head of sales and marketing at OTAS. Essentially, an alert will identify any major news events when there is a stock market move. For example, if there's an earnings downgrade on a company, an alert would show a news item at that time.

This is integrated with existing analytics apps that look at real-time execution using factors such as volume, liquidity and spread. OTAS takes raw data across all lit markets and uses mathematical modelling to trigger when there are deviations on a single stock basis relative to a basket of peers, for example.
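A simplified sketch of that kind of peer-relative trigger, using a z-score against a peer basket (the threshold and return figures are illustrative; OTAS's actual models are proprietary):

```python
import statistics

def deviation_alert(stock_return, peer_returns, threshold=2.0):
    """Flag a stock whose move deviates from its peer basket by more than
    `threshold` standard deviations of the peers' returns."""
    mu = statistics.mean(peer_returns)
    sigma = statistics.stdev(peer_returns)
    if sigma == 0:
        return False, 0.0
    z = (stock_return - mu) / sigma
    return abs(z) > threshold, z

# Illustrative daily returns for a basket of peers, and one stock that
# has fallen 3.5% while its peers are roughly flat.
peers = [0.001, -0.002, 0.0005, 0.0015, -0.001]
alert, z = deviation_alert(-0.035, peers)
print(alert)  # True: the move is far outside the peer distribution
```

The real system works across volume, liquidity and spread as well as price, but the single-stock-versus-basket comparison is the common pattern.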

Though proprietary quantitative metrics like divergence underpin what OTAS does, clients were also pushing for news analytics. But they can't have it all.

"There's just going to be this huge amount of information that is not actionable, so what we are doing is teaching our machine to make it relevant, and link it to other things," Wall said. "News is noise in the current format. However, we do think it's valuable if you put it into context."

OTAS is launching the news overlay along with machine learning tagging that has already been applied to other metrics.


Source: OTAS Technologies
Current metrics across market observables, which includes the news stamp to be launched in October.

NLP breaks through

The real breakthroughs in the past five years have been in the use of machine learning technology in natural language processing to adapt to specific domains. What works well for movie reviews doesn't work well in the space of enterprise software.

Prem Melville, Founder and CEO, Social Alpha


"Unstructured data is very different from structured data, and when you move from nice clean sources like financial news to social media, it's a whole new set of challenges," said Prem Melville, founder and CEO of Social Alpha, a vendor providing predictive social analytics.

"If you take an off-the-shelf technique and try to trade on social sentiment, you can actually underperform the market versus using a model that is specifically catered to capital markets," he said.

He referred to research showing that by gauging news sentiment before markets open, and going long on positive and short on negative news, you'd be right about the direction a little more than 50% of the time. But if you always predicted the market would go up, you'd be right 51% of the time. Compare that to models trained specifically for capital markets, which would correctly predict the direction of stock movements more than 55% of the time.
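The arithmetic behind that baseline is worth making concrete: an 'always up' rule scores the market's up-day frequency for free, so any sentiment model has to beat that, not 50%. A small sketch with invented directions:

```python
def accuracy(predictions, actual):
    """Fraction of days on which the predicted direction matched the market."""
    hits = sum(p == a for p, a in zip(predictions, actual))
    return hits / len(actual)

# Illustrative directions over 20 days: +1 = up, -1 = down (12 up days).
actual = [+1] * 12 + [-1] * 8

always_up = [+1] * len(actual)      # the naive baseline
print(accuracy(always_up, actual))  # 0.6: beats a coin flip for free
```

Because markets drift upward in most samples, a directional model quoted at "just over 50%" can in fact be losing to the do-nothing baseline.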

Mining machine readable news has been around for some 10 years, with leading hedge funds pioneering its use. But the forefront of technical analysis now is social media analysis.

"Social media is the fastest moving source of information," he said. "But it is a very noisy channel. If you look at Twitter, you are seeing over 500 million tweets a day. So, how do you get to the information that really matters given this high volume and high velocity of content?"

Social Alpha likes to think of what it does as "information triage". Sentiment analysis is one component of doing just that. Social media analytics, said Melville, is explicit crowd sourcing, or "just listening to everyone".

Technology to do that may have a long way to go but it's also come a long way.

"We are very far from a human level of natural language understanding, but there have been significant breakthroughs in the last 10 years through the application of machine learning in the domain of natural language processing," Melville said.

For one thing, there's been a conspicuous shift from hand crafted models written by linguists, to completely data-driven approaches. "Where we now have so much data in specific domains, we can have machine learning algorithms learn or induce models directly from observing the data."

Moreover, a resurgence in neural networks research through deep learning has been boosted by sophisticated hardware, such as GPUs.

And when institutions like the Bank of England start doing it to make predictions, people take notice.

The central bank recently set up a taskforce to monitor internet and social networks. Andrew Haldane, the BoE's chief economist, told Sky News that they've already been working with unstructured data to help get an idea of the economic recovery.

Melville noted that the first forays in this area have yielded a great example of a common pitfall. By trying to look for signals indicating a panic-driven bank run caused by the Scotland referendum on independence, they got football. This had been captured because they combined the term "run" and the abbreviation "RBS". But in this context, the reference was to 'Running Backs' and not the 'Royal Bank of Scotland', explained Melville.

"What they should be looking for is people expressing fears and anxiety around the banks, and around the upcoming referendum. You are looking for a second order signal there," he said. "It's not people talking about the event itself."
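The pitfall Melville describes is easy to reproduce. A naive keyword co-occurrence filter (a deliberately crude sketch, not the Bank of England's actual query) fires on football chatter just as readily as on genuine bank-run anxiety:

```python
def naive_bank_run_filter(tweet: str) -> bool:
    """The kind of crude co-occurrence filter Melville warns about:
    fires whenever 'run' and 'RBs'/'RBS'-style tokens appear together."""
    text = tweet.lower()
    return "run" in text and "rbs" in text

tweets = [
    "Top fantasy RBs this week: who will run for 100+ yards?",        # football
    "Queues forming outside RBS branches, fears of a run on the bank",
]
print([naive_bank_run_filter(t) for t in tweets])  # [True, True]
```

Both tweets match, which is exactly the failure mode: the second-order signal - fear and anxiety around banks and the referendum - needs sentiment and context, not token matching.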


Source: Social Alpha
Social Alpha dashboard screenshot from 28 September, 2015. Downtrend in social sentiment is clearly reflected in stock prices.

Signals in the noise

Armando Gonzalez, CEO, RavenPack


Sifting through noise is what RavenPack does with its news analytics service. The bulk of its clients are hedge funds and asset managers, as well as tier one and tier two banks. Within that group, those with quantitative or event-driven strategies tend to have the most interest.

On the sell side, brokers developing algorithmic trading strategies for executing trades are represented.

There's also been growth in news analytics being used in compliance - either by banks or by vendors that provide surveillance tools. For regulatory purposes, risk managers are looking to RavenPack's data feed to understand whether clients are trading before or after the news, looking at time spans down to the millisecond.

The predictive perspective, explained Armando Gonzalez, RavenPack's CEO, is that the information extracted from news or social media is timelier than fundamental data.

Whether for corporates or for economic indicators, fundamental data is always backward looking. Price, he added, is arguably backward looking as well.

"Price is an expression of what investors thought about the stock a millisecond ago, a day ago, a week ago. There isn't any inherent predictive information in price transactions," he said. "But the theory is that there is when you start to look at trends and patterns that emerge from multiple opinions, or pieces of information put together in a way that is unique, gives you an edge."


Source: RavenPack

That's where the predictive power of unstructured data analysis, also called big data analysis, can be found.

There are many cases in which news on company performance arises with great expectations for a product launch, for example. But as euphoria fades and reviews come in poor, the anticipated outcome at the next earnings announcement could see a correction in share price, Gonzalez explained.

"Individuals that tie those pieces together are ultimately able to make better predictions on company performance," he said. "It's a cyclical process. But we do need to find and understand better what information is explanatory versus what is predictive."

Although Gonzalez specialises in news and social media sources, he's keeping a keen eye on developments in harnessing web traffic information, or the 'digital footprint' of companies, as a predictive source - tracing company activities that signal preparations to launch a product in a new industry or country, for example.

He also referred to the internet of things as a great opportunity in a future where tracking sensors could detail a company's production.

And then there's the abundance of information trapped in banks themselves, the users of information.

"Financial institutions have been producing research for decades, so they have tonnes of pages that analysts have accumulated," Gonzalez said. "When you train analytical technology to understand relationships and patterns, there's great opportunities for modern day big data analytic investment banks."

Source: TABB Group

Hack attack

As opportunities grow so too do risks, and cyber security is identified as one of the warning flags. Relying on public data, including social media, and its abundance of false information should give any practitioner pause for thought.

One of the most spectacular examples of hooliganism leading to market response is the 'hash crash'. In 2013, hackers took over the Associated Press' Twitter account to announce a bomb had gone off at the White House, injuring President Barack Obama. Within seconds the Dow lost 100 points.

"Market manipulation becomes an immediate reality as a result of fake information published on reliable, verifiable sources. We haven't really understood what problems this might have and might cost," said Gonzalez.

Commentators noted that although this is a concern, such manipulation is also against the law. After all, there will always be people who break the rules in any industry.

Meanwhile, filters are getting better all the time, said James Cantarella, business development manager at Thomson Reuters.

"People like to hack and cause trouble, especially if there's a way to make money off it. But I think people are getting better at figuring out what's real and what's not," he said.

Thomson Reuters has an internal project that aims to validate tweets. Cantarella pointed out that in the case of the 'hash crash', the style that the tweet was written in and where the tweet was published from would have given away that something didn't add up.

Moreover, it's hardly a problem stemming from automated systems. "It seems to me that the reaction to the AP hack was much more from people sitting at desktops hitting buttons. I am less afraid of automated systems reacting to erroneous unstructured data, partly because people still have quite a bit of distance between their trading boxes and tweets," he said.

Thomson Reuters' machine readable news was launched in 2005, followed by a sentiment engine in 2008. The typical client comes from the quant equity hedge fund space.

Social media sentiment and analytics, Cantarella said, is gaining a lot of attention but not necessarily traction.

"There's been any number of start-ups that are getting into this space. It doesn't seem that anybody has done it strongly enough to really capture the demand," he said. "Getting a signal that's good, predictable, consistent, and something that you can also backtest, there just isn't the history there."

Aside from traditional users, Cantarella also noted sell side research and execution desks taking a greater interest, as well as compliance, with exchanges, brokers and the largest sell side firms increasingly interested in surveillance and risk functions.

Temporal risks

Dan DiBartolomeo, Owner, Northfield Information


For the largest traditional buy side clients, however, it's uncertain how much value there is to be gained reacting to actionable news sentiment signals. This is particularly true for the short term, said Dan DiBartolomeo, owner of Northfield Information.

Northfield does not provide any predictive analytics for trading purposes. Its focus is strictly on risk, liquidity and trading costs. In 1997, a big options trading firm wanted to assess risk of financial instruments over a short horizon, and DiBartolomeo's team started investigating models.

"All models in some way look at history, but the real question is: how do we understand the present to be different from the past? If conditions change and you are working from a purely historical context, then you have to wait for some new history to update your understanding and we wanted to operate much more quickly than that," he said.

Northfield's clients are some of the largest traditional asset managers around - BlackRock, Fidelity and CalPERS for example. Too large, said DiBartolomeo, to take full advantage of alpha generation.

"Even if they had an edge, if by analysing news they can see that this stock is going to go up today, they'd never be able to do the trades in sufficient size to make it worthwhile. They are just so big that there's not enough liquidity," he said. "Unless you come up with a way of using news that has a fairly long shelf life - next month, or two or three months - really big funds aren't going to utilise it."

The next big step forward in using news analytics in risk modelling, he believes, will be a methodological recognition that the world changes.

"From a conceptual perspective, the key here is: yes, history is important, because things don't change that quickly, but we also have to recognise that they do change."

Risk, he added, is about the future, not the past.

"You only get hit by a car that you don't see coming. You wouldn't stop looking both ways crossing a street just because a car hasn't passed this particular road in the last four hours," he said.

"But that is precisely what we do in financial markets all the time. We assume that we can look at some past period and assume the future. That's just silly," he said.

Matt Jenkins, Chief Data Officer, Digital Contact


Digital Contact is launching a real-time analytics platform for investors, traders and researchers. Its big data archive collects live price data, social media volume, sentiment analysis and key word extraction, among other things. We talk tech with Digital Contact's chief data officer, Matt Jenkins.

Automated Trader: How are you using machine learning?

Matt Jenkins: One of the biggest challenges with surfacing hundreds of millions of pieces of data and sorting through content that is relevant to different companies is, how do we group bits of media?

We use a vector space modelling system, which is a shallow neural network. You can train it up on as much information as you can throw at it. It will look at a particular word and try to figure out the probability of that word existing in conjunction with other words within a sentence.

One of the huge advantages that gives is that whereas previous natural language models treat sentences such as 'this film is good' and 'this movie is great' as separate because they contain different words, the system will learn they are incredibly closely correlated.

The words 'film' and 'movie', and 'good' and 'great' have similar vector scores.
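With invented two-dimensional vectors (real embeddings run to hundreds of dimensions, and these coordinates are made up for illustration), the idea can be sketched with cosine similarity:

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two word vectors: near 1.0 means the
    words appear in very similar contexts."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Invented toy embeddings: similar words get similar coordinates.
vectors = {
    "film":  [0.92, 0.35],
    "movie": [0.90, 0.40],
    "bank":  [0.10, -0.85],
}
print(cosine_similarity(vectors["film"], vectors["movie"]))  # close to 1.0
print(cosine_similarity(vectors["film"], vectors["bank"]))   # much lower
```

A model trained on raw co-occurrence counts places 'film' and 'movie' (and 'good' and 'great') close together, so the two sentences end up near-identical in vector space even though they share no words.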

AT: Have you had to make any major adjustments to the models?

MJ: A lot of models have been based on static data sets. But you need to make sure that any new articles are getting classified accurately, and also that previously classified information gets changed and moved around. We are using a few different clustering methods for that. We have modified a probabilistic approach, BIRCH.
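As a rough illustration of the streaming requirement Jenkins describes, here is a deliberately simplified incremental clusterer in the spirit of BIRCH: each new article vector either joins the nearest cluster, updating its centroid, or starts a new one. Digital Contact's modified BIRCH is far more sophisticated; the radius and vectors below are invented:

```python
import math

class IncrementalClusterer:
    """Toy stand-in for BIRCH-style streaming clustering: items are
    assigned one at a time, without re-clustering the whole archive."""

    def __init__(self, radius=0.5):
        self.radius = radius
        self.centroids = []   # running mean vector per cluster
        self.counts = []      # number of items per cluster

    def add(self, vec):
        """Assign `vec` to a cluster, creating one if none is close enough."""
        if self.centroids:
            i, d = min(
                ((i, math.dist(vec, c)) for i, c in enumerate(self.centroids)),
                key=lambda t: t[1],
            )
            if d <= self.radius:
                n = self.counts[i]
                # Update the running mean centroid with the new vector.
                self.centroids[i] = [
                    (c * n + x) / (n + 1) for c, x in zip(self.centroids[i], vec)
                ]
                self.counts[i] += 1
                return i
        self.centroids.append(list(vec))
        self.counts.append(1)
        return len(self.centroids) - 1

clus = IncrementalClusterer(radius=0.5)
labels = [clus.add(v) for v in [[0.0, 0.0], [0.1, 0.0], [3.0, 3.0]]]
print(labels)  # [0, 0, 1]
```

The key property, which real BIRCH shares, is that classification happens on arrival, so the archive is never frozen into a static model.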

AT: And on the storage side?

MJ: We use Hadoop for our storage and retrieval. It's well established as a fantastic open source community and getting developed all the time.