The search for structure
Whether or not a quant modeller can articulate it, good algorithmic trading is ultimately a search for structure in the noisy data of markets. It is about finding patterns, regularities or pockets of predictability. Here is a simple example of what is meant by structure. Suppose we observe that whenever the market goes up two days in a row, it usually goes up on the third day as well. If this happens often enough, we have found the pattern or regularity we were looking for, and the trading strategy follows immediately: if the market goes up two days in a row, buy at the close of the second day and sell at the close of the third. If only!
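The pattern-hunt above can be sketched in a few lines. This is a toy illustration on simulated coin-flip data, not real prices; the point is that on pure noise the conditional hit rate hovers near 50%, which is exactly the baseline any claimed pattern must beat.

```python
# Toy check of the "two up days, then a third" pattern on simulated
# coin-flip returns (an assumption standing in for real price data).
import random

random.seed(42)

# Simulate 1,000 daily up/down moves from a pure-noise market.
returns = [random.choice([1, -1]) for _ in range(1000)]

two_up_then_up = 0
two_up_total = 0
for i in range(2, len(returns)):
    if returns[i - 2] > 0 and returns[i - 1] > 0:
        two_up_total += 1          # two consecutive up days observed
        if returns[i] > 0:
            two_up_then_up += 1    # ...followed by a third up day

# On genuinely random data this ratio hovers near 0.5: no edge.
hit_rate = two_up_then_up / two_up_total
print(round(hit_rate, 3))
```

Any genuine pattern has to push this conditional frequency meaningfully and persistently away from the noise baseline before it is worth trading.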
It is easy to get fooled by randomness and see patterns that in hindsight seem nonsensical at best. Technical analysis books are strewn with all sorts of patterns with colourful names (cup and handle!), most of which will not stand even mild statistical scrutiny, much less any systematic way of teaching a computer to identify them.
To compound the problem, markets are, to a first approximation, just noise. Yet they fail statistical tests of randomness in tantalizing ways. The histogram of daily returns of most assets looks nearly bell-shaped, which is what randomness would prescribe, except that the tails of the distribution are much fatter, i.e. periodically markets make much, much larger moves than a normal distribution would allow.
When we study the dynamics of daily moves rather than their aggregation in a histogram, things get even more interesting. If markets were indeed a random walk, today's move would be independent of yesterday's. Not so, say the econometricians: a cottage industry of models (the GARCH family) suggests that big moves (in absolute terms) are usually followed by big moves, i.e. there are times when markets remember yesterday's moves. Panic and greed leave the markets quite shaken, with bursts of volatility! This structure has been modelled by academics and used by practitioners to model volatility for derivatives. From a trading perspective it serves not as a strategy in itself, but as a signal of the onset of volatility and a cue to lower trade size during such periods.
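A minimal sketch of volatility clustering, assuming a textbook GARCH(1,1) simulation with made-up parameter values: absolute returns come out positively autocorrelated even though the signed returns themselves do not.

```python
# GARCH(1,1)-style simulation: today's variance depends on yesterday's
# shock, so big moves cluster. Parameter values are illustrative.
import math
import random

random.seed(0)
omega, alpha, beta = 0.05, 0.1, 0.85   # assumed GARCH(1,1) parameters
var = omega / (1 - alpha - beta)       # start at the unconditional variance

rets = []
for _ in range(5000):
    r = math.sqrt(var) * random.gauss(0, 1)
    rets.append(r)
    var = omega + alpha * r * r + beta * var  # big moves raise tomorrow's variance

def autocorr(xs, lag=1):
    n = len(xs)
    mu = sum(xs) / n
    num = sum((xs[i] - mu) * (xs[i - lag] - mu) for i in range(lag, n))
    den = sum((x - mu) ** 2 for x in xs)
    return num / den

abs_ac = autocorr([abs(r) for r in rets])   # clearly positive: clustering
raw_ac = autocorr(rets)                     # near zero: no direction memory
print(round(abs_ac, 3), round(raw_ac, 3))
```

The asymmetry is the point: the market "remembers" the size of yesterday's move, not its sign, which is why this structure is useful for sizing trades rather than directing them.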
There is more that academics have discovered about financial time-series. It may be hard to figure out what the future holds for a single time-series, but academics have found techniques (called co-integration) that allow us to create portfolios of longs and shorts in different assets such that the value of the portfolio oscillates around a mean, much like a sine wave. Again, this is the regularity a modeller is looking for, and the strategy is clear: buy the portfolio when it reaches the bottom of the sine wave and sell it when it reaches the top. The trouble, alas, is that markets change, the relationships break down and the sine wave starts drifting out of its band, or narrows to the point where the strategy is unprofitable. It is a non-stationary world. The message to a quant is clear – there is structure to be exploited, but beware of noise, fat tails and non-stationarity.
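The mean-reversion idea can be sketched as follows, on a simulated spread rather than real co-integrated assets; the entry and exit thresholds (plus/minus one z-score, flat near the mean) are illustrative assumptions, not anyone's actual trading rules.

```python
# Toy mean-reversion strategy on a simulated co-integration spread.
# The spread follows a simple mean-pulling process (an assumption).
import random

random.seed(1)

spread = [0.0]
for _ in range(2000):
    s = spread[-1]
    spread.append(s - 0.1 * s + random.gauss(0, 0.5))  # pull toward zero plus noise

mean = sum(spread) / len(spread)
std = (sum((s - mean) ** 2 for s in spread) / len(spread)) ** 0.5

position, pnl = 0, 0.0
for t in range(1, len(spread)):
    pnl += position * (spread[t] - spread[t - 1])  # mark the held position to market
    z = (spread[t] - mean) / std
    if z < -1:
        position = 1    # spread cheap: buy the portfolio
    elif z > 1:
        position = -1   # spread rich: sell it
    elif abs(z) < 0.1:
        position = 0    # back near the mean: go flat
print(round(pnl, 2))
```

The fragility the text warns about shows up here too: replace the mean-pulling term with a drift and the same rules buy an ever-cheaper spread that never comes back.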
The search for structure usually begins with an insight about markets. This could come from academia, from traders, or simply from a combination of financial, mathematical and computational skills. An example from academia is the so-called factor model for stocks. The goal here is to explain the daily move of a stock by a variety of factors, which could be macro-economic variables, broad market indices or other technical factors. The crucial test of the model is that the part of the daily return that is unexplained by the factors (the residual) should again behave like a sine wave. We are back to a world of regularity.
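A one-factor version of this can be sketched with a simple least-squares regression on simulated data; the beta and noise levels below are made-up numbers, and real factor models use many factors and an intercept.

```python
# One-factor model sketch: stock return = beta * market + residual.
# Data is simulated; true_beta and the noise scale are assumptions.
import random

random.seed(7)

n = 1000
market = [random.gauss(0, 1) for _ in range(n)]
true_beta = 1.2
stock = [true_beta * m + random.gauss(0, 0.5) for m in market]

# Ordinary least squares for a single factor with no intercept:
# beta_hat = sum(stock * market) / sum(market * market)
beta_hat = sum(s * m for s, m in zip(stock, market)) / sum(m * m for m in market)

# The residual is the part of the daily move the factor cannot explain.
residuals = [s - beta_hat * m for s, m in zip(stock, market)]
resid_mean = sum(residuals) / n
print(round(beta_hat, 2), round(resid_mean, 3))
```

It is this residual series, once the factor exposure is stripped out, that the modeller hopes will oscillate like the sine wave of the previous paragraph.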
Another example, this one inspired by traders, is the so-called carry model for currencies. When interest rates in Australia are significantly higher than in Japan, it is tempting to borrow yen at cheap rates and invest in Australia. This requires considerably longer holding periods and is prone to periodic but violent corrections. Where a model differs from a human being is that it is impervious to fear and greed. So although the idea originates with traders, the modeller tests the strategy on past history, recognises the deficiencies of a naive strategy, and uses the full power of mathematical and financial skills to create portfolios of currencies and signals that shield the strategy from the extremes of these violent corrections.
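A naive carry signal might be sketched like this; the interest rates below are made-up illustrative numbers, and the two-long/two-short portfolio construction is an assumption for the example, not the actual model described in the text.

```python
# Naive carry ranking: long the highest-yielding currencies against
# the funding currency, short the lowest. All rates are illustrative.
rates = {            # assumed annualised short rates, not real data
    "AUD": 0.045,
    "NZD": 0.040,
    "USD": 0.020,
    "EUR": 0.010,
    "CHF": 0.005,
    "JPY": 0.001,
}
funding = "JPY"      # fund in the cheapest currency, as in the yen example

# Carry of each currency versus the funding leg.
carry = {ccy: r - rates[funding] for ccy, r in rates.items() if ccy != funding}
ranked = sorted(carry, key=carry.get, reverse=True)

longs = ranked[:2]    # highest carry: long
shorts = ranked[-2:]  # lowest carry: short, to hedge a pure long-carry book
print(longs, shorts)
```

A real carry model adds exactly what this sketch lacks: signals that scale the book down ahead of the violent corrections the paragraph warns about.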
The hazards of modelling
Once the trade idea emerges, the task is to code this insight into a strategy by teaching the machine when to get into a trade, how much to trade and when to get out. Machines are amazingly fast at performing calculations but demand an exacting degree of precision in decision making. Simply teaching a computer to buy a portfolio "roughly" when it is at the bottom of the sine wave requires a lot of code and parameters that control the decisions. Over-parametrisation is the bane of many a model: the more knobs and switches there are on the dashboard, the more tempting it is to use them, until you reach a point where you have no clear understanding of what is the source of the p/l and what is just cosmetic.
In our anxiety to capture all the trades in the past, we sometimes inadvertently make the system overly complex, and make it worse by choosing parameters that work best on past data but perform poorly in the future – overfitting, as practitioners call it. There are other subtle and not-so-subtle ways in which a modeller can trip up. Having too many models of a similar type, or looking ahead in the data, are common mistakes. Chasing a phenomenon that has no discriminatory ability in picking the inflection points of a time-series is a mistake even a seasoned modeller can make.
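Overfitting by parameter search can be demonstrated directly: among many random "parameter settings" backtested on pure noise, the best in-sample performer looks impressive by construction, yet carries no information about unseen data. Everything below is synthetic noise, deliberately.

```python
# Overfitting demo: pick the best of 50 random signals on in-sample
# noise, then backtest that same signal out of sample.
import random

random.seed(11)

n_train, n_test, n_params = 500, 500, 50
train = [random.gauss(0, 0.01) for _ in range(n_train)]
test = [random.gauss(0, 0.01) for _ in range(n_test)]

def backtest(rets, seed):
    # Each "parameter setting" is just a different random +/-1 signal,
    # i.e. a strategy with no real information whatsoever.
    rng = random.Random(seed)
    return sum(rng.choice([1, -1]) * r for r in rets)

# Choose the setting with the best past (in-sample) performance...
best = max(range(n_params), key=lambda s: backtest(train, s))
in_sample = backtest(train, best)
out_of_sample = backtest(test, best)

# ...and watch the apparent edge evaporate on unseen data.
print(round(in_sample, 3), round(out_of_sample, 3))
```

The in-sample figure is positive almost by definition (it is the maximum of fifty coin flips), which is precisely why backtested performance alone proves nothing.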
A key to modelling is to incorporate realistic transaction costs; otherwise it is easy to come up with strategies, especially in high-frequency trading, that seemingly make a lot of money in theory but are sure money-losers in reality. Brokers make a living out of transaction costs – it is important to remember that in modelling!
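The effect of costs can be sketched by running the same signal gross and net of an assumed 10-basis-point charge on every position flip; both the signal and the cost level are illustrative, and the signal here is deliberately pure noise with high turnover.

```python
# Gross vs net backtest of a high-turnover noise signal, with an
# assumed per-flip transaction cost of 10 bps.
import random

random.seed(3)

n = 500
signal = [random.choice([1, -1]) for _ in range(n)]    # noisy daily signal
rets = [random.gauss(0.0002, 0.01) for _ in range(n)]  # simulated daily returns

cost_per_trade = 0.001  # assumed 10 bps paid each time the position flips

gross, net, prev_pos = 0.0, 0.0, 0
for pos, r in zip(signal, rets):
    gross += pos * r
    net += pos * r
    if pos != prev_pos:
        net -= cost_per_trade  # pay the spread/commission on every flip
    prev_pos = pos

# A high-turnover strategy bleeds its theoretical edge into costs.
print(round(gross, 4), round(net, 4))
```

With a flip roughly every other day, the cost drag here is of the order of 25% of notional per run, which dwarfs any plausible edge; that is the gap between theoretical and real money-making.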
No matter how well a model captures a phenomenon, the modeller should be prepared for what is called "tail risk" – the rare but catastrophic event that accounts for the fat tails in the returns. The response takes the form of deleveraging, reducing trade size, and stopping out of certain trades or models. Risk management is usually an integral part of the modelling process, and the triggers that lead to reduced trading are found by experimentation. Along with a wary eye on tail risk comes attention to other forms of risk: over-concentration in certain assets or sectors, market impact from large trades, being too long or short a certain market, and other model-specific risks. All of this is an integral part of any good quantitative model.
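One common deleveraging trigger, volatility targeting, can be sketched as follows; the lookback window, the target volatility and the simulated volatility burst are all illustrative assumptions rather than any particular fund's settings.

```python
# Volatility-targeting sketch: shrink position size when realised
# volatility spikes above a target. Window and target are assumptions.
import math
import random

random.seed(5)

rets = [random.gauss(0, 0.01) for _ in range(200)]
rets += [random.gauss(0, 0.04) for _ in range(50)]   # a simulated volatility burst

window, target_vol = 20, 0.015
sizes = []
for t in range(window, len(rets)):
    recent = rets[t - window:t]
    realised = math.sqrt(sum(r * r for r in recent) / window)
    # Scale the position down as realised vol rises, capped at full size.
    sizes.append(min(1.0, target_vol / realised))

calm = sum(sizes[:100]) / 100      # average sizing in the calm regime
stressed = sum(sizes[-30:]) / 30   # average sizing during the burst
print(round(calm, 2), round(stressed, 2))
```

The trigger fires automatically: when the burst arrives, the position shrinks to a fraction of its calm-market size, which is the deleveraging response the paragraph describes.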
But one good model does not a hedge fund make. A successful fund needs a portfolio of models covering all asset classes, with holding periods ranging from the high-frequency to the much longer term, and differing types of strategies: momentum, mean-reversion, basket trades, carry trades and more. If the models are added carefully, they can enhance the Sharpe ratio, with one model compensating for the under-performance of another.
Only the paranoid survive
If there is one thing markets teach you (as the debacle at Long-Term Capital Management showed), it is humility. At IKOS we don't take it for granted that past performance will carry into the future, nor do we assume we have answers to everything. There is a constant re-evaluation of all the models and a rigorous questioning of their assumptions, even if it ultimately renews our faith in the models.
Once a model has started trading, it must periodically be questioned. Are the drawdowns the model is experiencing commensurate with historical drawdowns? Is any decay in recent performance routine or abnormal? Is there unusual slippage between the theoretical cost of trading and reality?
There is also a constant search for new ideas and new models that take into account changes in the marketplace. Although algorithmic trading has a long history in the equity and futures markets, it is a relatively new phenomenon in the spot F/X market, thanks to banks and electronic communication networks opening it up to electronic trading. IKOS is positioned to take advantage of these and other innovations in the marketplace.
It is also necessary to pay attention to the ever-growing regulatory landscape impacting the financial industry, especially in today's post-Lehman world. How would a ban on short-selling affect strategies? What if a Tobin tax were imposed on F/X transactions? What if certain types of high-frequency trading were prohibited?
Every crisis is also an opportunity to re-invent and adapt to a changing world. Above all, this requires a clear sense of the environment around us, an awareness of what the models are and are not capable of, and a willingness to accept reality, act accordingly and do what the moment requires. Having been through the worst of the market meltdown in 2008 gives us confidence that we have what it takes not just to survive but to flourish. We hope we'll keep that spirit of humility, confidence and nimbleness in the future, even if we don't have all the answers!
IKOS Asset Management has a team of 70 professionals and manages over $1.5 billion. It creates state of the art trading technology to capture Alpha from trading in global financial markets, including equities, currencies, commodities, interest rates, indices and bonds. IKOS advises managed accounts and its strategies are available on leading third-party hedge fund platforms.