Robustness in Systematic Investing

Resisting complexity can aid long-term returns

Originally published in the May 2015 issue

The word ‘robustness’ in systematic investing is used to describe the resilience of a rules-based strategy to market shocks, yet it is subjective and there is no defining measure for it. With as many variables influencing asset prices as there are ways of arranging investment ideas into a strategy, it is in the end a judgement call.

Intuitively we know that somewhere between simplicity and complexity lies the hallmark of robustness, where something is “made as simple as possible, but not simpler”, as Einstein put it. A systematic model needs, on the one hand, to have its primary inputs explain a market phenomenon. But to explain that phenomenon successfully over time, it often needs the help of additional variables. As the variables multiply, however, the process shifts rapidly towards complexity, and with it rises the risk of losing robustness.

It is easier for a modeller to add “fixes” than to take them away. A short-lived market problem that may pass within days can appear permanent in nature, leading the modeller to engineer a solution only to discover later that the problem was temporary. Such continuous doctoring can leave a systematic strategy increasingly cluttered, with many moving parts. And the more variables we introduce, the more likely it is that we have merely explained the past, a practice known as “fitting” or “data mining”.
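The mechanism can be shown in a few lines. In this minimal sketch (our own illustration, not a real strategy), one genuine driver explains returns and the added “fixes” are pure noise; adding them always improves the in-sample fit, but degrades the fit on unseen data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 30                                   # observations in the "past"

signal = rng.normal(size=n)              # one genuine driver of returns
y_train = 0.8 * signal + rng.normal(scale=0.5, size=n)

# Candidate explanatory variables: the real one, then pure-noise "fixes".
X_train = np.column_stack([signal] + [rng.normal(size=n) for _ in range(24)])

# An unseen "future": the real driver still works, the fixes do not.
signal_new = rng.normal(size=n)
y_test = 0.8 * signal_new + rng.normal(scale=0.5, size=n)
X_test = np.column_stack([signal_new] + [rng.normal(size=n) for _ in range(24)])

def mse(k):
    """Least-squares fit on the first k variables; MSE in and out of sample."""
    beta, *_ = np.linalg.lstsq(X_train[:, :k], y_train, rcond=None)
    in_err = np.mean((X_train[:, :k] @ beta - y_train) ** 2)
    out_err = np.mean((X_test[:, :k] @ beta - y_test) ** 2)
    return in_err, out_err

for k in (1, 5, 25):
    in_err, out_err = mse(k)
    print(f"{k:2d} variables: in-sample MSE {in_err:.3f}, out-of-sample {out_err:.3f}")
```

Because each larger model nests the smaller one, the in-sample error can only fall as variables are added; the out-of-sample error tells the opposite story.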

Simplicity, parsimony and complexity
When we talk of simplicity or parsimony we think of the principle of Occam’s razor, that “entities should not be multiplied unnecessarily”, or Mies van der Rohe’s “less is more”. While acknowledging that parsimony, being frugal, is not tantamount to simplicity, we do believe that robustness in a systematic strategy comes with a degree of parsimony. The exercise is muscle-building in approach, working with a few dominant variables affecting asset prices.

The variables need to explain the rise or fall in risk premium, the excess return over a risk-free asset. They equally need to shed light on the corollary: the measured increase or decrease in risk aversion through a desire for a liquid or safe-haven asset. Starting on a blank canvas and introducing parts that are truly value-adding is a better way, in our mind, towards robustness.

Issues with complexity
An absolute return systematic strategy is expected to deliver performance in all environments, so different parts of the strategy need to perform in the conditions suited to them. For example, a trend-following component should capture market trends, while a mean-reverting or countertrend tool would perform broadly in range-bound, oscillating markets, and so on.
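The two component types can be sketched in their textbook form. These are deliberately generic rules for illustration only (a moving-average trend filter and a z-score fade), not the components of any particular strategy; the window and threshold parameters are arbitrary:

```python
import numpy as np

def trend_signal(prices, window=50):
    """Long (+1) when price sits above its trailing moving average,
    short (-1) below it: a simple illustrative trend-following rule."""
    prices = np.asarray(prices, dtype=float)
    ma = np.convolve(prices, np.ones(window) / window, mode="valid")
    return np.sign(prices[window - 1:] - ma)

def countertrend_signal(prices, window=20, threshold=1.0):
    """Fade moves stretched beyond `threshold` standard deviations from
    the trailing mean: short when over-extended up, long when down,
    flat otherwise; suited to range-bound, oscillating markets."""
    prices = np.asarray(prices, dtype=float)
    out = np.zeros(prices.size - window + 1)
    for i in range(out.size):
        win = prices[i:i + window]
        z = (prices[i + window - 1] - win.mean()) / (win.std() + 1e-12)
        out[i] = -1.0 if z > threshold else (1.0 if z < -threshold else 0.0)
    return out
```

On a steadily rising series the first rule stays long throughout while the second keeps fading the move, which is precisely why each needs its own suited environment.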

In modelling, data mining or optimisation using iterative methods is often used to identify which component to use in a given environment. Such methods use past data and depend on empirical tests driven by a process of enquiry. The fact is that the future will always differ from the past. We know that with as many variables as there are data points, we can explain any data. It is easy for researchers to build complex multi-dimensional models with numerous, seemingly non-correlated factors to explain a market condition. In quantitative back tests the factors may work in synergy to perform well across varied scenarios. However, since the models are tested for ideas and parameters based on the past with no knowledge of the future, there is a distinct risk of ‘back fitting’. To reduce this risk, researchers will typically conduct blind tests and sensitivity analyses, changing parameters, running Monte Carlo simulations and so on.
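One such sensitivity check can be sketched as follows. This is a minimal illustration under our own assumptions (a toy moving-average rule and an annualisation factor of 252 trading days), not a description of any firm's test suite: the optimised parameter's Sharpe ratio is compared against nearby settings, since a robust choice should not collapse when the parameter is perturbed:

```python
import numpy as np

def sharpe(returns):
    """Annualised Sharpe ratio of a daily return series (risk-free ~ 0)."""
    returns = np.asarray(returns, dtype=float)
    return np.sqrt(252) * returns.mean() / (returns.std() + 1e-12)

def strategy_returns(prices, window):
    """Daily returns of a toy moving-average rule (illustrative only):
    trade today on yesterday's signal to avoid look-ahead bias."""
    prices = np.asarray(prices, dtype=float)
    ma = np.convolve(prices, np.ones(window) / window, mode="valid")
    pos = np.sign(prices[window - 1:-1] - ma[:-1])
    return pos * np.diff(prices[window - 1:]) / prices[window - 1:-1]

def sensitivity(prices, best_window, neighbours=(0.8, 0.9, 1.1, 1.2)):
    """Compare the optimised window's Sharpe with nearby settings.
    If the optimum is a sharp, isolated peak, suspect back fitting."""
    base = sharpe(strategy_returns(prices, best_window))
    perturbed = [sharpe(strategy_returns(prices, max(2, int(best_window * f))))
                 for f in neighbours]
    return base, perturbed
```

A parameter whose neighbours perform nearly as well is more plausibly capturing a real effect than one whose performance evaporates a step away from the optimum.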

An art of reduction, not complication
While this article does not suggest there is a unique way to design a robust process, we do believe that fewer but more powerful and clear decision-making variables are able to handle both risk and opportunity more creatively. There is less need for frequent updates, avoiding early model decay as well as the danger of over-optimisation. Spurious correlations that seem to hold in past data can quickly break down as the data evolves or conditions change. Compounding the number of variables in a system increases the risk of error.

While it may sound counterintuitive and run against our cognitive biases, we consider it more robust for a model either to leave a wide “margin of error” in any forecast, or to use fewer, streamlined variables. The process allows a variable to underperform in the short term, in the knowledge that its true value over time outweighs the immediate weakness. It is likely to handle unanticipated market events more capably through clearly defined actions, and to avoid the need to frequently enhance a strategy to stay ahead of the data.

Market evolution, scaled over time, resembles a physical phenomenon: the process is subject to random shocks that in due course shift the data into unknown territory. New events or developments push participants to invent new sources of risk premia, replacing old ones. Our research therefore places greater importance on identifying decision-making variables or economic systems that are better at dealing with macro shifts in market behaviour. These perform more robustly under unexpected conditions.

It makes sense to stress-test a strategy for its capacity to perform when market microstructures shift, or when there are permanent changes in the economic system not seen before. Examples might be a permanent shift in correlations between markets, or a market’s risk pricing being altered forever. What if nominal interest rates move down to low levels for considerably longer periods than previously experienced, or oil permanently settles at reduced prices owing to changes in market microstructure?
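One simple way to manufacture such a “never seen before” regime is to transform the historical price path itself. The sketch below is a hypothetical illustration with arbitrary stress parameters: it rescales daily log returns and adds a persistent drift, producing a path the back test never contained, on which the strategy can then be re-run:

```python
import numpy as np

def stress_path(prices, vol_mult=2.0, drift_shift=-0.0005):
    """Build a hypothetical stressed price history: daily log returns
    are scaled by `vol_mult` (a permanent volatility regime change)
    and shifted by `drift_shift` (a persistent adverse drift)."""
    prices = np.asarray(prices, dtype=float)
    rets = np.diff(np.log(prices))
    stressed = rets * vol_mult + drift_shift
    return prices[0] * np.exp(np.concatenate([[0.0], np.cumsum(stressed)]))
```

Running the strategy on a family of such paths, with correlations, volatilities and drifts pushed beyond historical ranges, probes exactly the question posed above: does performance depend on conditions resembling the past?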

A robust model should not take for granted that normalcy will return, but rather be prepared for a new normalcy with altered patterns not evident in the past. Tectonic shifts that can permanently change the behaviour of a particular market, or the underlying economic systems, are kept open as a possibility. The reduced-variable approach, in this context, should more robustly handle such major, long-lasting changes in conditions, without depending on accurate forecasts from past data.

The power of blending art and science
In conclusion, we believe a great invisible, but powerful, force is brought into a strategy if we judiciously combine art and science. The power of creativity, thinking laterally, is as important as the rigour of analytics. We should use this blend in identifying the major variables that explain asset prices and their relationships with each other. Above all, when it comes to complexity in a model, we suggest that active reduction, rather than addition, is more likely to take a strategy forward in its return potential, longevity and robustness.

Aref Karim is the founder, CEO and CIO of QCM. Karim is a qualified chartered accountant. He is responsible for the firm’s investment strategies and research. Karim left the Abu Dhabi Investment Authority (ADIA), where he worked for 13 years, to found QCM in 1995. At ADIA, he was involved in the foundation of the sovereign wealth fund’s alternatives portfolio.

Ershad Haq is a board member and director of macro research at QCM. He joined the firm in June 2000 and is a chartered financial analyst. Haq works closely with the CIO to assist in all aspects of component development and investment strategies for the firm. He holds an MSc in Mathematical Trading and Finance, an MA in Economics and a BSc in Actuarial Mathematics and Statistics.