Worryingly, there is an assumption in places that such a financial crisis – a “millennium” event – is just not going to happen again within current managers’ careers.
In fact, it is not at all clear that the current crisis is behind us – we have an equity rally, but all the fundamental macro indicators are red and most economists take a dark view of the coming two years as deleveraging and deflation take their toll. More importantly, such an over-confident attitude forgets that twice in less than 10 years, wealth has been destroyed by the same cycle: inflation of money and credit leading to inflation of asset prices, then to inflation of risks and finally to the bursting of the bubble and mass contagion. Assuming we have returned to business as usual for the coming five years is at best irrational optimism.
These global liquidity crises can suddenly destroy the excess return carefully accumulated during business-as-usual periods. The disappearance of the risk premium over the past 20 years is dramatic: over that period the S&P500 has provided the same return as seven-year Treasury bonds, while being twice as risky (see Fig.1).
Because of these global crises, investors can no longer simply look to generate long term excess returns by relying on the so-called risk premium hypothesis – i.e. the idea that the riskier an asset class, the higher the return will be. Protecting against a global liquidity crisis has thus become the only way to secure excess return over the long term. But doing so requires the ability to anticipate the potential impact of such a crisis, and then to have a systematic investment policy in place which delivers actual protection, while also ensuring your investors back you in doing this.
In theory, that should be beneficial for hedge funds, since their value proposition is precisely to offer returns which cannot be attributed to a static asset allocation scheme. However, the apparent unpredictability of hedge fund behaviour in a crisis environment could play against the industry.
Measuring hidden betas
In the case of a global liquidity crisis like 2008, when very liquid markets dried up and the most solid counterparties almost went to the wall, it is very easy to say that it’s not the fittest who survive, just the luckiest.
Certainly many in the markets felt that they were hit by an unpredictable liquidity storm. However, the studies we’ve run at Riskdata, both prior to and after the crisis, demonstrate that in fact the impact of the crisis was quantifiable using market information available as early as the autumn of 2007.
Analysing 3,100 hedge funds prior to the crisis and applying a proper risk model to them (a non-linear factor model, like the one developed by Riskdata), it was possible – with high confidence – to spot which funds would go through the crisis unaffected and which would be devastated. So what is the truth? As usual, such contradictions result from an implied assumption that is false: the assumption that liquidity and market risk should be analysed separately. In reality, hidden market risk, driven by hidden betas, and liquidity risk are equivalent.
A big misconception here is that liquidity risk is about the current liquidity of the asset. In fact, current liquidity is not a risk, but simply an observable fact – the risk is how the liquidity of the asset can change in the future. For instance, a private equity fund with a five-year lock-up clause has far lower liquidity risk than a corporate bond, which can shift overnight from being very liquid to illiquid.
The “equivalence principle” between liquidity and hidden market risk is in reality a translation of the relationship between the supply/demand equilibrium and market price:
• When the liquidity of an asset dries up, it causes a massive supply/demand imbalance, itself always reflected in an exceptional drop in the market price. This exceptional drop in market price is precisely the materialisation of hidden market risk.
• Symmetrically, when an asset price starts dropping in an unexpected way, it gives asset owners a nasty surprise which invariably causes panic, which then itself leads to a sudden change of liquidity, with only sellers and no buyers in the market. Shifting to a global macroeconomic view, it is always an accumulation of hidden market risks (e.g. sub-prime) which eventually leads to major liquidity and credit crises.
So the key question is not “is it possible to quantify the impact of liquidity crisis?” but rather “is it possible to quantify hidden risk?”
Riskdata’s risk platform has been designed precisely to model these hidden risks, through the risk profile concept (which is why it is so powerful at anticipating asset behaviour, including during a massive liquidity and/or credit crisis). When Dr Raphael Douady and the Riskdata research team explored this question further, they were able to demonstrate how this hidden risk comes from hidden betas:
• Exposure to risk factors which have been very quiet for some years – e.g. sub-prime, which flourished during a long period of low credit spreads and volatility;
• Negative convexity versus a risk factor: assets that in normal times are quite de-correlated from that factor but become very correlated with it when markets collapse.
However, just measuring these hidden risks will not bring protection against the next big one. To get protection there must also be a fundamental shift in investors’ asset allocation policies. In short, it is necessary to shift from a rigid portfolio structure to a risk budget.
Investment policies driven by asset class diversification, as well as alternative approaches such as liability-driven investment, rely on a rigid portfolio structure – and the growth into alternatives over recent years still relies on the same diversification principles. This implicitly assumes that risk parameters are the same whatever the market regime, that there is always a safe harbour (i.e. a risk-free asset), and that asset class returns are over the long run proportional to risk.
All these assumptions have been invalidated over the past 10 years:
• Liquidity across all asset classes experienced very large oscillations; the safest counterparties can become bankrupt, and correlations between asset classes can shift from zero to one in a couple of months.
• Whatever the asset class, there is a potential bomb which can damage it: inflation, currency depreciation, government or bank default – there is no longer a foreseeable safe harbour.
• And last but not least, high risk is no longer rewarded by higher return!
Due to these market dynamics, the relevant diversifiers, hedges and opportunities of today are certainly not the ones of yesterday. What should remain fixed is not the portfolio structure, but rather its structure of risks, i.e. its exposure to the various risk factors in extreme market situations. In terms of extreme risk management, it is much more effective to ask “how much can I lose if credit spreads explode?” than “how much am I allowed to allocate to corporate bonds?”
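That question can be operationalised in a few lines. The sketch below uses made-up factor sensitivities and a made-up stress scenario (none of these numbers come from the article) to show the mechanics: express the constraint as a scenario loss, and scale exposures down until the loss fits the budget.

```python
# Extreme scenario: credit spreads widen 200bp, equities fall 30%.
scenario = {"credit_spread_bp": 200.0, "equity_pct": -30.0}

# Portfolio sensitivities (illustrative): portfolio return per bp of
# spread widening and per % of equity move.
sensitivity = {"credit_spread_bp": -0.0002, "equity_pct": 0.004}

stress_loss = sum(scenario[f] * sensitivity[f] for f in scenario)  # -16%
budget = -0.10  # lose at most 10% in this scenario

if stress_loss < budget:
    # Scale the whole book down until the scenario loss fits the budget.
    scale = budget / stress_loss
    sensitivity = {f: s * scale for f, s in sensitivity.items()}

final_loss = sum(scenario[f] * sensitivity[f] for f in scenario)
print(f"scenario loss after scaling: {final_loss:.2%}")
```

The constraint here is a loss under a named stress, not a percentage allocation to an asset class: the same budget automatically disciplines any position, in any asset class, that carries credit-spread or equity exposure.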
Shifting from an asset allocation paradigm to factor risk allocation does not necessarily require a smart tactical in-house view on what will work and what will hurt over the next 12 months. The reason is that such a budget is mechanically contrarian, i.e. it forces a reduction in allocation to the latest bubble. At any given time, there are always ‘trendy’ factors to which all asset classes are more or less exposed, but which will eventually end in a blow-up and drive a new market dislocation. A factor risk budget is a good way for an investment committee to discipline itself, by helping it resist the temptation to bet too much on the current source of easy returns, which will at some point turn into the next source of market dislocation.
And it works – as long as you are using a risk model capable of detecting hidden sources of risk (those that germinate bubbles). For example, Riskdata ran a 3% risk budget on a simple portfolio made up of just three assets, the S&P500, Treasuries and cash. Not only did the portfolio never lose more than 3% in any one month, but it also delivered a persistent risk premium (see Fig.2).
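A stripped-down version of that exercise can be sketched as follows, under the deliberately crude assumptions that each asset’s worst one-month loss is known and that all stresses can hit in the same month. The stress figures are illustrative, not Riskdata estimates.

```python
# Assumed worst-case one-month losses per unit of exposure (illustrative).
stress_loss = {"sp500": 0.20, "treasuries": 0.05, "cash": 0.0}
budget = 0.03  # never lose more than 3% in any one month

# Spend the whole budget on equities and park the remainder in cash.
w = {"sp500": budget / stress_loss["sp500"], "treasuries": 0.0}
w["cash"] = 1.0 - sum(w.values())

worst_month = sum(w[a] * stress_loss[a] for a in w)
print(w, worst_month)  # 15% equities, 85% cash, worst month exactly -3%
```

In practice the budget would be spread across assets and the stress losses re-estimated through time; the point of the sketch is only that the binding constraint is a loss, not an allocation.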
Extreme risk budgeting
In May 2008, one of our clients was threatened with redemptions by some of its investors because the portfolio manager had reduced his exposure to the equity market: the investors were afraid they would miss an equity rally.
The manager trusted his own view; however, after the crisis he suffered more or less the same level of redemptions as his peers, despite the fact that he had taken the right decision. What this tells us is that in a world where investors don’t value a strict ex-ante risk management discipline, you’ll never be rewarded for your efforts, even if ex-post you happen to be right.
It also tells us that the sine qua non condition for hedge funds to be rewarded for extreme risk budgeting is to implement it as an agreed investment rule with external and internal clients. Once a common framework is agreed, the issue becomes the potential impact of this policy on business-as-usual returns. In other words, how much does a hedge against a crisis cost? If the cost is too high then, even though hedging is the only way to secure long-term returns, in the short term it can kill a portfolio manager through lack of performance versus their peers.
However, the beautiful conclusion of the out-of-sample study we have run on several random portfolios is that this cost may in fact be negative (value-adding)! An extreme risk budget policy can be financed by removing business-as-usual risk constraints. In our study Keeping the devil in its box (see Appendix) we show that extreme risk budgeted portfolios can and will massively outperform Markowitz portfolios (i.e. portfolios designed with a Markowitz optimiser, maximising return/volatility) in crisis periods.
Though this was expected, what was somewhat surprising was the proof that they will slightly outperform in rallying times, too. This outperformance results from the fact that the extreme risk budget was unconstrained in terms of volatility, while the Markowitz one was constrained: in rally periods, higher volatility mechanically means higher returns.
Refocusing on extreme risk transparency represents a very strong business opportunity for the alternative investment industry as a whole. In fact, it can become a core source of excess returns for the end investor:
• the approach described above is precisely the one implemented by most macro, CTA, volatility arbitrage and short bias funds;
• hedge funds which have the symmetrical pattern of low volatility and high extreme risk – such as arbitrage, activist and event driven funds – can secure their role as a business-as-usual source of alpha, to the extent that they are transparent on extreme risk patterns, giving investors the possibility to hedge them.
By combining these two patterns, an investor can create a storm proof portfolio, securing excess returns in the long run – which is much more difficult to achieve with traditional liquid asset classes.
What follows is a synopsis of the full paper, Keeping the devil in its box: using quantitative techniques to manage risk.
The market meltdown of 2008 raised two fundamental issues:
• The abnormal behaviour of most asset classes
• The possibility for investors to limit their losses in extraordinary market conditions when all possibilities of diversification seemed to have vanished.
Hedge fund portfolios are emblematic of these issues: they are complex and cross-asset class, and their behaviour is usually considered difficult to anticipate.
We addressed the first point in our study, Navigating the Perfect Storm, by demonstrating that the impact of the market meltdown on hedge funds was foreseeable and that achieving risk transparency was therefore possible. We stated that, once risk transparency is achieved, it is possible to implement an extreme risk budgeting policy at a reasonable cost.
To address this second issue, we conducted an in-depth analysis of randomly selected data from the HFR database.
To assess the possibility of keeping extreme risk under control, we ran a strict, out-of-sample back test. We randomly selected three short lists of funds among those reporting to HFR since at least 2002. We then simulated the construction of a portfolio as if we were an investor willing to limit the worst month to a loss of 3% with a 99% confidence. In other words, we set a 3% loss budget for the 99% value-at-risk of the portfolio. The portfolio reallocation was made once a year, with a three month lag before the actual implementation in order to account for potential liquidity constraints.
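The pass/fail criterion for such a back test is simple to state: with a 3% loss budget at 99% confidence, losses worse than 3% should occur in roughly 1% of months. A minimal sketch, with an invented return series rather than the HFR data:

```python
def breach_rate(monthly_returns, budget=0.03):
    """Fraction of months in which the loss exceeded the budget."""
    breaches = sum(1 for r in monthly_returns if r < -budget)
    return breaches / len(monthly_returns)

# Invented track record: 100 months, exactly one breach of the 3% budget.
returns = [0.005] * 99 + [-0.06]
print(breach_rate(returns))  # 0.01 -> consistent with 99% confidence
```

A strategy breaching materially more often than 1% of the time, as the Markowitz portfolios do in the results below, has failed its risk budget even if its average return looks respectable.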
We tested four different techniques of portfolio construction:
1. Capital allocation: we simply allocated the same amount of money to each manager. Then, we adjusted the leverage in order to get a monthly volatility below 1.3%.
2. Risk allocation: we allocated the 3% risk budget equally to all funds, assuming conservatively that the funds were 100% correlated.
3. Markowitz: We simply used the Markowitz optimiser, with a targeted monthly volatility of 1.3% and expected excess returns equal to the past three years’ average returns.
4. Riskdata’s FOFiX: In this process, we combined a classical allocation process with a tail risk budget of 3% set on the extreme risk as measured by our nonlinear factor models.
To make the exercise realistic, allocation limits were assigned: no negative positions or allocations exceeding 5% of assets under management.
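Technique 2 can be sketched in a few lines: each fund receives an equal slice of the 3% budget, and its weight is sized so that its standalone 99% VaR consumes exactly that slice. Under the conservative 100% correlation assumption, standalone VaRs simply add. The fund VaRs below are invented for illustration.

```python
# Invented standalone monthly 99% VaRs for a short list of funds.
fund_var99 = {"fund_a": 0.06, "fund_b": 0.10, "fund_c": 0.04}
budget = 0.03

slice_ = budget / len(fund_var99)            # equal risk slice per fund
weights = {f: slice_ / v for f, v in fund_var99.items()}
cash = 1.0 - sum(weights.values())           # residual sits in cash

# With 100% correlation, portfolio VaR is the sum of standalone VaRs.
portfolio_var = sum(weights[f] * fund_var99[f] for f in weights)
print(weights, cash, portfolio_var)
```

Note how the riskiest fund automatically gets the smallest weight, and how much capital is left idle: relaxing the 100% correlation assumption would allow larger weights for the same budget, which is exactly why this technique proves over-conservative in the results.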
To make it comparable with FoF performance indices, we applied a 1.5% management fee and a 15% performance fee to compute monthly NAV for each strategy. The fact that we used automated portfolio allocation techniques does not mean that we only believe in quantitative-driven investment processes. This is simply the only way to run a rigorous, out-of-sample back test of the efficiency of risk management.
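The fee accounting can be sketched as follows. The paper does not detail the crystallisation rule, so the monthly accrual of the management fee and the high-water-mark convention for the performance fee are assumptions, chosen as one plausible FoF convention.

```python
def net_navs(gross_returns, mgmt=0.015, perf=0.15):
    """Monthly net NAVs from gross monthly returns: 1.5% p.a. management
    fee accrued monthly, 15% performance fee on gains above a high-water
    mark (both conventions are assumptions, not taken from the paper)."""
    nav, hwm, out = 1.0, 1.0, []
    for r in gross_returns:
        nav *= (1.0 + r) * (1.0 - mgmt / 12.0)
        if nav > hwm:
            nav -= perf * (nav - hwm)   # fee charged on new gains only
            hwm = nav
        out.append(nav)
    return out

navs = net_navs([0.02, -0.01, 0.03])
print(navs)
```

The high-water mark matters for the comparison: a strategy that breaches its risk budget gives back gains, then pays no performance fee while recovering, so gross and net rankings can differ.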
If we now compare the out-of-sample track records of the four techniques over time, we get the results shown in Fig.3. If we then look at the month-by-month performances, our conclusions are:
• Only the Markowitz technique failed to stay within the risk budget: it exceeded it 6.5% of the time, whereas it should have exceeded it only 1% of the time. This failure stems both from the use of ex-post risk measures and from the optimisation approach itself, which mechanically tends to seek out the flaws of the risk model in place. This is a very poor performance.
• Riskdata FOFiX’s approach is by far the most economical in terms of cost of hedging. It strongly outperformed all the others, resulting in an outstanding 4.4% excess return over the risk-free rate before fees. Moreover, prior to the crisis, it delivered slightly higher performance than the best-performing strategy, meaning that the cost of hedging is in fact negative!
• The other techniques worked, but at a cost which made them unattractive versus an investment in cash. The risk allocation technique, in particular, proved excessively conservative, with a 1.6% worst month over the period. This results from the 100% correlation assumption between the funds. If one relaxed this assumption, the risk allocation would perform much better than the capital allocation, with a tangible excess return over the risk-free rate both before and after fees.
If we analyse further, we see that the main reason why FOFiX succeeded while Markowitz failed is FOFiX’s capability to spot hidden risks. The FOFiX technique combines all the quantitative indicators which properly spot these extreme betas, together with the bias ratio. The Markowitz technique (and its more sophisticated “fat tails”-based optimisers) naively assumes that a fund’s risk derives only from its return stream.
The other reason why FOFiX strongly outperforms is that it gets rid of business-as-usual risk constraints. Because tail risk is kept under control, it can afford higher leverage than the other strategies. As a result, it outperforms Markowitz even during the business-as-usual period, despite the fact that its Sharpe ratio is lower.
This study provides explicit confirmation that an efficient risk management tool can bring considerable value to portfolio construction. To demonstrate this, we used a purely quantitative approach to avoid any bias stemming from ex-post knowledge of funds’ performance.
In practice, this strong value can be extracted through combining a semi-qualitative process of selection and portfolio construction with a quantitative assessment of the risk. Risk control brings considerable value, as long as it is fully integrated into the investment process, offering a precious feedback loop in order to:
1. Raise alerts on managers exhibiting abnormal patterns (hidden risks).
2. Quantify the contribution of each manager to portfolio diversification, both for normal risks (typically, marginal VaR) and extreme risks (extreme betas), helping to build portfolios which are actually consistent with qualitative views on managers.
3. Check if the portfolio risk profile (i.e. exposure to market events, including extreme events) is consistent with the tactical view and/or the client mandate, and find solutions to correct this profile if it is not in line with the expectation.
ABOUT THE AUTHOR
Olivier Le Marois is the Chairman of Riskdata. Prior to Riskdata, Olivier was in charge of developing Dalkia, a world-class leader in energy services, utilities and power production. Olivier is a graduate of the French École Polytechnique and the École Nationale d’Administration.