Millburn has advanced in three core directions: statistical learning techniques have been added to trend-following, making its systems more adaptive and responsive to the moods of the markets; multi-factor models give Millburn more radars picking up more useful signals; and a wider investment universe, including non-directional trades, further expands the potential opportunity set. Execution efficiency has also been enhanced. The result has been an increasingly differentiated return profile.
The art and science of statistical learning
While Millburn continues to make significant use of trend-following, and feels it has advanced and refined that approach over decades, the utilisation of statistical learning in concert with trend-following reflects a global shift. In fact, statistical learning has superseded other approaches in many industries. Says Barry Goodman, Millburn’s co-CEO and executive director of trading, “Flexible statistical learning methods have already displaced traditional forecasting in many fields. Approaches have improved remarkably in the last 10 years, especially with the push it has been given as firms like Google, Amazon and Netflix came to understand the opportunities it provides. But it is not perfect, and still requires real understanding of how to apply it, and, we think, a mindset of what I call heightened humility.”
Statistical learning techniques from other industries cannot simply be copied and pasted into finance, though, because financial markets present unique challenges. As Grant Smith, Millburn’s head of research and Goodman’s co-CEO counterpart, points out: “The financial markets are extremely noisy, and present a very difficult problem in return forecasting when compared with many other applications. This is our challenge.”
Once signals have been disentangled from noise, the next stage in the process is determining how to combine different types of signals. Millburn has made breakthroughs in this area. For a number of years Millburn followed the standard quantitative investment paradigm and simply applied the scientific method to each data set discretely or individually, developing standalone models producing unique signals that were combined ex post. But Goodman explains why this approach was found to be suboptimal, and part of the answer lies in behavioural bias. “While the addition of non-price data was valuable, the ‘averaging effect’ that occurred when individual signals were combined meant a certain amount of information loss. Also, the combination of signals—specifically the weights that were given to each signal—was subject to the influences of hindsight and other behavioural biases that human researchers have. In short, we knew that we were not being as efficient as possible in terms of extracting information from this data.”
Yet rebalancing the model weights was becoming ever more important as so-called ‘alpha-decay’ was greater in some of these standalone models when compared to price-based momentum models, for example. This somewhat reduced ‘shelf-life’ meant that Millburn had to devote a lot of its time to monitoring existing models, including the structure of the markets and the validity of the models’ original assumptions, to see if anything had changed, as opposed to researching new data, factors or effects from which to construct new models.
Whether and how alphas should be timed is a controversial topic. Some managers claim they have not found a profitable way to time alphas and consequently just equal-weight models. Millburn, by contrast, has built up some confidence in its ability to detect relatively subtle, but important, near-term regime shifts, and has engineered a process to adjust factor weights accordingly – using systematic methods, free from human discretion. Millburn’s approach applies many different statistical learning techniques to a growing suite of factor inputs.
But these discoveries came only after a long – and ongoing – journey. Millburn went through a phase of intense research and investment into a statistical inference framework. Explains Smith, “For us, this framework is just another technology, but it enables us to attach our domain knowledge to the learning process, as opposed to individual models. This allows the models to look at data in context, understanding market structure by the combination of data inputs rather than by each data input individually. But the process also allows the models to adapt over time to changing market dynamics as new data is incorporated into the learning set, as indicated in Fig.1. This means that alpha decay, or alpha ‘timing,’ is handled by the process rather than as a result of the views of a particular researcher.”
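The adaptive reweighting Smith describes can be illustrated with a toy multiplicative-weights update, a standard statistical learning technique. This is a minimal sketch, not Millburn’s actual method; the factor names, scores and learning rate are all hypothetical.

```python
import numpy as np

def update_factor_weights(weights, recent_scores, learning_rate=0.5):
    # Multiplicative-weights update: factors whose recent forecasts
    # performed better receive exponentially more weight.
    boosted = np.asarray(weights) * np.exp(learning_rate * np.asarray(recent_scores))
    return boosted / boosted.sum()  # renormalise so weights sum to 1

# Hypothetical factors: price momentum, supply/demand, weather
w = np.array([0.5, 0.3, 0.2])
scores = np.array([-1.0, 0.8, 0.1])  # recent out-of-sample performance
w = update_factor_weights(w, scores)
# momentum is de-emphasised; the better-performing factor gains weight
```

As new data enter the learning set, repeated updates of this kind let the weighting drift with market conditions rather than being fixed by a researcher’s hindsight.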
For example, momentum signals, which Millburn has traded since the 1970s, can be de-emphasised under particular market climates, and price momentum is not always the predominant signal. Now, Goodman expounds, “Statistical learning-based strategies are where the bulk of our risk is deployed in the programs we trade, although most programs retain an anchor in traditional trend-following strategies. But even for the statistical learning framework, momentum (price-based) inputs remain critically important, and we still believe, as we have since our inception, that price captures the majority of information in any market. What’s changed is we can now – our research and actual trading suggest – potentially identify market conditions where price should take a back seat to the influence of certain other data inputs.”
Benefits of multi-factor models
The incorporation of statistical learning is the latest stage in a long process of research and development. Millburn had been seeking to improve forecast quality by adding non-price inputs for some years. In the mid-2000s, the firm started investigating the use of quantitative non-price data – fundamental supply and demand, market structure, and weather, for example.
Factors can be generic or specific. Some apply to all markets and some only to certain markets. Explains Smith: “Models can incorporate a range of different factors, some of which are broadly applicable to all markets, but others that can be quite specific. An example I like to use is the lumber futures market. For lumber, we might incorporate, say, something like US mortgage interest rate data as an input, or feature, of a model, along with price and supply/demand data. However, mortgage interest rates probably have little effect on forecasting prices for, as one example, Japanese equities.”
The assumption that US mortgage rate data is not likely predictive for Japanese equities is, Smith admits, only a preconception based on conventional wisdom or experience. In reality, whether the team should include US mortgage rates as a consideration when trading Japanese equities no longer relies on Smith or any of the team making an assumption one way or another. As Smith says, “The beauty of this technology is that whether a factor is important for a particular market is completely testable. Further, if things were to change—for example if US mortgage rates do start to add predictive value to Japanese equities—the data will indicate this.”
But including every conceivable factor in a model is not without cost. He summarises: “Superfluous factors consume precious computer resources. A key challenge for our researchers now is determining what factors should remain in a model for each market.”
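Smith’s “completely testable” point can be sketched with a crude predictive check: regress next-period returns on the lagged factor and keep the factor only if it clears a significance threshold. This is a hypothetical illustration under simple assumptions (a single lag, no intercept), not Millburn’s actual selection procedure; the data below are synthetic.

```python
import numpy as np

def factor_is_predictive(factor, returns, t_threshold=2.0):
    """Keep a factor only if its lagged, standardised values predict
    next-period returns with a t-statistic above the threshold."""
    x = factor[:-1]                      # factor must lead returns
    y = returns[1:]
    x = (x - x.mean()) / x.std()
    beta = x.dot(y) / x.dot(x)           # OLS slope (no intercept)
    resid = y - beta * x
    t_stat = beta / np.sqrt(resid.var() / x.dot(x))
    return bool(abs(t_stat) > t_threshold)

# Synthetic example: lumber returns built to depend on a lagged input
rng = np.random.default_rng(0)
mortgage_rates = rng.standard_normal(500)
lumber_returns = np.empty(500)
lumber_returns[0] = 0.0
lumber_returns[1:] = 0.5 * mortgage_rates[:-1] + 0.1 * rng.standard_normal(499)
factor_is_predictive(mortgage_rates, lumber_returns)  # True by construction
```

If relationships change, rerunning the same test on fresh data would flag the shift, which is the point Smith makes about US mortgage rates and Japanese equities.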
Because price and price-derivative signals sometimes play second fiddle, models can incorporate dozens of factors, including fundamental factors, which are very much part of the picture rather than an afterthought. But using multiple factors does not distinguish Millburn from many others. It is the statistical learning methodology used to dynamically rebalance weightings that has tended to set Millburn apart.
Explains Goodman: “It is a fully integrated, fully quantitative approach. Fundamental factors are not an overlay to price (nor are any of the other data types used) but rather are foundational elements of the return-forecasting models.”
So is the right label for a hybrid approach now a hybrid word, namely ‘quantamental’, a blending of the quantitative and the fundamental? Responds Goodman: “In many ways the process is closer to a quantitative macro approach in its incorporation of fundamental and other data inputs, but the way we go about this is more adaptable than traditional quant macro or multi-model approaches, as the ability to adapt and weight is built into the process itself.”
The influence of a factor on a signal is not static. Smith explains why. “The models make continuous return forecasts as a spectrum of real-time data are evaluated. Because we tend to measure the real-time data frequently, signals can change frequently for a particular model and the influence of any particular factor can change as well. So while a demand factor may be important when the model is initially run, as new data are incorporated the signal can change, effectively overriding or diminishing the importance of the demand factor. Over longer time frames, however, we can generally see reasonably stable structures in terms of how much influence each factor wields within a particular model. And importantly, we use multiple models in an attempt to increase the robustness of the signal.”
Signal changes are performed gently and gradually, in baby steps rather than giant strides. The approach is fine-grained, making incremental position adjustments rather than wholesale changes – a ‘dimmer switch’ as opposed to an on/off toggle. Millburn’s research suggests that more frequent, but smaller, position adjustments create a smoother return stream compared with, say, traditional trend-following approaches.
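The dimmer-switch idea can be sketched in a few lines: instead of jumping straight to a target position, each evaluation moves at most a fixed fraction of the way there. The step size and function are hypothetical, chosen only to illustrate the incremental style of adjustment.

```python
def adjust_position(current, target, max_step=0.25):
    # 'Dimmer switch': move at most max_step toward the target each
    # evaluation, rather than toggling straight to the full position.
    step = max(-max_step, min(max_step, target - current))
    return current + step

pos = 0.0
path = []
while pos < 1.0:
    pos = adjust_position(pos, 1.0)
    path.append(pos)
# path == [0.25, 0.5, 0.75, 1.0] -- the position ramps up in stages
```

The same rule winds positions down gradually when the signal fades, which is what smooths the return stream relative to an on/off trend system.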
Expanding investment universe
The investment universe has already been expanded somewhat, and varies by investment mandate. On the long/short CTA front, Millburn’s flagship Diversified CTA Program, a 2016 constituent of the Societe Generale Prime Services SG CTA Index, dates back to the 1970s; risk allocations are shown in Fig.4. The Commodity Program, a top global performer over the last few years, trades nearly 50 markets (more if you include the relative value ‘spread’ markets, which trade both intra-commodity (e.g., calendar) and inter-commodity (e.g., WTI vs. Brent crude oil futures) spreads).
Beyond this, Millburn has applied its engine to some unique ‘hybrids’: long-biased and long-only strategies that trade up to 130 markets and include, in some cases, exposure to ETF securities. Millburn has seen very good results in these programs to date and expects the universe of tradable instruments to continue to expand, increasing opportunities for diversification and, with luck, alpha.
Foresees Goodman: “What we are doing now in terms of research is basically market-agnostic. Because it is data-driven rather than requiring us to have a pre-determined theory, there’s no reason why our framework can’t be extended into other markets. We’re only just now starting to scratch the surface.”
Execution and liquidity
In some financial markets, there are concerns about liquidity and Millburn is keenly aware of this. Trading frictions do not alter forecasts but influence which ones are actioned and in what size. Indeed, ongoing research has also led to enhancements on the execution side. Having a forecast of price direction and magnitude has enabled Millburn to get more sophisticated with its execution approaches. While the firm finds near-term return forecasts to have the highest degree of confidence, forecasts can also be made covering the next several hours, days or even out to a week. This return path is incorporated into the execution systems, such that a proposed trade can be instantly evaluated based on trading costs and likely profitability.
What this amounts to is that Millburn has at its disposal a set of continuous return path ‘calendars’ for every market, which is used to determine whether to take a trade. Smith explains how the execution cost decision is kept separate from the original signal. “Essentially, regardless of the signal, if the system determines that the cost of trading is likely to overwhelm forecast profits over a certain timeframe, the trade may be deferred, completely or partially. But because the signal and execution decisions are independent, the information contained in the signal is not lost. In other words, the trading frictions impact how we act on the forecasts, but do not impact the forecasts themselves.”
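Smith’s separation of signal and execution can be sketched as a simple cost gate: the forecast is never altered, only the size acted upon. The function name, the full-versus-partial deferral rule, and the numbers below are hypothetical, illustrating the principle rather than Millburn’s actual execution logic.

```python
def size_after_costs(forecast_profit, expected_cost, desired_size):
    """Cost-aware execution sketch: the signal is left untouched;
    only the acted-upon size changes."""
    if forecast_profit <= expected_cost:
        return 0.0                       # defer the trade entirely
    edge_fraction = (forecast_profit - expected_cost) / forecast_profit
    return desired_size * min(1.0, 2.0 * edge_fraction)  # partial deferral

size_after_costs(0.05, 0.08, 100)   # 0.0: costs overwhelm the forecast
size_after_costs(0.50, 0.05, 100)   # 100.0: ample edge, full size
```

Because the deferral acts downstream of the forecast, the information in the signal survives intact and the trade can still be taken later if costs or the forecast path improve.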
All of the above has contributed to Millburn’s changing performance pattern, which has become steadier and more resilient during non-trending periods. Says Goodman: “For us our focus remains on continuing to improve the process, but of course the proof is in the performance.”
And to that point, Millburn investors (including Millburn itself, whose current and former employees and family members represent about 23% of total firm capital) are likely happy with the return profile of late and, importantly, the path of returns. Millburn’s goal is to take advantage of trending behaviour in markets, but the firm has recognised that these periods have historically been very short-lived, with some long periods of uninteresting returns between trends.
As a result, Millburn’s returns are still correlated with trend-followers’ returns, but they are no longer predicated purely on trends. Smith explains: “Our research suggests that our approach provides the ability to capture our fair share of trending behaviour, but also the ability to provide returns during so-called ‘sideways’ markets, where many trend-following approaches can falter. And in practice we’ve seen this play out.”
Take 2014 as an example: the first half of that year was a poor environment for trend, with the SG CTA Index just above flat while Millburn’s Diversified Program, in contrast, was up more than 10% net. Then in the second half of 2014, a much friendlier environment to trend emerged and the SG CTA Index rebounded in the space of five months. Millburn’s Diversified Program continued performing, although it underperformed trend slightly. In the end the program outperformed the index for the year but with a much smoother, and different, path of returns.
“With an investment in a pure trend-following approach, there’s a reasonable chance investors would have redeemed by July 2014, and missed the profitable trend period that followed. Millburn’s smoother return path could have mitigated this behavior,” says Smith.
March 2016 was another example of differentiated performance, with the Diversified Program flat while the SG CTA Index lost 3%. May is proving to be a differentiator again, with (through May 19th) the Diversified Program outperforming by about 1.9%. Sharpe ratios on the whole have been impressive of late, and the Diversified Program has outperformed the SG CTA Index in each of 2014, 2015 and so far in 2016 by reasonably wide margins.
A similar pattern of returns has been seen in the Commodity Program. The program, which was up more than 28% net in 2014 and more than 25% net the following year, allocates about 90% of its risk to the statistical learning framework. Illustrates Goodman: “Our Commodity Program is unique, and capacity constrained, but provides a very good example of the flexibility of the framework. In the case of the Commodity Program, the statistical learning approaches are applied to outright directional and relative value ‘spread’ markets, and use dozens of different factor inputs. We’ve seen strong response during some of the severe downward trends in commodities that came from major supply/demand dislocations, but also a much quicker response time when trends end, and in many cases the ability to reduce positions in advance of inflection points. This has served investors in the program very well.”
The Commodity Program has seen its asset base grow, but has some remaining capacity before the firm expects to pause asset-raising.
The long-only MAAP program, which was seeded in September 2014 with proprietary capital, has also quietly been developing a very solid track record. In fact, net of its standard management fee, the program would have ranked in the top 2% of all Morningstar Tactical Allocation category funds in terms of risk-adjusted returns. MAAP uses the signals from its long/short alpha engine to make adjustments between a 20% long position and a 100% long position for each traded market, but takes no short positions. The result is a program that provides strategic long exposure that can be modified, but never completely overridden, in response to market conditions. Millburn is now actively looking for the right distribution partner for this strategy, which they think has the opportunity to add real value as a long equity replacement for the right investor.
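The 20%-to-100% exposure band can be illustrated with a simple linear mapping from a long/short alpha signal to a long-only weight. The linear form and the signal range are assumptions for illustration; the source describes only the band and the no-shorts constraint.

```python
def maap_exposure(alpha_signal):
    # Map a long/short alpha signal in [-1, 1] linearly onto a
    # long-only exposure band of 20%..100%; shorts are never taken.
    s = max(-1.0, min(1.0, alpha_signal))
    return 0.2 + 0.8 * (s + 1.0) / 2.0

maap_exposure(-1.0)   # 0.2: most bearish signal is trimmed to minimum long
maap_exposure(1.0)    # 1.0: fully long
```

Even the most bearish signal only dials exposure down to 20%, which is what makes the strategic long exposure “modified, but never completely overridden”.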
Despite the advances and performance, Millburn is not just doing victory laps; after more than 45 years in the business, it is quite obviously in for the long haul. “Recent returns have garnered us some increased attention, to be sure, but I think what these investors find once they go beyond performance is a very robust, adaptable process—we think that’s where investors should focus, and we believe that is where we will find continued opportunity,” expects Smith.
People and IP are key
At about $1.6 billion in AuM, Millburn is big enough to invest appropriately in its employees, technology and infrastructure, which the firm sees as critical to attaining its goals. With regard to hiring priorities, outlines Gregg Buckbinder, the firm’s president and COO, “We’re competing for quant talent just like many others, but we try to hire people who have a real interest in the financial markets, specifically in solving what we think are some of the most difficult problems in terms of quantitative approaches. We attract people who want to work in an entrepreneurial culture, but also one where results are measurable, and where the markets provide the ultimate feedback mechanism.”
Millburn is headquartered in Greenwich, Connecticut, but the majority of its 50 employees work out of an affiliated entity in midtown Manhattan, which the firm finds is typically a more attractive location for young talent.
Many recruiters have a tendency to hire mirror images of themselves, but Millburn’s new hires are by no means clones of existing staff. Observes Smith: “Our R&D team members come from a variety of disciplines, and we like to mix people with industry experience together with people that may have had none. When you take that concept and combine it with a flat, non-silo organisational structure, we think you can produce results that promote innovation while never losing sight of the important practical considerations of investing in the financial markets.”
The team and technology are powerful resources enabling Millburn to step up to the challenges of Big Data. Says Goodman: “We can think of our return-forecasting approaches as living organisms, changing and adapting to different market environments. If that holds then data is really our lifeblood. What’s exciting about investing in the markets is that there are so many different types of data that can impact how prices move. This data must be discovered, cleaned and verified, properly warehoused and protected. What we’ve built is a framework that does precisely this.”
Millburn has seen something of a reshuffle in senior roles that dovetails with what has proven to be a very interesting period in Millburn’s evolution, but it would be an exaggeration to describe this as a ‘succession’. In November 2015 a leadership realignment occurred that saw Goodman and Smith, who each have been at the firm for several decades, take the helm as co-CEOs, and COO Buckbinder assume the additional role as President. Former chiefs George Crapple and Harvey Beker remain co-Chairmen and members of the Investment Committee.
Accessing Millburn’s alpha
Millburn has been diversifying its investor base for some years, with managed accounts, offshore and onshore US clients, and various sub-advisory relationships. Millburn offers an appropriate level of investor communication and transparency. Says Buckbinder: “We think sophisticated investors today are moving beyond the fear of quantitative strategies as ‘black box,’ especially as quantitative approaches become more accepted in the general business landscape. Regardless, our approach is to be as transparent as possible with our investors while protecting our IP.”
Watch this space
Millburn expects the themes explored above will all continue. Although the framework is established and proving itself nicely, research continues to evolve and improve the approach. The firm’s quantitative analysts are focused squarely on the search for new factors, and are in the early innings of exploring the applicability of the framework to new markets.
And as quantitative easing shifts and more differences emerge between markets, Millburn sees opportunity. “As the influence of certain factors, like QE or others, begins to differ, this can mean more frequent, and more meaningful, periods of adjustment across instruments, sectors and geographies. This can provide opportunity for our strategies,” says Goodman.
Millburn has had its ear to the ground in terms of investor demand for more sophisticated strategies. “Performance aside, our approach of complementing momentum or trend-following effects with influences and interactions from a broader array of data seems to be resonating. Investors implicitly understand the value,” says Craig Gilbert, Millburn’s global head of business development and ex-Two Sigma VP.
Former Bluecrest exec and Millburn’s head of business development for Europe and the Middle East Shezad Syed agrees, and also gives us a hint about an imminent launch. “We are now looking at options for a UCITS feeder, which should put us in the position to provide more investors in Europe and elsewhere with access to the strategy. I hope to be discussing this program with interested early adopters in the coming weeks and months.”