Systematic Investing on a Global Scale

The nature of quants

PHILIPPE JORDAN, PRESIDENT, CAPITAL FUND MANAGEMENT
Originally published in the October 2015 issue

My name is Philippe Jordan. I’m a partner at Capital Fund Management (CFM), a quantitative investment management firm where we have applied a quantitative and scientific approach to financial markets for almost 25 years. I’m going to try and do a couple of things today. First, I’m going to try and convey some background on quantitative multi-strategy investing. Second, I’m going to outline the quant marketplace and our place in that marketplace because “quant” is a word like “hedge fund.” It doesn’t mean anything without context around it. Hedge fund means a legal structure in search of fees, that’s it. It’s a very big playing field that’s very fragmented and there are many different interpretations of it.

Quant is kind of the same thing – bad quant, good quant, fast quant, slow quant. I’d like to start at the beginning. Quant is not that complicated to a lot of you. We look at indexes every day that are quantitatively derived. The S&P 500, the DAX, are both quantitative technologies that were deployed a very long time ago. The constituents of those indexes are selected by a simple algorithm that ranks companies by market capitalization.
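
For illustration only, here is a minimal Python sketch of the kind of cap-weighted selection rule those basic indexes rely on. The universe and tickers are hypothetical, and real index methodologies (S&P 500, DAX) add free-float, liquidity and eligibility screens on top of a simple ranking.

```python
# Minimal sketch of a cap-weighted index rule (illustrative only; real index
# methodologies add free-float, liquidity and eligibility screens).
# `universe` is hypothetical data: {ticker: market capitalisation}.

def build_index(universe: dict[str, float], n_constituents: int = 500) -> dict[str, float]:
    """Select the n largest companies by market cap and weight them by cap."""
    ranked = sorted(universe.items(), key=lambda kv: kv[1], reverse=True)
    selected = ranked[:n_constituents]
    total_cap = sum(cap for _, cap in selected)
    return {ticker: cap / total_cap for ticker, cap in selected}

# Usage with toy numbers:
weights = build_index({"AAA": 900e9, "BBB": 450e9, "CCC": 300e9, "DDD": 50e9}, n_constituents=3)
# -> {"AAA": 0.545..., "BBB": 0.272..., "CCC": 0.181...}
```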

The base of many of our institutions is invested into the markets via quantitative products, such as BlackRock-type ETFs that employ quantitative techniques. So my simple point is that there are very basic building blocks and then there are more complicated parts of quant.

Quant is an industry that had a massive evolution in the last 20 years. Quant existed 40, even 60, years ago. When I began in this business (I’m 51 years old), I had a Telex machine in my office to send orders to people. It was better and faster than telephones because you knew the guy on the other side in Chicago had the order in his hands and could read it.

Electronic trading
What’s happened in the last 20 years is that we’ve gone from using technology that was developed in the 19th century to FIX connectivity, which uses algorithms in an engineered marketplace that is now interconnected. That’s a huge change because it generates not just speed, which everybody is focused on, but it also generates and facilitates data collection. Firms’ ability, with this new infrastructure, to collect data and look at it from many dimensions has gone up by orders of magnitude in the past 20 years. I’d say the major change happened in around 2000.

The first exchanges that went electronic were in countries that are very well known for their capacity to do financial innovation. Switzerland and Germany were the first to go electronic in the late ‘90s, and then the CME changed around 2001. So you had a very, very drastic change in the manner in which you could connect with markets and look at your own data. What is your own data? It’s your own trading – you trade, you generate data.

When you were printing tickets and time-stamping them, organizing that data was pretty messy. Synchronizing it with prices was impossible, ergo it was impossible to really understand what your trading impact was on the market. In other words, how much did it cost you to trade? Everybody that trades for a living knows that trading is expensive. It’s expensive in fees, it’s expensive in tickets, and it’s expensive in friction. Most importantly, it’s expensive in impact, but impact is a really tricky thing to understand.

Gauging market impact costs
You need to produce a lot of data, so a lot of statistics, to be able to derive some sort of number as to how much money it cost you to trade in a given year. If you don’t trade much, maybe twice a year, it’s not a big deal. If you can live with variable risk and don’t care what your realized volatility is, it doesn’t matter either since you’re not going to trade for the purpose of risk.

If you trade with somewhat more frequency and you like to have constant risk, you need to know what your impact is on the market. The technology to implement this is less than 20 years old, so it’s relatively new and interesting, and very helpful in understanding costs.

You can think of the cost of trading like an iceberg. The visible part is commissions, fees and financing costs. Then there’s the impact underneath the water line, and that’s where all the costs are really buried. These costs are also not as obvious to measure, and that’s what is very expensive.
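
To make the iceberg picture concrete, here is a hedged sketch of one common way to put a number on it: an implementation-shortfall-style calculation that adds explicit fees to the slippage of fills against the arrival price. The field names and numbers are hypothetical, and this is not a description of CFM’s cost model.

```python
# Hedged sketch: an implementation-shortfall-style trading cost estimate.
# Visible costs (commissions, fees) are added explicitly; the "underwater"
# impact is proxied by slippage of fills against the arrival mid price.
# All field names and numbers are hypothetical.

from dataclasses import dataclass

@dataclass
class Trade:
    side: int          # +1 buy, -1 sell
    quantity: float
    fill_price: float  # volume-weighted average fill price
    arrival_mid: float # mid price when the order was sent
    fees: float        # commissions, exchange fees, etc.

def trading_cost(trades: list[Trade]) -> dict[str, float]:
    """Aggregate explicit fees and impact (slippage vs. arrival mid) in currency terms."""
    fees = sum(t.fees for t in trades)
    impact = sum(t.side * (t.fill_price - t.arrival_mid) * t.quantity for t in trades)
    return {"fees": fees, "impact": impact, "total": fees + impact}

# Example: buying 1,000 shares that arrived at 100.00 but filled at 100.05
# shows 50 of impact on top of 10 of fees.
print(trading_cost([Trade(side=+1, quantity=1_000, fill_price=100.05,
                          arrival_mid=100.00, fees=10.0)]))
```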

Increasing frequencies of quant trading
Within quant, we touched on the trillions and trillions of dollars quantitatively invested in Beta products that have done their job in a spectacular fashion for the public at large – very low friction, very low turnover, simple methodology yet very effective. Quant methodology has been very successful and is not criticized globally in any manner – it’s fully accepted.

You transition from there to quants that trade more and run different orders of complexity in different types of portfolios. It’s important to note that they’re not high frequency traders even though they’re more active compared to what I termed the “base quants.” These more active quants might trade over a horizon of around six months. This is fairly tame in terms of friction so they are really doing a lot of work on signals, trying to find ideas that are persistent, and then executing them in a fairly conservative manner in terms of portfolio turnover.

Then you go a little faster and you have folks that are operating at horizons from a week, let’s call it days, maybe intraday from a couple of hours, up to three months. They’re competing with traders, what we think of as very active management strategies that employ trading as their central tool as opposed to investing.

Then there’s the stuff that’s been highly mythologized (and villainized), which is in the high-frequency space. So, people that trade in five, 10 milliseconds, then microseconds and nanoseconds. These traders are trying to go to picoseconds. The large part of that market is not visible to people who buy hedge funds. That market is in the hands of people that do proprietary trading for themselves and do market-making. The investment strategy here is predicated on arbitrage linked primarily to speed rather than on data analysis.

Infrastructure and intraday trading
The entire market today, whether you place your order through a phone or you enter it through a computer, is going to aggregate into a system that is highly engineered, run by market-makers that operate at high speeds. They’re running their businesses with robots or else they can’t scale, don’t have efficiency, and have bad risk management. So we’re all interacting today with a marketplace that is interconnected, engineered and runs at high speed.

That’s relevant to my business only in the sense that I need to be able to connect to that infrastructure, that marketplace, in a manner that’s sufficiently capable to go to the technological gunfight. We’re not going to have the biggest gun in terms of speed. I don’t need to win the high-frequency trading game, but if I’m going to control my impact I can’t show up there with a knife. You need to have the infrastructure if you’re going to trade with some frequency, which we do. We don’t do microsecond or nanosecond stuff, but we do intraday trading. We manage risk frequently because we can. We control our impact sufficiently so that it allows us to run a risk regime that’s very steady and coherent, in increments as small as five minutes, an hour, etc. Our “big gun” from a technology perspective is in data collection because that’s our edge. We differentiate ourselves with data in the same way that speed is a differentiator for high-frequency trading firms.

Scientific talent shifting to finance
So as I said, trading technology is important for us in the sense that we need to interact competitively. As far as we’re concerned, this technology is a prerequisite. As a business, our DNA and point of differentiation is getting to the signal. You need to find the ideas that have coherent statistical returns and that hopefully make sense – not kooky stuff. Why do I talk about kooky stuff? The world today is fortunate (or unfortunate) in that we have lots of very well trained PhDs in physics, math, and other sciences that are underemployed. The reason they’re underemployed is because government research centres across the globe do not have enough projects for these talented people.

So what we’ve seen over a number of decades is that there aren’t enough large-scale government financed projects out there to occupy the number of well-educated kids coming out of academia. Wall Street took them on in the late ‘80s, and the part of Wall Street that took them on was Salomon Brothers and the derivative finance groups.

Derivative finance is very different than the brand of quant that I’m talking about. We’re not in the business of trying to find super complex securities and figure out how we can carry them properly through multiple different environments and milk a yield, then leverage it all up. The brand of quantitative finance that we belong to is one that wants to operate in liquid markets, that wants to take directional or market-neutral risks in simple securities.

There’s a kind of historical confusion between derivative finance with its rocket scientists and what I would call the brand of quant trading that’s systematic in liquid markets. It’s not the same thing, it’s not the same experience. One was about leveraged carry in complex securities (and lots of it); the other is about leverage in simple securities, hopefully in very diversified portfolios and approaches.

We believe that the growth of quantitative research in finance, accelerated by the advent of “big data,” will feed researchers back into the more “pure” research disciplines like biology. By extension, certain findings in quantitative finance could lead to progress in areas such as medical research.

To get those diversified portfolios and approaches, we have big crews of people that have been trained scientifically, which is very useful to have and also very dangerous at the beginning. It’s very difficult, and it takes a lot of time, to get somebody who has been trained in the hard sciences, a recently minted PhD, to deal with the fact that markets are not just noisy, which physicists understand. They understand noise, but markets are even messier than that; they’re not elegant places.

Physicists are pretty good at dealing with that; mathematicians are not so good because they like elegance. You cannot solve the market in one elegant equation, it doesn’t work. So mathematicians tend to go away, work on very complex things and come back with very elegant equations that are super dangerous. They’re useful, and we have some, but in terms of the way they think about the world, it’s tricky. There are caveats to this obviously, such as Medallion. Jim Simons is a mathematician and has obviously found something that is different by orders of magnitude from everybody else.

So you have technologists and you have scientists. The scientists are not working on Holy Grail stuff. They’re working on crunching data, looking at the data from a statistical perspective, and trying to go through a process in which they do not fool themselves by in-sample bias. That’s the big killer in my business – in-sample bias. We’re all guilty of it, discretionary traders, hedge funds, etc. People look at recent performance and find ways to rationalize why that recent performance is going to continue in the near future, and we know that generally this does not work.

Parameterization and in-sample bias
We really push people very hard in trying to look at things in a simple manner without using a lot of parameters. A few parameters, perhaps with a lot of computational complexity, but not a lot of complexity in building the algorithms. The more complexity you add to a model, the greater the odds that you are adding in-sample bias in the process.

What we try to do is compress that in-sample bias to the most compressible, reasonable, credible level possible. We know it’s there. For this reason, as well as to account for the operational risk of introducing new code to the production environment, we implement a new strategy at 10% of its eventual target allocation within the portfolio.

That sounds like a prudent thing to do, but we’re not doing it because of the prudent man rule. We’re doing it because we’ve observed over 25 years that every time we put a new model in production, on average we realize a lower out-of-sample Sharpe compared to the in-sample results. That tells us that in-sample bias, even though it’s manageable to a certain degree, is to be expected. So a big part of building models is about managing our human behaviour, which is prone to injecting in-sample bias in what we do. Whether you’re trained as a PhD or as an MBA, it doesn’t matter, we all do it.
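
As a purely illustrative aside (not CFM’s research process), the mechanism is easy to reproduce: select the best-looking variant on one half of a pure-noise return series and its Sharpe collapses on the other half. All parameters below are assumptions.

```python
# Illustrative sketch of in-sample bias: pick the best of several parameter
# variants on the first half of a pure-noise return series, then measure the
# same variant out of sample. The selected Sharpe shrinks toward zero.
# The "strategies" and numbers are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

def sharpe(daily_returns: np.ndarray) -> float:
    return float(np.sqrt(252) * daily_returns.mean() / daily_returns.std())

n_variants, n_days = 20, 504                                # 20 parameter choices, ~2 years of daily data
returns = rng.normal(0.0, 0.01, size=(n_variants, n_days))  # pure noise: true Sharpe is zero

in_sample, out_of_sample = returns[:, :252], returns[:, 252:]
best = max(range(n_variants), key=lambda i: sharpe(in_sample[i]))

print(f"best in-sample Sharpe:    {sharpe(in_sample[best]):+.2f}")      # looks attractive by construction
print(f"its out-of-sample Sharpe: {sharpe(out_of_sample[best]):+.2f}")  # typically near zero
```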

Diversification amongst models boosts Sharpe ratios
What we try and do is have as many models as possible that are de-correlated to one another. When you aggregate all this into a portfolio that’s risk managed constantly in quasi-real time, you get a Sharpe ratio in excess of one, which is what we’ve achieved over 25 years with our first fund. What that means is that the individual models at the bottom aren’t super models, they’ve got Sharpes of around 0.3. It’s the aggregate work and the portfolio effect and the risk management combined that gets you to that Sharpe of one plus.
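
The arithmetic behind that claim can be sketched as follows. Assuming equal risk per model and a common pairwise correlation (both simplifications, not a description of the actual book), the combined Sharpe is roughly the individual Sharpe scaled by a diversification factor:

```python
# Back-of-envelope sketch: Sharpe of an equal-risk combination of n models,
# each with individual Sharpe s and common pairwise correlation rho:
#   S_portfolio = s * n / sqrt(n + n*(n-1)*rho)
# Inputs are illustrative, not actual portfolio parameters.

import math

def combined_sharpe(s: float, n: int, rho: float) -> float:
    return s * n / math.sqrt(n + n * (n - 1) * rho)

print(round(combined_sharpe(s=0.3, n=12, rho=0.0), 2))  # ~1.04: a dozen uncorrelated 0.3-Sharpe models clear 1
print(round(combined_sharpe(s=0.3, n=12, rho=0.2), 2))  # ~0.58: residual correlation erodes the benefit quickly
```

With zero correlation the benefit scales like the square root of the number of models, which is why de-correlation between models matters more than the strength of any single signal.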

From here, I want to go to what is and is not a hedge fund, and our definition of it since we’re talking about Sharpes. You talk about Sharpe and then you start talking about hedge funds. What we think we are doing is selling a product where our contractual exchange with our investors is a priori, you’re buying a product in which you are going to realize a Sharpe in excess of one. We’re going to strive for 1.5. We might not hit 1.5, but we want to be above 1, which is pretty ambitious. We’re going to deliver it with a constant volatility. If you want that volatility number at 6% or 7%, or you want it at 9% or 10%, we’ll realize those two different levels of risk constantly.
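
Delivering a chosen volatility level “constantly” is, in its simplest form, a scaling exercise. The sketch below is a generic volatility-targeting rule, with an assumed estimator and leverage cap; it is not a description of CFM’s risk engine.

```python
# Minimal sketch of volatility targeting: scale gross exposure so that the
# estimated portfolio volatility matches the level the client chose
# (e.g. 7% or 10% annualised). Estimator and leverage cap are assumptions.

import numpy as np

def exposure_multiplier(recent_daily_returns: np.ndarray,
                        target_vol_annual: float = 0.07,
                        max_leverage: float = 5.0) -> float:
    """Return the factor by which to scale current positions."""
    realised_vol = float(recent_daily_returns.std()) * float(np.sqrt(252))
    if realised_vol == 0.0:
        return max_leverage
    return min(target_vol_annual / realised_vol, max_leverage)

# Usage: if the book has been running at roughly 10% annualised and the client
# wants 7%, positions get scaled to about 0.7x; re-estimating frequently keeps
# the realised risk close to the target.
```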

Liquidity and no correlation
Finally, I will give you back your money when you ask for it inside of three months, which limits the scope of the type of strategies I can go into. If I want to meet my liabilities, I’ve got to stay in the liquid spectrum of the world. That gives us a portfolio that’s made up of futures contracts across asset classes, global equities, options on stocks, options on futures, OTC FX, and that’s about the ballpark. I’m staying within the realm of things where I can actually realize my liquidity.

To realize those bogeys above one Sharpe at constant volatility, and give the money back inside of three months, that’s quite a feat if you can do it persistently over a very long time. We think that’s worth premium fees. We’ve got a hedge fund product called Stratus that we think is worth premium fees of 2 and 20 because we realize these objectives with a zero correlation to benchmarks.

That’s the final piece, zero correlation to benchmarks, not 0.20, not 0.30, not 0.50, not 0.70, zero, and rolling correlations in times of stress that don’t go above 0.30. Our definition of Alpha is strategies with more than one Sharpe, at constant risk, with zero correlation to benchmarks, and we think anybody that delivers on that score should be in a position of charging premium fees, and most are if they deliver those numbers.

You’ve got to wonder if it’s possible to deliver those kinds of numbers in an industry that’s managing in aggregate $3.3 trillion. When we talk about the industry today, we say there’s $3.3 trillion in hedge funds. That’s out of total worldwide managed assets that are of the order of $65–$70 trillion. In other words, that number is saying roughly 5% of the total managed assets on the planet are trying to earn, by my definition, more than one Sharpe, with constant volatility, at three months’ liquidity.

Capacity constraints
I’m capacity constrained in my hedge fund business, which is going to hard close soon because I know that north of $6 billion I will only decay my Sharpe. I’ve got extra capacity for returns in the coming years, but I know that I’m at a level of AUMs where I will start deteriorating the performance of the program.

You have choices to make at that point as a firm. You can decide to run your business the way you’re running it now. You’re happy, and that’s good, so you’re kind of like a perfectionist clockmaker with a good crew and a good team. Or you can decide to take AUMs and grow the brand, so you have more client services, more marketing, more branding, and you dilute the Sharpe a little bit but you can get to $10 billion, $15 billion, maybe $20 billion. It’s a proven path to success that a lot of people have taken.

Or you can decide to get involved in another business that’s different, so other areas in which you can still grow. We decided to do this in alternative beta while managing the capacity in what we consider to be a very nice multi-strategy quant fund.

Fees
The big challenge for you in an organization like FERI is dealing with a world in which there’s an immense amount of noise as to what constitutes Alpha, or what constitutes returns for which you should be paying 2 and 20. Arguably, if you’re thinking like a net returns investor, you should be paying 5 and 40 if you get a chance to invest in Medallion, which has been running very high Sharpe ratios for decades, over several billion in AUM. If they charge you 40% incentive fees and 5% management fees, it doesn’t matter because you’re thinking in terms of net performance, and that’s a great stream of returns.

The reality is that a lot of people can’t do that. They cannot invest purely on a net return basis. They have to think in terms of gross returns and then account for fees, total expense ratios (TERs), etc. How much did it cost to access that stream of returns, regardless of the quality of those returns?

Is it rational? Who cares? The regulator can determine that by simply saying, “If you touch that offering, then you’re going to pay a certain haircut on it. Then, if need be, we will show up at your office to ask you questions about your balance sheet every day.” It’s kind of like an asymmetric negative payoff. Should I invest in hedge funds and deal with all this heat or should I look for other things?

Is smart Beta repeatable?
There’s a marketplace that caters to that. This is a service industry and it emerged from that structural demand due to regulation with “Alternative Beta” as the response. People are accessing Smart Beta or Alternative Beta, which are in some ways related strategies in different packages. One is a long-only format, with lots of correlation to the benchmarks and a little Alpha generated over them; the other is a multi-asset, market-neutral format with zero or low correlation to equity and fixed income benchmarks.

That industry hasn’t played out yet. We’re going to find out who’s lying, who’s good and who’s just average. It’s going to take some time. You have banks that have now developed and offer these products. People have documented up to 800 bank algorithms in the last three to four years (I wouldn’t want to have to do this myself), so there’s now a universe of 800 bank algorithms that you can buy from your friendly guy at an investment bank. They come packaged in a swap, along with a very large legal document, and give you exposure to long-term trend, carry, a version of high volatility versus low volatility, or some other “Alternative Beta” or “Risk Premia.”

I’m going to go back to what I was talking about earlier: in-sample bias, which is very dangerous. You now have a world where there are 800 different algorithms being marketed within channels that are constrained to invest in products that are all full of in-sample bias. If you take these algorithms and throw them against the wall, at any point in time, 20% of those algorithms are going to have Sharpes of 1.5, but there’s no persistence. They’re going to fall apart inside of a two-year cycle.
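
A toy Monte Carlo makes the persistence point, with the caveat that it uses pure noise rather than genuinely overfit backtests, so fewer products clear the bar here than the 20% figure above; the universe size, window and threshold are assumptions.

```python
# Toy Monte Carlo: out of a large menu of pure-noise "algorithms", count how
# many show a backtest Sharpe above 1.5 over a two-year window, then check how
# those same ones fare over the next two years. Parameters are assumptions.

import numpy as np

rng = np.random.default_rng(1)
n_algos, n_days = 800, 504                       # 800 products, ~2 years of daily returns per period

backtest = rng.normal(0.0, 0.01, size=(n_algos, n_days))
afterwards = rng.normal(0.0, 0.01, size=(n_algos, n_days))

def sharpe(x: np.ndarray) -> np.ndarray:
    return np.sqrt(252) * x.mean(axis=1) / x.std(axis=1)

winners = sharpe(backtest) > 1.5
print(f"apparent winners in the backtest window: {winners.sum()} of {n_algos}")
print(f"their average Sharpe afterwards: {sharpe(afterwards)[winners].mean():+.2f}")  # hovers around zero
```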

But you can recycle it: somebody gets back on the phone and tells you they have new algorithms that are better or, even worse, that they tweaked it, they changed it, and now it’s a better algorithm. I think there are going to be some very interesting experiments happening with all these algorithms that are being marketed by banks.

Asset managers are also building approaches in this space. We’ll see, I think it’s better than banks because at least they have a fiduciary responsibility. They actually manage your money and they’re responsible for it.

Barriers to entry in quant
We’re a 24-year-old firm and we’ve got a team of 70 that just runs technology. I asked them this question, and they thought I was going to fire everybody (it wasn’t the case): how long and how much money would it take to rebuild our infrastructure? This would be using our team that knows what they’re doing.

They said, “I would need at least 50 or 60 million euros and three years to build it.” This is not stuff you can buy from Bloomberg, it’s not an off-the-shelf product. It’s designed specifically to control very precise risks, and it’s designed specifically to execute in a certain manner for certain types of strategies to compress impact. These things require project management, time and experience, not just money and technology. You’ve got to assemble it and put it together. You have long lead and lag times.

If I were in your shoes and I had to think about quant, I’d look at organizations from the perspective of infrastructure. People say, “Yes, machines, who cares? Everybody’s got machines.” It’s not just the machines, it’s how you run them. It’s the algorithms that you have in the machines, not for the purpose of trading and signal, but to manage scale, to manage infrastructure on three different continents, that are interconnected and can mirror each other, to manage tens of thousands of processes correctly every day.

Our definition of quant and the world we live in, and the one we think our peers live in, is about process, research, intellectual honesty, and then finally a business practice that is sound. If you just have quants on their own, that’s bad. They can do really, really dangerous things. They learn fast but they can make spectacular, dumb business mistakes in the beginning.

So you need a mix of IT people, scientists and business people that have built a company together to be successful, which is unusual in the investment management industry. In other words, you have three different types of people that are not particularly inclined to talk to each other – an MBA with an IT guy with a PhD in physics. It takes quite a while to get these guys to develop a common language and a common project in building an enterprise that all three think is coherent and has the chance to succeed.

Wrap up
So let’s just recap what we’ve covered, both in terms of demystifying the world of quant, and where CFM fits in that ecosystem.

First of all, quant – at the basic end – is an unquestioned part of virtually every portfolio. More complex versions of quant mean more active trading, which entails the appropriate management of impact, in-sample bias, a broad range and network of technologies, and a complex business. Success here requires a big monetary investment, and a deeply experienced team of very different types of people, demonstrably able to work together toward a common purpose over many years.

Second, we defined quant hedge funds as needing to produce a better than one Sharpe, maintain constant volatility, provide ready liquidity, and show zero correlation to traditional markets. This is what CFM has done now for 24 years. We stressed that this is inherently capacity constrained. At capacity, a manager has three choices: close the fund and maintain the Sharpe; stretch capacity, gather assets and fees, and degrade the Sharpe; or close the fund and launch a new business. We have chosen to close our hedge fund, Stratus, to new investment and enter the alternative beta business.

Alternative and smart beta are relatively new industries, yet to be proven, but attractive to investors as a feasible response to intrusive regulation. Every indication is that many offerings are chock full of in-sample bias and are managed by people without fiduciary responsibility, and who know little of the pitfalls of managing impact and risk. Every indication is that there will be some major disappointments.

Alternative beta requires many of the same things it takes to run a successful quant hedge fund: smart researchers with deep experience to avoid in-sample bias issues; a deep investment in technology appropriate to managing impact and risk; business people to manage the complexity of the business. And above all, it requires a coherent team that knows how to accomplish this together.

The above is a transcript of a presentation made at the 4th FERI Hedge Funds Investment Day, held in Bad Homburg, Germany on 10 September 2015.

Philippe Jordan joined CFM in 2005. He heads its New York office and serves on the Board of Directors of CFM S.A. France, together with his partners Dr. Bouchaud, Dr. Potters and Mr. Saulière. Mr. Jordan also chairs, together with his three partners, CFM’s monthly Investment Committees. Prior to joining CFM, Mr. Jordan was a Founding Member of Indeman Capital Management, LLC (IDM), a start-up focused on hedge-fund incubation. Mr. Jordan joined IDM from Credit Suisse First Boston (CSFB), where he was a director and the Global Head of Capital Introduction in the Prime Banking Group. He also worked in CSFB’s Hedge Fund Development Group, where he was the head of hedge fund origination for the Americas. Mr. Jordan began his career as an account executive at Refco Group Limited in London. He served on the Board of Directors of FINEX from 1993 to 1999.