Podcast 1: Factor Crowding, Cobras,
and Other Market Hyperboles
“When a measure becomes a target, it ceases to be a good measure.”
— Goodhart’s Law
“What Works on Wall Street” Podcast
After reading as many as 18 different WSJ articles on the subject, it’s understandable why investors might be up in arms about the alleged perils of factor crowding. We hope this discussion allays some of those concerns…
Episode 1 transcript:
Jim O’Shaughnessy (Jim): Hi, I’m Jim O’Shaughnessy. Welcome to the “What Works on Wall Street” podcast. Today, I’m joined by my colleagues Chris Meredith, our director of research and senior portfolio manager, and Patrick O’Shaughnessy, a Principal and Portfolio Manager who many of you know through his podcast “Invest Like the Best” and his many writings.
We thought it’d be fun today because we’ve been getting lots of questions in meetings about the factor trade, and the central question we seem to get is, “Is this trade getting way too crowded?” And I know, Chris, that you’ve been getting a lot of that in your meetings, and Patrick, you as well. Let’s start with you, Chris. Do you think the factor trade is crowded, and if so, what would the results of that be?
Chris Meredith (CM): There has been a large increase in style investing, or factor investing, and different accounts vary on how much money is in it. I think that, going around the table here, we would each cite a different figure for how much is in quantitative strategies. The number I read this morning, from a paper somebody published, was $1.1 trillion, and that covers strategies on both the institutional and the retail side.
A lot of these strategies claim similar types of characteristics: you’re buying on factors, or risk premiums, whether that’s value or momentum or yield. Could it get crowded? To answer that, you have to dig a level deeper. It’s not just about $1.1 trillion flowing into a theme like the value trade.
What you have to understand is that, at the end of the day, you’re not buying some amorphous, platonic idea of value. You’re still buying a portfolio of stocks. Under the hood, the portfolio expresses value through individual securities, and for me, crowding only happens when everybody is working off the same signal, the same idea of how they’re getting at value.
They also have to be doing it in the same place, in the same stocks, whether that’s large cap, mid cap, or small cap, which we can talk about, and they have to do it at the same time. When you think about the whole pantheon of quantitative investors and the number of ways you can express factors, my opinion is there’s a lot of room out there for people to have these trades on, particularly for boutique managers who have unique expressions and who are not built for scale. They’re not winding up in the largest names; they’re in the more underserved part of the market, with the ability to select stocks that are not nearly as crowded as the holdings of the larger market-cap-weighted products.
Jim: I have been looking at a lot of these portfolios, and I’ve come to the conclusion that, at least from what I’ve looked at, the majority of these are what I would call “Factor Lite.” They have huge portfolios. They’re tilted, yes, but they’re not extremely tilted to the various factors they proclaim to believe in. I think that, honestly, they’re very different. In fact, I’m working on a paper right now because of the explosion of coverage on quant.
The big Wall Street Journal series, everybody suddenly wants to be in quant. We’ve been doing this since 1996, and our goal was always to provide alpha for our investors, so our mindset may be a little different than that of factor products today. Patrick, you wrote a great paper called “Alpha or Assets” … Do you want to talk about that for a minute?
Patrick O’Shaughnessy (PO): Yeah, I think the most interesting aspect of this crowding idea is the translation between published research and actual portfolios. You go read an academic paper on price-to-book, or accruals, or whatever the factor might be, and you’ll often find the universe broken into deciles or quintiles to show return spreads between best and worst. Especially when you get into decile portfolios, you see some really attractive paper returns. It looks like something that you as an investor would certainly want to be involved in.
What is not typically taken into account in most academic papers are real-world frictions: things like trading costs, turnover, taxes, etc. Some are estimated, but just that, estimated. And when you go look at actual sponsored portfolios, predominately ETFs these days in the factor world, what you find is portfolio construction methods that really look nothing like those long-short paper spreads that you saw in the academic papers.
I think there’s a bit of a, maybe not explicit, bait and switch, where you’re being sold on a core concept, to steal Chris’s term, this platonic ideal of value or whatever, but then it doesn’t get translated into what you’re actually buying. The term smart beta is, we think, fairly appropriate: effectively what you’re getting is broad market exposure with a preference for value stocks or momentum stocks or quality stocks or low-volatility stocks. But you should not expect the sorts of results from those portfolios that you saw in the paper that probably sold you on the idea.
That gets to your question about Alpha or Assets, and I think this is a simple rule of thumb investors can apply when considering a factor solution: what was the motivation of whoever launched this thing, whether it’s a mutual fund or an ETF or a separate account? Was it to gain some sort of interesting, idiosyncratic, alpha-based factor exposure? Was it to raise a ton of money and charge 20 or 30 basis points? To stave off the trend towards passive? Was it designed with scale in mind, or with alpha in mind? Alpha or assets?
I think that simple question will reveal a lot about whether or not you should buy something. To my mind, when you look at the history of investing, if something can accommodate $100 billion plus of assets, I’m not so sure that I’d be convinced there’s much of an edge there. That’s an enormous amount of money, and some firms have as much as $500 billion in some of these simple factor strategies. I think that question is a good starting point for this crowding discussion, but also for evaluating any strategy.
CM: Sorry, the number one thing to evaluate is basically market-cap weighting versus equal weighting. With portfolio construction, you have a budget to work with on how many choices you can make, and if you’re going for market-cap weighting, you’re essentially saying, “I want this thing to be able to scale up and gather a lot of assets,” versus having a product that lives more in the large-cap but, say, $10 billion to $25 billion range, where you still have a lot of holdings but you’re not simply buying the very largest names.
The crowding can happen in those largest names. Johnson & Johnson was the example when low-volatility and dividend strategies were coming through; there was a period when it was trading at a significant premium. There was a nice Wall Street Journal article, I think in the middle of last year, talking about how those particular names were seeing a lot of flow because of the ETF side. But when you’re talking about large cap more broadly, again that large swath in the $10–25 billion range, you wind up with more opportunity and far less possibility for crowding.
Jim: Right. We as a firm have chosen alpha as opposed to assets. That was always my goal in all the research that I did, and when you choose alpha, the amount of assets, even in the large-cap space, is nothing like the $100 billion to $500 billion that Patrick mentioned earlier, because you simply can’t scale those types of portfolios the way you can scale a market-cap-weighted portfolio. Why don’t we dig a little deeper? Chris, why don’t you start with how our universe differs from the universe that many of the smart beta and other factor portfolios are using?
CM: Sure. I guess the key part is that it’s not benchmark-driven. We don’t start with the benchmark and then try to tilt. A lot of the smart beta products out there are essentially about giving you market access plus a little bit. Part of that is starting from the premise that Exxon will be 4% of your portfolio because it’s 4% of the benchmark.
Some of these will start with a style benchmark, and at that point you’re incorporating the Russell methodology, or whatever your benchmark’s methodology is, inside that style exposure. That can lead to choices where you wind up holding names simply because the index provider’s decision to use book-to-price, or whatever their value metric is, influences what you get at the end.
For us, starting with the stocks themselves, equally weighted within that large-cap universe, rather than with a market-cap-weighted benchmark and its style component, gives you more flexibility and the ability to identify opportunities for value, momentum, or yield investing.
PO: Chris has a neat way of thinking about this, which is to imagine, just hypothetically, that there is some alpha signal quants are still trying to capture, and that there’s only signal in the best and worst deciles. Say you can outperform by 4% in the best decile, and you can avoid, or actively short, 4% of underperformance in the worst decile. Then there are two different ways you could construct a portfolio with that information, assuming it was true.
The first is that you go long the best decile, ignore the worst (or short it), and stay neutral to the rest, the middle fat part of the market, where you maintain a market-like weight or market-like exposure. This is what I think Chris calls the risk-first, smart-beta-type approach: “I don’t have an opinion, there’s no signal in this middle part, so I’m going to be neutral to it,” which has its merits philosophically.
The other approach is to say, “Forget that whole neutral middle. My way of expressing no signal or no opinion is to just not own any of it.” Meaning, I want to own the best-decile portfolio only and forget the rest, or go long-short, best minus worst decile. This brings us to the interesting discussion of active cost, which is: what are you really paying for factor exposure?
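[Editor’s note] The two constructions can be sketched in a few lines of Python. This is a toy illustration with a randomly generated universe and signal, not either firm’s actual methodology; all names and numbers are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 500                                    # hypothetical stock universe
signal = rng.normal(size=n)                # hypothetical alpha signal (higher = better)
mkt_cap = rng.lognormal(mean=10.0, sigma=1.5, size=n)
mkt_w = mkt_cap / mkt_cap.sum()            # market-cap benchmark weights

# Bucket the signal into deciles (0 = worst, 9 = best).
edges = np.quantile(signal, np.linspace(0.1, 0.9, 9))
decile = np.digitize(signal, edges)
best, worst = decile == 9, decile == 0

# Approach 1: "risk-first" smart beta -- stay market-like, overweight the
# best decile, drop the worst, and remain neutral to the middle.
tilt = np.where(best, 1.5, np.where(worst, 0.0, 1.0))
smart_beta_w = mkt_w * tilt / (mkt_w * tilt).sum()

# Approach 2: "alpha-first" -- own only the best decile, equally weighted.
best_only_w = best / best.sum()

# Active share = half the sum of absolute weight differences vs. the market.
active_share_sb = 0.5 * np.abs(smart_beta_w - mkt_w).sum()
active_share_bo = 0.5 * np.abs(best_only_w - mkt_w).sum()
print(f"smart beta active share:  {active_share_sb:.0%}")
print(f"best-decile active share: {active_share_bo:.0%}")
```

The concentrated portfolio captures the full paper signal but carries far higher active share, which is exactly why it cannot scale the way the tilted version can.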
If something has a 30-basis-point expense ratio, which might be an average or common number for a factor ETF, that sounds relatively low. It’s somewhere between pure active and pure passive, which is probably what you would expect. But when you back out the portion of the portfolio that you’re neutral on, that middle 70% or 80%, call it, you find that you’re actually paying quite a bit for the active exposure.
It could be more like a traditional active fee of 80 basis points or higher, and I think that’s something people want to be very mindful of. We’ve gotten to a place where allocators, financial advisors, and institutions are controlling the allocation between passive and active management, with much less stock picking going on. What you want to think about holistically is that you could buy an index, which costs your institution two basis points or less, and pair it with a best-decile-only approach (that’s something we would advocate) to get an all-in exposure that’s similar but at a lower active cost. Or you could go with the all-in-one-stop-shop smart beta approach, which may get you some factor exposure, but sometimes at a higher cost.
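[Editor’s note] The active-cost point is simple arithmetic, sketched below. The fee levels are the illustrative numbers from the discussion, not those of any specific product:

```python
def effective_active_fee(expense_ratio_bps: float, active_fraction: float) -> float:
    """If only a fraction of the portfolio actually differs from the
    benchmark, the whole fee is effectively buying just that active slice."""
    return expense_ratio_bps / active_fraction

# A 30 bps factor ETF that is ~70-80% benchmark-like:
print(effective_active_fee(30, 0.30))   # roughly 100 bps per unit of active exposure
print(effective_active_fee(30, 0.20))   # roughly 150 bps

def blended_fee(index_bps: float, active_bps: float, active_fraction: float) -> float:
    """All-in cost of pairing a cheap index fund with an active sleeve."""
    return index_bps * (1 - active_fraction) + active_bps * active_fraction

# 2 bps index fund plus an 80 bps concentrated active sleeve at a 30% allocation:
print(blended_fee(2, 80, 0.30))         # roughly 25 bps all-in
```

The comparison shows why the pairing can dominate: similar net factor exposure, but the active fee is paid only on the slice that is genuinely active.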
I think that as coverage of quant continues, the Wall Street Journal is writing about quant every week, people will really start to dissect this and understand it. Already, sophisticated institutions are saying, “If it’s a simple academic exposure, we’re just going to do that ourselves. We don’t need to hire someone to do that. We’ll pay an internal team.”
Jim: Chris, you want to jump in?
CM: Yes, and what’s also interesting, Patrick, is the portfolio construction aspect. We always focus on returns, we want to generate returns that solve funding gaps for clients, but we still apply risk controls to the product.
Portfolio construction can cause crowding as well. If you look, the main off-the-shelf risk-model products, Barra and Axioma, can themselves crowd the resulting portfolios. The interesting example was Matthew Rothman, who was at Lehman at the time. He ran a study where he had quantitative managers submit their signals: basically, “Take the S&P 500 and tell me what your top 10% and bottom 10% would be.” He did this across a number of managers.
What’s interesting, and I’m paraphrasing the numbers and the findings a little bit, is that he found about 40% of the longs overlapped, but on roughly 10% of names the managers were on opposite sides, one long and one short, which tells you how much difference you can get. He called it something like a net 30% overlap.
When he ran all those portfolios through Barra, though, the overlap went into the 45 to 50% range. That’s crowding through your risk model, from buying off-the-shelf products, and it’s what I mean about boutiques having a better opportunity for advantage: they’re building their own signals on their own universes with their own risk controls, versus going through what some would call the standard off-the-shelf products.
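[Editor’s note] Overlap of the kind Rothman measured is simple set arithmetic, sketched below. The tickers are made up and this is not his methodology, just one plausible way to compute gross and net overlap between two books:

```python
def long_overlap(picks_a: set, picks_b: set) -> float:
    """Fraction of long picks two managers share."""
    return len(picks_a & picks_b) / min(len(picks_a), len(picks_b))

def net_overlap(longs_a: set, shorts_a: set, longs_b: set, shorts_b: set) -> float:
    """Shared same-side names minus names held on opposite sides
    (one manager long, the other short), as a fraction of the smaller book."""
    agree = len(longs_a & longs_b) + len(shorts_a & shorts_b)
    disagree = len(longs_a & shorts_b) + len(shorts_a & longs_b)
    size = min(len(longs_a) + len(shorts_a), len(longs_b) + len(shorts_b))
    return (agree - disagree) / size

# Hypothetical top/bottom-decile picks from two managers:
a_long, a_short = {"AAA", "BBB", "CCC", "DDD"}, {"WWW", "XXX"}
b_long, b_short = {"BBB", "CCC", "XXX", "EEE"}, {"WWW", "AAA"}
print(f"long overlap: {long_overlap(a_long, b_long):.0%}")
print(f"net overlap:  {net_overlap(a_long, a_short, b_long, b_short):.0%}")
```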
What’s interesting about the explosion, particularly in ETFs, is that 44% of ETF assets essentially run through two groups: MSCI and S&P. That’s another place you’ll see crowding, because a number of shops have said, “We’re just going to use the MSCI framework.” That’s something like 27% of the market, which means they’re all using the same factor definitions and risk controls, and that can also tend to lead to more crowding.
Back to this idea of scale: obviously, these ETF sponsors want to raise a lot of money; that’s the reason they’ve launched these things. I’m sure they have pure intentions too, meaning they think there is some excess return to be harvested or earned over the long term for patient investors. But one of the other problems with the ETF landscape is that the vast majority of them are classified as indexes, meaning they have to publish rules for how the index is reconstituted and how stocks are selected. Everyone talks about front running.
And there definitely is front running: when someone runs a purely systematic process, you know exactly what it’s going to do and when. Everyone can get Compustat; everyone, at least any big asset manager, can get access for a reasonable fee to the major databases used to build these strategies. So I think that’s something to be mindful of too: if something is straightforward, if you can publish the rules simply and it can accommodate a ton of money, I don’t know how much excess return is going to last in simple ideas like that.
Jim: Yeah, that becomes a mathematical anomaly rather than the one we prefer, which is a behavioral anomaly. Something I remember from early in my career: we originally called what we did “strategy indexing,” and as such we had to publish the rules that got us to the names we were holding. We immediately found out, when we were having some success, that there was a lot of activity in those stocks right before we put them into the portfolio.
Even back 20 years ago, people were saying, “Oh, they’re publishing the rules! We know these are going to be the stocks; let’s buy them a day or two or three before.” That, I can see, is a continual problem. I want to shift gears a little bit and talk about how we construct factors, because when you say something like price-to-book, most people just think, “Okay, price-to-book, very simple, very straightforward, I’ve seen all the studies, it’s one of the most used in the academic world, et cetera.” But we found some pretty major differences in the way we look at things. Chris, do you want to dive into that for a minute?
CM: Sure. This comes back to the crowding aspect: is everybody using that same factor to select stocks? Price-to-book is the most commonplace one used out there. We took a series of multi-factor ETFs, which are becoming very popular right now, and unpacked their value exposure. There were differences in some of the other characteristics they used, but everyone used book-to-price, which we don’t use.
Part of this is that there are ways to define book-to-price a little differently: there’s the raw common equity, and then there are adjustments for goodwill, et cetera. Even then, what we’ve seen is less and less effectiveness overall from the characteristic in stock selection. This is because there is some inherent noise in the recent marketplace created by share repurchases: buybacks impact both the market value and the book value of equity, and the companies doing them are often trading at a discount, which is exactly why the repurchase is attractive.
All of a sudden, their book value of equity starts to disappear and they appear to be more expensive overall. So within value you can wind up with a certain amount of noise inside the characteristics. Things like earnings, as you said, should seem very straightforward, but there are adjustments you can make: whether you include extraordinary items or just take bottom-line net income, or adjust for extraordinaries, or whether preferred dividends should be taken out, because those don’t flow to you as a common stakeholder.
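[Editor’s note] The buyback distortion is pure arithmetic: a repurchase removes the same dollar amount from market cap and from book equity, so any stock trading above book value drifts up in price-to-book even though nothing fundamental changed. A toy example with hypothetical numbers:

```python
def price_to_book(mkt_cap: float, book_equity: float) -> float:
    return mkt_cap / book_equity

mc, be = 100.0, 80.0              # $100 market cap, $80 book equity
print(price_to_book(mc, be))      # 1.25 before the buyback

buyback = 20.0                    # a $20 repurchase hits both sides equally
print(price_to_book(mc - buyback, be - buyback))  # ~1.33, screens "pricier"
```

Repeat this year after year and book equity shrinks toward zero, which is how heavy repurchasers fall toward the bottom of, or out of, a book-to-price screen.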
Things like free cash flow: there are a ton of different ways to define it. We have our own proprietary definitions, and how we aggregate our signals and put them together winds up being proprietary to us. From our point of view, because we have custom factors, even though we’re using some of the same off-the-shelf data sources Patrick mentioned, that can give you some unique expressions. You end up in a slightly different place from other people in the marketplace, and again, you’re staying out of the crowd a little bit better.
Jim: It’s interesting. I’ve done this all my life, and yet I’d imagine it’s sometimes very difficult for your average investor to take that kind of deep dive. They see a product that calls itself smart beta, it says it has a value tilt, and that’s pretty much as far as they look, whereas obviously that is not the whole story at all.
To me, that’s the ongoing research part. Patrick, you had a great quote, and I’m going to misquote it and you can correct me, but it was something along the lines of, “Don’t let the measurement become your target.” What was that quote exactly?
PO: Yes, I’ll get to that in just one second, but one more point on this idea of factor definition. To a broader audience, this might sound like arguing about how many angels can dance on the head of a pin, but it’s really important, especially in combination with portfolio construction philosophy. If your approach is smart beta as we’ve discussed, very broad market, tilted exposures towards a factor, then exactly how you define price-to-book or price-to-earnings is going to matter less.
If instead your philosophy is, “I want a fairly active, alpha-only portfolio,” then the way you define factors becomes extremely important, because it can lead to differences in a decent percentage of the portfolio, in terms of the actual businesses and stocks that you’re buying, which can materially impact results.
It’s not an academic, quant-only exercise when you have this more active portfolio construction methodology, which I think we’ll see more quants adopt now that the smart beta space has gotten very, very crowded and saturated. All of a sudden, the research and work you do on how to define factors becomes critically important. I think that’s a key point. Back to your question about measures and targets. People have probably heard me talk about this before because I just love it so much, but there’s an economic law called Goodhart’s Law that says when the measure of something becomes a target, it ceases to be a good measure.
There are fun, silly examples from French colonial Vietnam and British colonial India. In Vietnam, and this is the one that’s actually been verified, they had a rat infestation problem, so the colonial government thought they’d offer a bounty to people who killed rats. The way you collected your bounty was by turning in a rat’s tail: if you killed a rat, you turned in the tail and got 50 cents or whatever.
People started doing this, the government started paying out bounties, and a month later the rat problem had only gotten worse. They started to investigate and saw all these rats running around with their tails cut off. People were wisely cutting off the tail and releasing the rat back into the wild to breed and increase the bounty pool.
You get the same thing with cobras in India. Those are just two fun, silly examples, but this idea of perverse incentives, or unintended side effects, is so interesting in all things investing. With, say, price-to-book as an example, you may have a market analog to this rule: price-to-book has long been the de facto measure of cheapness, and there are good reasons why a low price-to-book stock is a cheap stock.
It’s a simple signal, it’s fairly stable through time, and it shows up well in the original research from the early 1990s. So it was a good measure, to go back to the little framework. Then it became an enormous target. I think it is inarguably the factor in which the most money is invested: there are enormous asset managers, and somewhere between $500 billion and a trillion dollars’ worth of assets, for which price-to-book is a key component in stock selection. And if you look at the performance of that value factor versus the rest of the suite, cash flow, earnings, sales, et cetera, it has been by far the worst performing since that initial research, once all that money started to flow behind it.
The measure became a target, and maybe it is no longer the best measure of valuation. There are reasons Chris mentioned too, things like transactions with shareholders, that affect the performance of that factor. But I think it would be hard to argue that $500 billion going into anything in the market wouldn’t have a material impact on that thing relative to other things. I think that’s what we’ve seen.
Jim: We were actually able to look at price-to-book back to the Twenties, and one of the things we found was that from the late 1920s through 1963, when the larger Compustat universe began, price-to-book didn’t work at all. In fact, it was inverted: the cheapest 10% of stocks by price-to-book actually did the worst of all the deciles.
Obviously, if you think about that for a bit: well, we had the Great Depression, and another thing that a very, very low price-to-book might be indicative of is bankruptcy risk, and a lot of companies did in fact go bankrupt during the 1930s. My point, though, is that you have this huge swath of data saying, “Well, wait a minute, this didn’t work at all,” and one of the things we always strive for is consistency across market cycles, et cetera.
Back to the idea you were talking about, Patrick, because I think it’s very intelligent, and I think smarter investors are going to start looking at these things: why pay a much higher fee for just market exposure? If you’re an institution, you can pay two basis points, or if you’re a big enough institution you can just do it yourself, and then pair it with a very high Active Share, alpha-oriented portfolio.
CM: One thing I wanted to touch on is how book-to-price can essentially be crowded within the value space. That’s also part of the reason we use yield in our large-cap value space. It’s definitely a different look, a different way to identify companies that are still cheap and attractive but also have some secondary growth metrics attached to them. Because repurchases increase the amount of earnings that flow to you as an equity stakeholder, and typically come with higher quality, yield makes a very nice complementary piece in your large-cap value portfolio.
If you’re sitting there with one side of value, but you also have yield, which I think is an underserved metric within that space, that leads you to less crowding overall…
…Shareholder Yield specifically.
PO: Shareholder Yield.
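[Editor’s note] For readers unfamiliar with the term, shareholder yield is commonly defined as dividends plus net buybacks over market cap. A sketch of that general definition follows; the numbers are made up, and, as Chris notes elsewhere, OSAM’s own composite is proprietary:

```python
def shareholder_yield(dividends: float, buybacks: float,
                      issuance: float, mkt_cap: float) -> float:
    """Cash dividends plus net repurchases (buybacks minus new share
    issuance), scaled by market cap -- the common textbook definition."""
    return (dividends + buybacks - issuance) / mkt_cap

# Hypothetical $30B company: $600M dividends, $900M gross buybacks,
# $150M of new issuance (all figures in $M).
print(f"{shareholder_yield(600, 900, 150, 30_000):.1%}")  # 4.5%
```

Netting out issuance matters: a company can run a headline buyback program while issuing enough stock to employees that little cash actually reaches shareholders.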
Jim: Right. A few thoughts on the venom: why is there so much venom, particularly in the press, for companies that are buying back shares? It seems to me indisputable that, historically and currently, those types of companies are doing really great things for their shareholders.
CM: I think it has to do with tough economic times. People are thinking at the political level, at the country level, saying, “The country would be better served if that money were put to work and spent on projects that would generate jobs.” It’s about employment.
As a small business owner, I can tell you that spending money doesn’t automatically mean it’s going to generate positive economic impact. There are limits to what you can productively spend on projects. The point is that, at the end of the day, you don’t want to waste the money, particularly when it’s shareholder money, and management is doing its job when it looks at the company and says, “You know what? I’ve invested in all the growth projects I can; the highest return I can get for this money right now is to give it back to the equity shareholders.”
PO: You have to remember what buybacks are. Buybacks are nothing more than a tax-advantaged, elective dividend. It is functionally the same thing: a return of cash to shareholders, the difference being that shareholders choose whether to participate rather than just receiving a dividend stream. There’s a functional equivalency here that I think people overlook, because it’s so easy to say, “Well, dividends seem great, it’s paying out cash,” while buybacks, which are the same thing, somehow should instead be going towards CapEx or research and development, things that create new jobs.
You’ve got to remember that companies don’t last forever. The half-life of a company in the US has historically been about 10½ years, meaning that half of all companies die every 10½ years. It’s a very rare breed that lasts. Companies mature. They get to a stage where their market is addressed and they’re in the harvest, return-of-capital phase of their lifecycle.
What’s great for shareholders, beyond the return of cash through dividends and buybacks, is that markets have tended to leave the companies that do this at an extreme level quite underpriced. The market has set expectations for these more mature companies low, but too low, and that’s all that matters in markets. It’s not low or high, it’s too low or too high, and that’s what creates an excess return opportunity.
Historically, you get this double whammy: cash going back to shareholders, whether you take it or not in the form of a buyback, or automatically in the form of a dividend, and you also tend to pay a lower price than you should for these capital-return stocks. That’s been an incredibly effective combination, and to Chris’s point, I think how companies allocate their capital is a holistic factor that is under-researched and underserved in factor-type ETFs and quant strategies, and I think it represents a persistent opportunity as a result.
CM: Back to the crowding we talked about at the beginning, with different selection factors and different ways of constructing the portfolio. Another aspect of this is the time horizon that a lot of people work on. There are a whole lot of misconceptions about what we’re doing as quants, starting with high-frequency trading, sitting there buying and holding for maybe a minute at most.
This whole series in the Wall Street Journal was about quant hedge funds, and they were saying their holding period was from a minute to three months. That was theirs. With the ETFs, I think what a lot of people don’t realize, and you’ve got to dive into the index-construction documentation to see it, is that most of them have very strict limits on how much they can turn over their portfolio as well.
They’re typically rebalancing twice a year with a 20% turnover constraint, which means that turning over their entire portfolio takes about 2½ years. They’re on the slower side, let’s say, versus what we’re doing, in getting into their factor exposures, and then there are people who just hold for 5+ years.
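[Editor’s note] The 2½-year figure is straightforward arithmetic, assuming each rebalance replaces a fixed fraction of names:

```python
def years_to_full_turnover(rebalances_per_year: int,
                           turnover_per_rebalance: float) -> float:
    """Years until 100% of the portfolio has been replaced, assuming a
    constant turnover budget is fully used at each rebalance."""
    return 1.0 / (rebalances_per_year * turnover_per_rebalance)

# Two rebalances a year, 20% turnover each = 40% per year:
print(years_to_full_turnover(2, 0.20))  # 2.5 years
```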
If you think about crowding, the concern is people moving quickly into a trade, getting in front of you, working off the same signal. Across that whole spectrum of horizons, there’s enough difference that you’ll see real dispersion: certain players will have structural impediments that leave them behind, and certain ones will simply be working off different signals from ours, which means there’s still plenty of room.
PO: Also, the manifestation of crowding in the data would have to be narrowing spreads across factors. Suppose something were getting too popular purely because of the idea behind it. Let’s stay on the value theme and say it’s the price-to-earnings ratio: value investing via P/E getting popular just because people like value investing, not because they were expressing some opinion about the underlying stocks.
All things equal, if that were the trend, then out of necessity there would be a narrowing of the gap between the P/E of cheap stocks and expensive stocks. You can see that spread, which is something we track, ebb and flow through time. But if anything, what you’ve seen more recently is a pretty sharp reversal: the spread had been quite low in, say, the 2013–2014 timeframe, but it has effectively blown out.
Some of that is driven from the other end: Amazon and Facebook and Google and all the names that have led the market, and are now the biggest names in the indexes, are very, very expensive, and their P/E ratios and other value ratios have been dragged up, which has helped widen that spread. But that doesn’t change the fact that there is a wide and widening spread today in several of the key valuation metrics, which is not what you would expect if there were some enormous crowding in that factor in particular.
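[Editor’s note] One simple version of the spread statistic Patrick describes is the ratio of the median P/E of the most expensive quintile to that of the cheapest. This is a generic illustration with hypothetical P/E values, not OSAM’s actual tracking methodology:

```python
import numpy as np

def value_spread(pe_ratios, q: float = 0.2) -> float:
    """Ratio of median P/E in the most expensive vs. cheapest quintile.
    Heavy crowding into cheap stocks should compress this ratio."""
    pe = np.sort(np.asarray(pe_ratios, dtype=float))
    k = max(1, int(len(pe) * q))
    return float(np.median(pe[-k:]) / np.median(pe[:k]))

# Hypothetical universe of P/E ratios:
pes = [6, 8, 9, 11, 12, 14, 15, 18, 25, 40]
print(round(value_spread(pes), 2))  # 4.64
```

A widening ratio means cheap stocks are getting cheaper relative to expensive ones, the opposite of what a crowded value trade would produce.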
I think when you add all this up, all the trends we’ve been talking about, the importance of portfolio construction, more limited-capacity strategies, real underlying data where spreads are widening, not narrowing, it all suggests to us that there remains an extremely interesting opportunity in quant-type strategies. You just need to be very careful about which strategies those are.
Jim: I think we’re going to give you the final word there, Patrick, very nice summation, unless, Chris, you want to add a —?
Jim: Thank you both for being with me today. Hopefully our listeners got some benefit from this. It looks like the trades, at least the ones we’re interested in, are not very crowded at all, but we certainly understand the people who worry about such things after reading 18 different articles in The Wall Street Journal.
If you have an insatiable appetite for thoughtful insights and bleeding edge research, then why not subscribe to the OSAM Blog. You’ll get an email alert whenever a new article gets posted…