Why you should take everything you read about VC performance with a grain of salt
Last Thursday, Rolfe Winkler of the Wall Street Journal wrote that Andreessen Horowitz's returns trail those of other Silicon Valley VCs and that, according to his data (gathered largely from third-party sources), A16z has yet to earn its membership in the so-called "elite" club of venture capital.
If you trust his sources (he bases most of his arguments on hearsay and third-party assessments), most of what Winkler wrote makes a lot of sense.
Scott Kupor, a distinguished VC and a managing partner at A16z, responded with the usual "apples-vs-oranges" argument, since VCs often value their portfolios in different ways. He also noted that venture capital investing is a long-term game and that we shouldn't judge VCs by anything but actual cash returns.
Kupor then went on, in a very simple and understandable way, to explain how different VCs use divergent valuation methods such as "last-round valuation/waterfall," "comparable company analysis," and the "option pricing model (OPM)." The use of different methods to value companies makes it very hard to compare the success of unrealized VC investments: one VC prices company X one way, and another VC prices the same company another way. Since there is no single accepted methodology for valuing unrealized investments (i.e., stakes in still-private companies), you cannot make a real comparison unless you know which models each VC is using.
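To make that point concrete, here is a minimal, purely hypothetical sketch (in Python, with invented numbers and deliberately simplified versions of two of the methods Kupor names) showing how two funds holding an identical stake in the same company could report very different unrealized values, simply because one marks at the last round's price while the other uses a comparable-company revenue multiple:

```python
# Hypothetical illustration only: the numbers and the simplified formulas
# are assumptions, not A16z's (or anyone else's) actual methodology.

def last_round_valuation(shares_held, last_round_price_per_share):
    """Mark the stake at the price per share paid in the company's most recent round."""
    return shares_held * last_round_price_per_share

def comparable_company_valuation(shares_held, total_shares,
                                 company_revenue, peer_revenue_multiple):
    """Mark the stake using a revenue multiple borrowed from comparable public companies."""
    implied_company_value = company_revenue * peer_revenue_multiple
    return (shares_held / total_shares) * implied_company_value

# Both funds hold 1M of fictional company X's 50M shares.
shares = 1_000_000
total_shares = 50_000_000

fund_a_mark = last_round_valuation(shares, last_round_price_per_share=20.0)
fund_b_mark = comparable_company_valuation(shares, total_shares,
                                           company_revenue=120_000_000,
                                           peer_revenue_multiple=6.0)

print(f"Fund A (last-round method):  ${fund_a_mark:,.0f}")   # $20,000,000
print(f"Fund B (comparables method): ${fund_b_mark:,.0f}")   # $14,400,000
```

Neither mark is "wrong"; the two methods simply answer different questions, which is why comparing unrealized returns across funds without knowing the underlying methodology tells you very little.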
Kupor concludes the post by saying that trying to "measure" a VC by its unrealized returns is like "reporting the Superbowl winner based on the result of pre-season NFL games or even training camp scrimmages."
This is what you need to know – and what the journalist won't write
In his defense, Winkler did some great legwork, collecting bits and pieces of information and talking to many people to get data that is usually kept under lock and key, available only to the fund itself and its investors (limited partners).
Having said that, a journalist's job is to gather data, verify its authenticity, and share it with readers. Winkler, like many other journalists (including us) these days, took it one step further and tried to do an in-depth analysis of the data he collected, to paint a certain picture, capture a great headline, and drive as much traffic as possible to his article (or as many subscriptions as possible to his newspaper, since the full article is behind a paywall).
First, as most reporters do once they have formed an opinion, Winkler tends to disregard facts that interfere with his theory. In this case, despite his vast experience covering the VC industry, he chose to politely ignore the issue of differing valuation models when comparing VCs and calling out A16z.
Second, even the headline of the piece, which calls A16z's returns "trailing," is misleading. Winkler's analysis mixes actual returns with "unrealized returns" (which each VC measures differently); if you compare the actual returns alone, it turns out that A16z is in fact very much in line with other leading Valley VCs of its age.
Third, while we're on the subject of age and fund maturity: comparing the actual returns of A16z (which started its first fund in 2009) with those of Sequoia Capital (founded back in 1972) or Benchmark Capital (founded a little later, in 1995) is like comparing the life success of a 7-year-old to what a 21-year-old or a 44-year-old has achieved: not something most people would do.
But hey, it makes for a good headline and there is nothing “untrue” about it.
This is what you need to know – and what the VC won't say
On the other hand, Kupor did exactly what was expected of him: he protected the reputation of A16z, and he did so in such an elegant and explanatory way that you can almost miss the fact that he is still trying to make you doubt the claims of "the other side."
Through his thorough explanation (which is, in fact, the clearest explanation of VC valuation models I have read), Kupor leads you to think that nothing Winkler said in the original article should be taken seriously.
And he almost succeeds. After explaining the different valuation models and why Winkler's comparison is wrong, Kupor makes another argument: we should not compare VCs based on their unrealized returns, because VC is a long-term game.
While no one would argue with the fact that VC is a long-term game, comparing VCs based on their unrealized returns is exactly what VCs do when they go out and raise a second, third, and sometimes even a fourth and fifth fund before their first fund has reached maturity and shown its full returns. Considering that the biggest bet won by A16z's first fund was Instagram in 2012, this is exactly what Kupor and his colleagues at A16z had to do in order to raise their second and perhaps even their third fund.
So when VCs compare themselves to other VCs on short-term "marks" based on different valuation models in order to raise more money for their funds, it's okay, but when journalists do the same in order to shed light on what happens behind closed doors in the VC world, it isn't?
Disclaimer: It's very easy to come along after the fact and point out what I think was wrong. This article reflects my take on what was written, based on my experience as an entrepreneur, investor, and journalist.
This article was originally published on Geektime