Being wrong

I learned today, thanks to Robert Krulwich at NPR, that our solar system isn’t very typical. 

Specifically, we knew that our solar system has four rocky planets closer to the sun (Mercury, Venus, Earth, and Mars) and four gaseous planets further out (Jupiter, Saturn, Uranus, and Neptune), so we came up with a theory to explain that observation.

The theory posited a “frost line.” The planetary dust closer to the sun would melt into minerals and form rocky planets, while the dust further out stayed “dusty,” forming gaseous giant planets.

That sounded good. It made logical sense. It explained the observation. There was just one problem: when we looked closer at the solar systems around us, we didn’t see any solar systems like that. Krulwich’s post does a great job explaining what those other solar systems look like.

We love stories—everywhere

It’s not that the “frost line” theory was wrong. The cause and effect dynamic is probably true. It’s just that other factors (solar gravity, solar winds, interplanetary forces) play a large (possibly larger) role, and our single observation wasn’t enough to tease out those dynamics.

What I found fascinating, however, is just how surprising that discovery was. We knew we had only one observation—our solar system. We logically knew that one observation isn’t enough to generalize. There were certainly rational scientists who tempered their beliefs accordingly. And yet the results were still surprising.

In this case, theorizing about the structure of solar systems, you could argue the consequences of being wrong aren’t huge. We were surprised, and now we’ll continue learning. In other areas, though, where we take action based on those theories, the consequences can be worse. 

Consider the obesity epidemic that is taking a large financial toll on our country. That’s a problem a bunch of organizations, particularly government organizations, have been trying to solve. But there’s a shift taking place here as well (or an argument at the very least). 

Read Gary Taubes’ NYTimes Magazine piece “What if It’s All Been a Big Fat Lie?” to get the full story, but the basic story he describes is as follows:

  • Conventional wisdom today. Fat and calories matter. Eat less fat and fewer calories, and you’ll lose weight. Taubes describes the history behind this idea, both the science and the sociopolitical factors, with the latter dominating. In 1977 a Senate committee published its “Dietary Goals for the United States” to address the epidemic of “killer diseases” sweeping the country. Then in 1984, the National Institutes of Health formally recommended that Americans over the age of 2 eat less fat. I’m cutting out a lot of complexity, but essentially that led to a focus on fat, and carbohydrates, particularly sugar and high fructose corn syrup, entered the equation to fill the void.
  • Emerging idea. Carbs, particularly sugar, are the problem. The simplistic idea that fat leads to fat is contradicted by a large body of evidence. Focusing on fats actually leads to the removal of good fats from the diet as well, causing hunger, which in turn leads to overeating, particularly of carbohydrates, which among the macronutrients of proteins, fats, and carbohydrates are unique because eating them makes you want to eat more. This last point is fascinating. Taubes describes an experiment in which three groups of people tried to fatten themselves by eating one of three different diets that were predominantly protein, fat, or carbohydrate. The first two groups couldn’t do it. They got too full. The last group succeeded because it could eat continuously.

There is more—much more—to this story, but all that matters for now is that the conventional wisdom is wrong. To be honest, the focus on carbs is probably wrong as well. But the common theme to me here is the idea of being skeptical of common wisdom.

We need to realize we create stories. This is a central idea in Daniel Kahneman’s book Thinking, Fast and Slow. We see cause and effect even where there is none. In a series of drawings, a square in motion touches a second square, which immediately begins to move; observers believe the first square caused the second to move. This “illusion of causality” is so powerful that it exists even in six-month-old infants: if they don’t see that causality, they’re surprised.

This is hardwired. Our ability to create stories to explain the world is a fundamental element of our nature. It falls in the same category as sugar, in fact. Our bodies developed an intense craving for sugar because it was rare in our environment and we needed it for energy. Similarly, our ability to create quick stories, even wrong ones, helped us survive. The man who stopped to wonder whether the rustling bushes were a bear or just the wind didn’t survive, unlike the guy who assumed it was a bear and ran.

But like all evolutionary remnants, the tendency gets us into trouble. I wonder if, in fact, all of our modern problems can be tied to this basic idea: we evolved to live in a world that we no longer live in thanks to our remarkable success in shaping that world in our favor.

In any case, the point of all this is that we create stories that aren’t true. And remarkable failures will come about if we don’t question those stories and realize that they may be wrong.

Anything that involves humans is probably wrong

In fact, there’s some early progress in figuring out just how wrong any given story may be. Last September, the mathematician/scientist Samuel Arbesman released a book titled The Half-Life of Facts: Why Everything We Know Has an Expiration Date. It measures precisely the dynamic described above: beliefs being proven false. (The Economist has a good Q&A with Arbesman here.)

Arbesman showed that each scientific discipline has a half-life for its ideas: a predictable time at which half the ideas currently held true in that field will have become obsolete.

Arbesman gives an example:

…in the area of medical science dealing with hepatitis and cirrhosis, two liver diseases, researchers actually measured how long it takes for half of the knowledge in these fields to be overturned. They gave a whole bunch of research papers from fifty years ago to a panel of experts and asked them which were still regarded as true and which had been refuted or no longer considered interesting. They plotted this on a graph. What they found is that there is a nice, smooth rate of decay; you can predict that every 45 years, half of this particular sort of knowledge gets outdated.
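The “half-life” framing maps directly onto simple exponential decay. Here is a minimal sketch of that arithmetic (the function name and the clean exponential form are my own illustration, not Arbesman’s actual model, and real fields decay more noisily than this):

```python
HALF_LIFE_YEARS = 45  # the hepatitis/cirrhosis figure quoted above


def fraction_still_valid(t_years, half_life=HALF_LIFE_YEARS):
    """Fraction of a field's findings still regarded as true after t years,
    assuming idealized exponential decay with the given half-life."""
    return 0.5 ** (t_years / half_life)


# After one half-life, half the findings survive; after two, a quarter.
print(round(fraction_still_valid(45), 3))   # 0.5
print(round(fraction_still_valid(90), 3))   # 0.25
print(round(fraction_still_valid(10), 3))   # 0.857
```

The same formula with a shorter half-life captures Arbesman’s point below about faster-decaying fields: shrink `half_life` and the surviving fraction at any given age drops.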

And there’s a difference in how fast given fields decay:

One of the slowest is mathematics, because when you prove something in mathematics it is pretty much a settled matter unless someone finds an error in one of your proofs.

…the social sciences have a much faster rate of decay than the physical sciences, because in the social sciences there is a lot more “noise” at the experimental level. For instance, in physics, if you want to understand the arc of a parabola, you shoot a cannon 100 times and see where the cannonballs land. And when you do that, you are likely to find a really nice cluster around a single location. But if you are making measurements that have to do with people, things are a lot messier, because people respond to a lot of different things, and that means the effect sizes are going to be smaller.

Startups and business are probably even more wrong

Arbesman doesn’t talk about startups and business, but I can’t help but project the idea in that direction.

Contradictions abound. For every path to success, there’s an opposite. For every conventional wisdom, someone smart is advocating the opposite:

  • Enterprise software is capital intensive, or not. SuccessFactors raised $63 million on the way to its IPO. Veeva Systems raised $4 million. Others raised none.
  • You should search and potentially pivot in the early stages, or not. Peter Thiel believes that entrepreneurs should have a definitive view of the future, and he decries pivoting and A/B testing as the established religion of the Valley.
  • The success of consumer apps depends on social/viral, or not. Phil Libin, the CEO of Evernote: “We’re not social. We’re not viral.” 
  • Growth over profits in the early stages is key for success, or not. Clayton Christensen has a fascinating chapter in The Innovator’s Solution titled “There is Good Money and There is Bad Money.” It’s worth reading because the points are subtle and fit into a framework he builds throughout the book, but Christensen summarizes by saying “…the best money during the nascent years of a business is patient for growth but impatient for profits” (emphasis his). Christensen’s logic is that a focus on profits forces the company to “test as quickly as possible the assumption that customers will be happy to pay a profitable price for the product—that is, to see whether real products create enough real value for which customers will pay real money.” He goes on to describe the death spiral that emerges if a company grows too quickly before answering this question. But this is certainly in contradiction to the conventional wisdom in Silicon Valley, particularly for consumer internet startups.

So let’s make predictions about our world cautiously

So in thinking about the future, particularly from the standpoint of investors, I see a common theme: confidence is bad. Both Nate Silver and Philip Tetlock discuss this point in their fascinating books on prediction (The Signal and the Noise and Expert Political Judgment, respectively).

I stress above “from the standpoint of investors” because the reverse is true for entrepreneurs, where I agree with Thiel: confidence, a definitive view of the future, is good.