Wednesday, June 25, 2008

What happens if we're wrong...

If I were pressed to describe my blog, I'd probably shy away from calling it a "green" blog; instead I'd describe it as a blog about human nature and why it makes being green hard. I'm far more interested in how human nature shaped the actions that led us to our current situation. So this post is going to focus on two articles that deal with human nature, and very little with being green. The first is a Peter L. Bernstein column in the New York Times about our capacity for risk assessment. It focused on the subprime mess, but we can substitute any opaque or distant phenomenon and how we fail to prepare for it.

Bernstein focuses on Pascal's wager, which says you should believe in God because it doesn't take much effort, and if you're wrong, well, not a big deal. But if you bet against God and you are wrong, it'll be Hell (sorry, couldn't resist). Now much of what we talk about in the global warming debate centers on forecasting. The question is not whether your forecast is right, but what the worst possible outcome is if the forecast is wrong. My last post focused on experiments, and part of any experiment is sampling. If you look at the Keeling Curve and take too small a sample, it may look like the CO2 concentration is dropping. Take a larger sample and it's clearly oscillating; take an even larger sample and it's going up. But in most cases in our lives, we rely on maps, not the actual world, to get around. Maps are approximations, approximations leave room for doubt, and that doubt is exactly what the naysayers of global warming focus on.
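To make the sampling point concrete, here's a toy sketch, with synthetic numbers rather than actual Mauna Loa data, of a Keeling-like series: a slow upward trend plus a yearly oscillation. The slope you measure depends entirely on how much of the record you look at.

```python
import numpy as np

# Synthetic Keeling-like series: rising trend + seasonal oscillation (made-up numbers)
months = np.arange(12 * 50)                       # 50 years of monthly readings
trend = 315 + 0.13 * months                       # slow upward drift (ppm)
season = 3.0 * np.sin(2 * np.pi * months / 12)    # yearly oscillation
co2 = trend + season

def slope(series, start, length):
    """Least-squares slope, in ppm per month, over series[start:start+length]."""
    y = series[start:start + length]
    return np.polyfit(np.arange(length), y, 1)[0]

print(f"6 months on a downswing: {slope(co2, 243, 6):+.2f} ppm/month")   # looks like CO2 is dropping
print(f"two full years:          {slope(co2, 240, 24):+.2f} ppm/month")  # roughly flat: the oscillation dominates
print(f"all fifty years:         {slope(co2, 0, 600):+.2f} ppm/month")   # the real upward trend shows through
```

Same data, three different stories, and the only thing that changed was the size of the sample.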

Don't focus on the forecast; focus on what happens if you are wrong. Few people live without insurance of some form, so we are capable of doing this. Nassim Taleb calls these crazy, unforecastable events black swans. In a risk-assessment model we know what to do, but then again, if we were good at risk assessment, I wouldn't be reading about the housing crash.
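You can write the same logic down as a toy expected-loss calculation in the spirit of Pascal's wager; the numbers below are entirely made up and purely illustrative.

```python
# Toy Pascal's-wager payoff calculation (made-up numbers): the point is not
# the forecast probability, but the asymmetry of what happens if you're wrong.
premium = 10          # known, modest cost of preparing (insurance, mitigation)
disaster = 10_000     # cost of being caught unprepared
p = 0.05              # assumed probability the bad forecast comes true

prepare = p * premium + (1 - p) * premium   # you pay the premium either way
do_nothing = p * disaster + (1 - p) * 0     # cheap if you're lucky, ruinous if not

print(f"prepare:    expected loss {prepare:8.1f}")
print(f"do nothing: expected loss {do_nothing:8.1f}")
# Even at a 5% chance, the "cheap" bet of doing nothing is the expensive one.
```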

Which brings me to the next article about human nature, by Paul Buchheit of Google fame and some products called Gmail and AdSense. He's become one of the more interesting tech bloggers, and here he writes that you have to understand that people are not rational, but they are very good rationalizers. It's an interesting point, since it means that convincing people they are wrong is going to be hard. But he also says that understanding this allows us to make better decisions: we can be rational about our irrationality. Now I'm not sure we can't rationalize that away too, but it does let us build decision trees that ask, "Is it possible that we are wrong?" We don't see much of this; that's what makes the John Maynard Keynes quote "When the facts change, I change my mind. What do you do, sir?" so funny, because it's so rare.

It's making me wonder how we can internalize externalities, since the human condition is to create them. You see this at work: people try to push off blame or work. If I can pollute downstream, someone else will deal with it, not me. So perhaps the issue is not making externalities clear, but making catastrophes clear. But here's the rub: if the catastrophe never happens, does that mean you shouldn't have prepared in the first place?

The irony of life is that you get paid more for the fires you put out than for the fires you prevent.
