
Frames vs. reality October 27, 2013

Posted by Ezra Resnick in Economics, Math, Reason.

Suppose your household owns two cars, which are used equally: car A gets 8 miles per gallon of fuel, while car B gets 25. You have the opportunity either to trade in car A for a newer model that gets 10 miles per gallon, or to trade in car B for a model that gets 50 miles per gallon. Which choice would save you more on fuel costs?

This seems like a no-brainer: trading in car A improves its mileage by only 2 mpg (25%), while trading in car B improves its mileage by 25 mpg (100%)! Just for fun, let’s use our brain anyway, and do the math. If each car drives 10,000 miles a year, then upgrading car A would save 250 gallons (consuming 1000 instead of 1250), while upgrading car B would save only 200 gallons (consuming 200 instead of 400) — so choosing to trade in car A would save you 25% more money!
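For anyone who wants to check the arithmetic, here is a minimal Python sketch of the calculation above (the 10,000-mile annual distance is the illustrative figure from the example, not a real-world assumption):

```python
MILES_PER_YEAR = 10_000  # illustrative annual distance from the example above

def gallons_used(mpg, miles=MILES_PER_YEAR):
    """Fuel consumed over a given distance at a given miles-per-gallon rating."""
    return miles / mpg

# Option 1: upgrade car A from 8 mpg to 10 mpg.
savings_a = gallons_used(8) - gallons_used(10)   # 1250 - 1000 = 250 gallons

# Option 2: upgrade car B from 25 mpg to 50 mpg.
savings_b = gallons_used(25) - gallons_used(50)  # 400 - 200 = 200 gallons

print(savings_a, savings_b)  # 250.0 200.0 -- the car A upgrade saves more fuel
```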

How could our intuition have been so wrong? The cause of the error (dubbed “The MPG Illusion” by psychologists Richard Larrick and Jack Soll) is in the framing of the question. We don’t really care about optimizing the distance we can drive on a fixed amount of fuel; we want to optimize the amount of fuel we consume for the distance we drive. Consider this alternative formulation of the above choice: you can either upgrade car A from .125 to .1 gallons per mile (saving .025 gpm), or upgrade car B from .04 to .02 gallons per mile (saving .02 gpm). The two formulations are mathematically equivalent, yet they evoke opposite intuitions — which is quite disturbing, considering the widespread assumption that consumers (and policymakers) will reliably make choices that are in their own rational interests.
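One way to see why the gallons-per-mile frame tracks actual fuel costs is to compute both deltas side by side; a small sketch of that comparison:

```python
def gallons_per_mile(mpg):
    """Invert a miles-per-gallon rating into gallons consumed per mile."""
    return 1 / mpg

# The improvement intuition latches onto: miles gained per gallon.
mpg_gain_a = 10 - 8    # 2 mpg (25%)
mpg_gain_b = 50 - 25   # 25 mpg (100%)

# The improvement that determines what you spend: gallons saved per mile driven.
gpm_saved_a = gallons_per_mile(8) - gallons_per_mile(10)   # 0.125 - 0.100 = 0.025
gpm_saved_b = gallons_per_mile(25) - gallons_per_mile(50)  # 0.040 - 0.020 = 0.020

# The mpg deltas put car B's upgrade far ahead; the gpm deltas put car A's ahead,
# and multiplying by annual mileage recovers the 250 vs. 200 gallon figures above.
print(gpm_saved_a * 10_000, gpm_saved_b * 10_000)  # 250.0 200.0
```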

When comparing differences in fuel efficiency, it’s clear that one frame (gallons per mile) is superior to another (miles per gallon). This is not always the case, however, as shown by an example due to the economist Thomas Schelling. (Both the following scenario and the previous one are discussed in Daniel Kahneman’s Thinking, Fast and Slow.) Say we are designing a tax code, and are thinking of including a “child credit”: families with children will get a deduction on their taxes. Would it be acceptable for the deduction to be greater for rich families than for poor families? You probably answered with a resounding No.

Now, let’s think about it a different way. Giving a tax deduction to families with children arbitrarily designates a childless family as the default case, but we could just as well rewrite the tax code such that having children is the default case, and childless families would pay a tax surcharge. In that case, would it be acceptable for the surcharge paid by the childless poor to be as great as the surcharge paid by the childless rich? Again, you probably feel strongly that it would not.

The problem is that you cannot logically reject both proposals — since a surcharge that is smaller for childless poor families than for childless rich families is the same thing as a deduction that is smaller for poor families with children than for rich families with children. For instance, a surcharge of $500 for the childless poor and $1000 for the childless rich is equivalent to a deduction of $500 for poor families with children and $1000 for rich families with children.
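To make the equivalence concrete, here is a small sketch; the $10,000 and $20,000 baseline tax bills are invented purely for illustration, and only the $500/$1,000 gaps come from the example above:

```python
# Frame 1: a childless family is the default; families with children get a deduction.
def tax_with_deduction(base_tax, deduction, has_children):
    return base_tax - deduction if has_children else base_tax

# Frame 2: a family with children is the default; childless families pay a surcharge.
def tax_with_surcharge(default_tax, surcharge, has_children):
    return default_tax if has_children else default_tax + surcharge

POOR_BASE, RICH_BASE = 10_000, 20_000  # hypothetical baseline (childless) tax bills

for base, gap in [(POOR_BASE, 500), (RICH_BASE, 1_000)]:
    for has_children in (True, False):
        frame1 = tax_with_deduction(base, gap, has_children)
        # The surcharge frame's default is simply the with-children bill from frame 1.
        frame2 = tax_with_surcharge(base - gap, gap, has_children)
        assert frame1 == frame2  # both frames hand every family the same tax bill
```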

The lesson is not that it’s impossible to design a tax code that burdens the poor less than the rich. The disturbing fact uncovered here is that our intuitions about fairness, like our intuitions about fuel efficiency, are unreliable: they can give contradictory answers to the same question depending on how that question is framed.

Kahneman’s conclusion is stark:

You have moral intuitions about differences between the rich and the poor, but these intuitions depend on an arbitrary reference point, and they are not about the real problem… Your moral feelings are attached to frames, to descriptions of reality rather than to reality itself.

Strong intuition is never a substitute for slow, careful analysis.


Blind to their own blindness October 29, 2011

Posted by Ezra Resnick in Economics, Reason.

In an excerpt from his new book, Thinking, Fast and Slow, Daniel Kahneman discusses “the illusion of validity”: not only do people (even professionals) make confident predictions in situations where they really don’t have enough information to do so; they continue to feel and act as if their predictions are valid even when they have been made aware that their past predictions performed little better than random guesses.

We are prone to think that the world is more regular and predictable than it really is, because our memory automatically and continuously maintains a story about what is going on, and because the rules of memory tend to make that story as coherent as possible and to suppress alternatives. Fast thinking is not prone to doubt.

The confidence we experience as we make a judgment is not a reasoned evaluation of the probability that it is right. Confidence is a feeling, one determined mostly by the coherence of the story and by the ease with which it comes to mind, even when the evidence for the story is sparse and unreliable. The bias toward coherence favors overconfidence. An individual who expresses high confidence probably has a good story, which may or may not be true.

One area that is especially prone to unfounded confidence is the stock market. Kahneman recounts how, in preparation for an invited talk at a firm of financial advisers, he analyzed their investment outcomes over a period of eight years. The firm naturally considered its advisers to be skilled professionals, and awarded annual bonuses based on performance. But Kahneman found that the year-to-year correlation in the ranking of advisers was basically zero — the kind of results you would expect from a dice-rolling contest. There was no long-term consistency that would indicate differences in ability among advisers; the firm was rewarding luck as if it were skill. And of course, it continued to do so even after Kahneman presented his findings.
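As a rough illustration of what “rewarding luck” looks like, here is a simulation sketch (the 25 advisers and the normal-draw “performance” are arbitrary assumptions, not Kahneman’s data): when annual outcomes are driven entirely by chance, the correlation between one year’s ranking and the next hovers around zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n_advisers, n_years = 25, 8  # arbitrary simulation parameters

# Pure-luck "performance": each adviser's annual result is an independent draw.
results = rng.normal(size=(n_years, n_advisers))

# Rank the advisers within each year (0 = worst, n_advisers - 1 = best).
ranks = results.argsort(axis=1).argsort(axis=1)

# Correlation between each year's ranking and the following year's ranking.
year_to_year = [np.corrcoef(ranks[y], ranks[y + 1])[0, 1] for y in range(n_years - 1)]
print(np.round(year_to_year, 2))               # values scattered around zero
print(round(float(np.mean(year_to_year)), 2))  # average close to zero
```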

This doesn’t mean we should distrust all professionals. According to Kahneman, it is possible to develop true expertise in fields that provide good feedback on mistakes in a sufficiently regular environment, like medicine. But in general, we should not take expressions of high confidence at face value:

people come up with coherent stories and confident predictions even when they know little or nothing. Overconfidence arises because people are often blind to their own blindness.

So, now that you know about the illusion of validity, will you avoid it? Probably not. Kahneman predicts:

The confidence you will experience in your future judgments will not be diminished by what you just read, even if you believe every word.