
Frames vs. reality October 27, 2013

Posted by Ezra Resnick in Economics, Math, Reason.
2 comments

Suppose your household owns two cars, which are used equally: car A gets 8 miles per gallon of fuel, while car B gets 25. You have the opportunity to either trade in car A for a newer model that gets 10 miles per gallon, or you may trade in car B for a model that gets 50 miles per gallon. Which choice would save you more on fuel costs?

This seems like a no-brainer: trading in car A improves its mileage by only 2 mpg (25%), while trading in car B improves its mileage by 25 mpg (100%)! Just for fun, let’s use our brain anyway, and do the math. If each car drives 10,000 miles a year, then upgrading car A would save 250 gallons (consuming 1000 instead of 1250), while upgrading car B would save only 200 gallons (consuming 200 instead of 400) — so choosing to trade in car A would save you 25% more money!
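To make that arithmetic easy to check, here is a minimal sketch in Python (assuming, as above, that each car is driven 10,000 miles a year; the function name is just for illustration):

    # Annual fuel use at a given miles-per-gallon rating, over 10,000 miles of driving.
    MILES_PER_YEAR = 10_000

    def gallons_used(mpg):
        return MILES_PER_YEAR / mpg

    savings_a = gallons_used(8) - gallons_used(10)   # 1250 - 1000 = 250 gallons
    savings_b = gallons_used(25) - gallons_used(50)  # 400 - 200 = 200 gallons
    print(savings_a, savings_b)                      # 250.0 200.0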

How could our intuition have been so wrong? The cause of the error (dubbed “The MPG Illusion” by psychologists Richard Larrick and Jack Soll) lies in the framing of the question. We don’t really care about maximizing the distance we can drive on a fixed amount of fuel; we want to minimize the amount of fuel we consume over the distance we drive. Consider this alternative formulation of the above choice: you can either upgrade car A from .125 to .1 gallons per mile (saving .025 gpm), or upgrade car B from .04 to .02 gallons per mile (saving .02 gpm). The two formulations are mathematically equivalent, but they evoke opposite intuitions — which is quite disturbing, considering the widespread assumption that consumers (and policymakers) will reliably make choices that are in their own rational interests.
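Redoing the sketch in the gallons-per-mile frame shows why the second formulation points straight at the right answer: the per-mile savings, multiplied by the miles driven, reproduce the figures computed above.

    # Gallons per mile is just the reciprocal of miles per gallon.
    def gpm(mpg):
        return 1 / mpg

    savings_a = (gpm(8) - gpm(10)) * 10_000   # (.125 - .100) * 10,000 = 250 gallons
    savings_b = (gpm(25) - gpm(50)) * 10_000  # (.040 - .020) * 10,000 = 200 gallons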

When comparing differences in fuel efficiency, it’s clear that one frame (gallons per mile) is superior to another (miles per gallon). This is not always the case, however, as shown by an example due to the economist Thomas Schelling. (Both the following scenario and the previous one are discussed in Daniel Kahneman’s Thinking, Fast and Slow.) Say we are designing a tax code, and are thinking of including a “child credit”: families with children will get a deduction on their taxes. Would it be acceptable for the deduction to be greater for rich families than for poor families? You probably answered with a resounding No.

Now, let’s think about it a different way. Giving a tax deduction to families with children arbitrarily designates a childless family as the default case, but we could just as well rewrite the tax code such that having children is the default case, and childless families would pay a tax surcharge. In that case, would it be acceptable for the surcharge paid by the childless poor to be as great as the surcharge paid by the childless rich? Again, you probably feel strongly that it would not.

The problem is that you cannot logically reject both proposals — since a surcharge that is smaller for childless poor families than for childless rich families is the same thing as a deduction that is smaller for poor families with children than for rich families with children. For instance, a surcharge of $500 for the childless poor and $1000 for the childless rich is equivalent to a deduction of $500 for poor families with children and $1000 for rich families with children.
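For anyone who wants the bookkeeping spelled out, here is a minimal sketch of that equivalence in Python; the base tax amounts are purely hypothetical, and only the $500/$1000 differences come from the example above.

    # Hypothetical pre-credit tax bills, for illustration only.
    base = {"poor": 2_000, "rich": 20_000}
    child_amount = {"poor": 500, "rich": 1_000}

    # Frame 1: childless families are the default; families with children get a deduction.
    def bill_deduction(income, has_children):
        return base[income] - (child_amount[income] if has_children else 0)

    # Frame 2: families with children are the default; childless families pay a surcharge.
    def bill_surcharge(income, has_children):
        default = base[income] - child_amount[income]
        return default + (0 if has_children else child_amount[income])

    # Every family owes the same amount under either description.
    assert all(bill_deduction(i, c) == bill_surcharge(i, c)
               for i in ("poor", "rich") for c in (True, False))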

The lesson is not that it’s impossible to design a tax code that burdens the poor less than the rich. The disturbing fact uncovered here is that our intuitions about fairness, like our intuitions about fuel efficiency, are unreliable: they can give contradictory answers to the same question depending on how that question is framed.

Kahneman’s conclusion is stark:

You have moral intuitions about differences between the rich and the poor, but these intuitions depend on an arbitrary reference point, and they are not about the real problem… Your moral feelings are attached to frames, to descriptions of reality rather than to reality itself.

Strong intuition is never a substitute for slow, careful analysis.

How do you know she is a witch? October 19, 2013

Posted by Ezra Resnick in Science, Superstition.
1 comment

“We have found a witch, may we burn her?”

“How do you know she is a witch?”

“She looks like one! Also, last week she gave me a creepy stare when she walked by my house — and the very next day my kitten died!”

“But do you have any good reasons for thinking that witches exist at all?”

“Oh, I see: You’re one of those closed-minded, reductionist, scientism fundamentalists. Let me tell you something: Thousands of people have believed in witches for thousands of years — how many more reasons do you need? Are you calling all those people stupid? How arrogant of you, to think you’re smarter than everyone else. Science doesn’t know everything, you know. And even when science claims to know something, it sometimes turns out to be wrong. Anyway, there’s more to life than what you can measure in a lab. Just because you can’t explain something scientifically doesn’t mean it isn’t true!”

“You got me all wrong: I agree with all that. I merely meant to say that based on my own hallowed tradition and sacred texts, I believe that what you call witchcraft is actually caused by demonic possession. This calls for an exorcism, not a burning.”

“Oh. All right, then, let’s give it a shot — if that doesn’t work, we can always burn her!”