We have a method July 12, 2011

Posted by Ezra Resnick in Science.

Carl Sagan begins his essay “Wonder and Skepticism” (published the year before he died) by describing the feelings of hope and awe that inspired him to study science as a child, leading to the joy and excitement of his career as a scientist and a popularizer of science. Sagan realizes, however, that scientific thinking remains foreign to many people, and he warns of the dangers inherent in a scientifically illiterate society:

There’s another reason I think popularizing science is important, why I try to do it. It’s a foreboding I have — maybe ill-placed — of an America in my children’s generation, or my grandchildren’s generation, when all the manufacturing industries have slipped away to other countries; when we’re a service and information-processing economy; when awesome technological powers are in the hands of a very few, and no one representing the public interest even grasps the issues; when the people (by “the people” I mean the broad population in a democracy) have lost the ability to set their own agendas, or even to knowledgeably question those who do set the agendas; when there is no practice in questioning those in authority; when, clutching our crystals and religiously consulting our horoscopes, our critical faculties in steep decline, unable to distinguish between what’s true and what feels good, we slide, almost without noticing, into superstition and darkness…

We have a civilization based on science and technology, and we’ve cleverly arranged things so that almost nobody understands science and technology. That is as clear a prescription for disaster as you can imagine. While we might get away with this combustible mixture of ignorance and power for a while, sooner or later it’s going to blow up in our faces. The powers of modern technology are so formidable that it’s insufficient just to say, “Well, those in charge, I’m sure, are doing a good job.” This is a democracy, and for us to make sure that the powers of science and technology are used properly and prudently, we ourselves must understand science and technology. We must be involved in the decision-making process.

Why is science so amazingly successful? How does it achieve such uncanny accuracy and predictive powers, despite our human fallibility? Sagan explains that the key to the scientific method is its “built-in error-correcting mechanisms”: arguments from authority are worthless; claims must be demonstrated; criticism is desirable; disproving previously accepted ideas is laudable.

It all comes down to experiment.

Scientists do not trust what is intuitively obvious, because intuitively obvious gets you nowhere. That the Earth is flat was once obvious. I mean, really obvious; obvious! Go out in a flat field and take a look: Is it round or flat? Don’t listen to me; go prove it to yourself. That heavier bodies fall faster than light ones was once obvious. That blood-sucking leeches cure disease was once obvious. That some people are naturally and by divine right slaves was once obvious. That the Earth is at the center of the universe was once obvious. You’re skeptical? Go out, take a look: Stars rise in the east, set in the west; here we are, stationary (do you feel the Earth whirling?); we see them going around us. We are at the center; they go around us.

The truth may be puzzling. It may take some work to grapple with. It may be counterintuitive. It may contradict deeply held prejudices. It may not be consonant with what we desperately want to be true. But our preferences do not determine what’s true. We have a method, and that method helps us to reach not absolute truth, only asymptotic approaches to the truth — never there, just closer and closer, always finding vast new oceans of undiscovered possibilities. Cleverly designed experiments are the key.

To avoid sliding into the “superstition and darkness” that Sagan feared, we must teach children to be skeptical and critical — while also preserving their willingness to evaluate new ideas with an open mind.

Science involves a seemingly self-contradictory mix of attitudes: On the one hand it requires an almost complete openness to all ideas, no matter how bizarre and weird they sound, a propensity to wonder. As I walk along, my time slows down; I shrink in the direction of motion, and I get more massive. That’s crazy! On the scale of the very small, the molecule can be in this position, in that position, but it is prohibited from being in any intermediate position. That’s wild! But the first is a statement of special relativity, and the second is a consequence of quantum mechanics. Like it or not, that’s the way the world is. If you insist that it’s ridiculous, you will be forever closed to the major findings of science. But at the same time, science requires the most vigorous and uncompromising skepticism, because the vast majority of ideas are simply wrong, and the only way you can distinguish the right from the wrong, the wheat from the chaff, is by critical experiment and analysis.

Essential tools to think with December 18, 2010

Posted by Ezra Resnick in Education, Reason, Science.

Despite appearances, the two tabletops are identical in both size and shape.

I’m preparing a special lesson for my high school students on the subject of critical thinking and the scientific method. In the first half of the lesson, I aim to undermine the students’ certainty about what they think they know, by demonstrating the many types of errors and biases we are all prone to. I’ll start with optical illusions, like Roger Shepard’s “Turning the Tables.” Carefully measuring the two tabletops shows that they are identical in both size and shape, even though we feel very strongly that this is not the case.

Exactly one of the doors has a new car behind it. After you choose door 1, the host opens door 3 to reveal a goat. He then offers you the chance to switch to door 2. Should you?

We know we cannot always trust our senses, but surely our intuitions are better in more theoretical areas? Try this: if a bat and a ball together cost $1.10, and the bat costs $1.00 more than the ball, how much does the ball cost? Sounds easy, and it is, but a majority of university students give the wrong answer (hint: it’s not 10 cents). Our intuitions are especially bad when it comes to probability. I’ll present the famous Monty Hall problem, whose correct solution (switching doors doubles your chances of winning) many smart people refuse to accept even after it’s explained to them. I may also mention the gambler’s fallacy — if ten consecutive tosses of a fair coin come up heads, is tails more likely to come up next? — which makes lots of money for casinos.
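
For students who resist the verbal argument, nothing beats playing the game many times. Here’s a minimal Python simulation (the function name and trial count are my own choices) that estimates the win rate for both strategies:

```python
import random

def play(switch, trials=100_000):
    """Estimate the probability of winning the Monty Hall game."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)      # the car hides behind a random door
        choice = random.randrange(3)   # the contestant picks a random door
        # The host opens a door that hides a goat and wasn't chosen.
        opened = next(d for d in range(3) if d != choice and d != car)
        if switch:
            # Switch to the one remaining closed door.
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == car)
    return wins / trials

print(f"stay:   {play(switch=False):.3f}")   # about 0.333
print(f"switch: {play(switch=True):.3f}")    # about 0.667
```

Staying wins only when the first guess was right (probability 1/3); switching wins in the other two cases out of three.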

Instead of jumping to conclusions based on intuition, we can attempt to construct formal logical inferences: if all Greeks are handsome, and Socrates is a Greek, then Socrates is handsome. Do we ever make mistakes in our use of logic? Consider this: if some men are doctors, and some doctors are tall, can we conclude that some men are tall? (No.) Or this: if we assume that all reptiles lay eggs, and we know that alligators lay eggs, does it follow that alligators are reptiles? (Nope.)
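
A nice way to expose an invalid inference is to build an explicit counterexample: a tiny model in which every premise is true but the conclusion is false. Here’s a sketch in Python (the sets are invented purely for illustration):

```python
# Premises: some men are doctors; some doctors are tall.
# Invalid conclusion: some men are tall.
men     = {"bob", "sam"}
doctors = {"bob", "eve"}   # eve is a doctor but not a man
tall    = {"eve"}          # only eve is tall

assert men & doctors       # some men are doctors (bob)
assert doctors & tall      # some doctors are tall (eve)
assert not (men & tall)    # ...and yet no man is tall
print("premises hold, conclusion fails: the inference is invalid")
```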

I next turn to the fallacy of assuming that correlation implies causation. If the percentage of black people among those convicted of violent crimes is significantly greater than the percentage of blacks in the population, can we conclude that blacks are inherently more violent? No: it could be that most judges are white and some are prejudiced, or that blacks are poorer on average and poverty causes crime, or that blacks are treated as second-class citizens creating a self-fulfilling prophecy, etc. Another example: let’s say surveys show that people who use olive oil are less likely to develop heart disease than people who use other oils. Does it follow that olive oil helps prevent heart disease? Consider that olive oil is more expensive than other oils, so people who buy olive oil are more likely to belong to higher socio-economic groups — implying a more healthy diet in general, a higher chance of belonging to a gym, more money to spend on health care, etc. Of course, this does not prove that olive oil does nothing to prevent heart disease: merely that the causal connection cannot be deduced from the correlation alone.
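
The olive-oil example is easy to demonstrate with simulated data. In the sketch below (every number is invented for illustration), wealth influences both oil choice and heart-disease risk while the oil itself does nothing, yet the survey still shows olive-oil users faring better:

```python
import random

def survey(n=100_000):
    """Wealth drives both oil choice and disease risk; oil has no effect."""
    counts = {"olive": [0, 0], "other": [0, 0]}  # [cases, total] per group
    for _ in range(n):
        wealthy = random.random() < 0.5
        oil = "olive" if random.random() < (0.7 if wealthy else 0.2) else "other"
        sick = random.random() < (0.05 if wealthy else 0.15)  # wealth -> health
        counts[oil][0] += sick
        counts[oil][1] += 1
    for oil, (cases, total) in counts.items():
        print(f"heart disease among {oil}-oil users: {cases / total:.1%}")

survey()  # olive-oil users look healthier, with no causal link to the oil
```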

Perhaps the most subtle causes of error are cognitive biases. I’ll talk about confirmation bias: people tend to give more weight to evidence that supports what they already believe; they tend to seek data that confirm their hypotheses instead of attempting to disprove them; they tend to remember examples that support their theories and forget those that don’t. One study was conducted on two groups of people: one group contained people who were in favor of capital punishment, and the other group contained people who were against it. All subjects were shown the same set of data, which included evidence and argument for both sides of the issue. Participants from both groups tended to report that the data had caused them to strengthen their original beliefs! So much for objectivity. Confirmation bias is also what keeps many superstitions alive: people notice those times when unlucky events happen on the 13th floor, and disregard the times when they don’t, or when they happen on other floors.

I’ll conclude the first part of the lesson with the fallacy of appealing to authority: “Einstein was a genius, so whatever he said must be so;” “If it’s written in the Bible, it must be true;” “Democracy is the best form of government, because that’s what they taught us in school;” “My teacher says that appeals to authority are logical fallacies.” The truth or falsity of a claim is not affected by the authority of the claimant; even Einstein made mistakes.

So far, then, we’ve seen that our senses can deceive us; our intuitions are often wrong; we are prone to logical fallacies and cognitive biases; it’s difficult for us to be objective; and even smart people can be mistaken. Is it impossible to obtain reliable information about the world?

Enter the scientific method. I’ll present the basic model of gathering evidence, offering a hypothesis to explain the observed phenomenon, making predictions based on the hypothesis, testing those predictions, and revising the hypothesis based on new data. This method is not infallible, of course, but science makes use of many mechanisms for minimizing errors and correcting them: transparency, documentation, reproducibility, peer review, etc.

To further explore the nature of scientific theories, I’ll use Carl Sagan’s example of the fire-breathing dragon in my garage: when my friend asks to see it, I reply that it’s invisible. She then suggests that we spread flour on the floor and look for tracks, but I explain that this dragon floats in the air. And there’s no point in trying to touch it, either, because it’s incorporeal. At this point my friend would hopefully begin to wonder what makes me think the dragon exists at all. The dragon hypothesis is unscientific because it’s unfalsifiable — there is no evidence that could possibly disprove it. This makes it useless: if there could never be any detectable difference between a world in which the dragon exists and one where it doesn’t, why should we care?

Another useful heuristic for judging scientific theories is Occam’s razor: all other things being equal, the simplest explanation of the facts is usually the right one. In other words, we should strive to minimize unnecessary or arbitrary assumptions. If I hear the clacking of hoofs coming from inside a race track, for instance, it could theoretically be a zebra escaped from the zoo, or a recording designed to fool me, or an alien language — but absent any evidence for those hypotheses, it makes sense to tentatively assume that it’s horses I’m hearing. It’s important to stress that all scientific knowledge is provisional: we can never achieve absolute certainty, but our confidence in a hypothesis grows with the amount of supporting evidence. A scientific theory is a hypothesis of sufficient explanatory power that has withstood all attempts to falsify it. But all theories are always open to revision based on new evidence.
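
The idea that confidence grows with evidence can be made precise with Bayes’ theorem. In this small Python sketch (the prior and the likelihoods are invented for illustration), each successful prediction raises our credence in a hypothesis, yet the credence never quite reaches 1:

```python
def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    return p_evidence_if_true * prior / p_evidence

credence = 0.5  # start out undecided
for trial in range(1, 6):
    # A true hypothesis predicts the result with probability 0.9;
    # a false one would match only by luck, with probability 0.3.
    credence = update(credence, 0.9, 0.3)
    print(f"after experiment {trial}: credence = {credence:.4f}")
# Credence climbs toward 1 but never reaches it: knowledge stays provisional.
```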

Homology of forelimbs in mammals is evidence of evolution.

I’ll give two examples of successful scientific theories and the evidence supporting them. Firstly, how do we know the Earth is spherical? Thousands of years ago, people had already noticed that the stars in the night sky look different from different locations, that the sails of a ship can be seen on the horizon before its hull, and that the shadow cast by the Earth on the moon during a lunar eclipse is round. More recently, of course, people have circumnavigated the globe and even seen it from space. My second example is the theory of evolution, supported by evidence from fossils, comparative anatomy, the geographical distribution of plants and animals, genetics (DNA), artificial selection (e.g. dog breeding), observed natural selection (e.g. antibiotic-resistant bacteria), and more.

Let’s now try to apply the scientific method to medical testing. Does the fact that my condition improved after taking a certain drug or undergoing a certain therapy mean that the drug or therapy is inherently effective? No: we must take into account the placebo effect, confirmation bias, the possibility of coincidence, etc. We can, however, attempt to neutralize those factors — and raise our confidence in a treatment’s effectiveness — by performing controlled double-blind trials and analyzing the results statistically.
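
Here is what the statistical side of that analysis might look like in miniature (the recovery rates and group size are invented; blinding, of course, happens in the clinic, not in the code). A two-proportion z-test asks whether the observed difference between arms could plausibly be chance:

```python
import random
from statistics import NormalDist

random.seed(42)
n = 500  # patients per arm
# Assumed recovery rates: 30% on placebo, 45% on the drug.
placebo = sum(random.random() < 0.30 for _ in range(n))
treated = sum(random.random() < 0.45 for _ in range(n))

p1, p2 = placebo / n, treated / n
pooled = (placebo + treated) / (2 * n)
se = (pooled * (1 - pooled) * (2 / n)) ** 0.5  # standard error under H0
z = (p2 - p1) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided test

print(f"recovery: placebo {p1:.1%}, drug {p2:.1%}, p = {p_value:.2g}")
# A tiny p-value means chance alone is an unlikely explanation.
```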

Having previously rejected appeals to authority, it’s important to point out the difference between authority and expertise. While the statement “90% of mathematicians recommend Acme toothpaste” does not carry any special weight, the statement “90% of dentists recommend Acme toothpaste” does. When we (provisionally!) accept the consensus of experts on matters of fact in their field of expertise, we are not doing so merely on the basis of their authority: we are relying on the scientific method itself, which includes all the self-correcting and error-minimizing mechanisms mentioned above.

The first smallpox vaccine was developed in 1796; smallpox was eradicated in 1979.

The strongest argument in favor of the scientific method is that it gets amazing results. Science can land a spacecraft on Mars, and can predict the exact time of an eclipse a thousand years in the future. Closer to home, the smallpox virus — which killed hundreds of millions of people over 10,000 years — was eradicated in 1979 after a successful vaccination campaign, the culmination of centuries of scientific effort. We often take such things for granted, but until the beginning of the 20th century, human life expectancy at birth was only 30-40 years. It’s now around 80 years in the developed world.

This is a good place to give the students a chance to practice their own critical thinking skills on real-world examples, like the assertions of faith-healers or astrologers. What questions should we ask before accepting such claims? Are there any fallacies or biases that may be getting in the way? What data support these hypotheses, and are there more parsimonious ways of explaining them? What experiments could we perform to help us decide?

In summation: it’s essential to evaluate all ideas critically. The smartest people we know could be wrong; we should never accept something just because someone said so, or because it’s tradition, or because it feels right intuitively. Our level of confidence in a proposition ought to scale with the level of available evidence in its support. Being skeptical of extraordinary claims is a good default position — but we must also make sure to keep an open mind and be willing to consider strange and unintuitive ideas (quantum physics, anyone?). We must recognize that there are many things we do not know, and that some of what we think we know may be mistaken. It is possible, however, to expand our knowledge of the world, and to correct previous mistakes, using the scientific method.

I’ll conclude with an excerpt from Carl Sagan’s The Demon-Haunted World:

Except for children (who don’t know enough not to ask the important questions), few of us spend much time wondering why Nature is the way it is; where the Cosmos came from, or whether it was always here; if time will one day flow backward, and effects precede causes; or whether there are ultimate limits to what humans can know. There are even children, and I have met some of them, who want to know what a black hole looks like; what is the smallest piece of matter; why we remember the past and not the future; and why there is a Universe.

Every now and then, I’m lucky enough to teach a kindergarten or first-grade class. Many of these children are natural-born scientists — although heavy on the wonder side and light on scepticism. They’re curious, intellectually vigorous. Provocative and insightful questions bubble out of them. They exhibit enormous enthusiasm. I’m asked follow-up questions. They’ve never heard of the notion of a ‘dumb question’.

But when I talk to high school seniors, I find something different. They memorize ‘facts’. By and large, though, the joy of discovery, the life behind those facts, has gone out of them. They’ve lost much of the wonder, and gained very little scepticism. They’re worried about asking ‘dumb’ questions; they’re willing to accept inadequate answers; they don’t pose follow-up questions; the room is awash with sidelong glances to judge, second-by-second, the approval of their peers. . . .

There are naive questions, tedious questions, ill-phrased questions, questions put after inadequate self-criticism. But every question is a cry to understand the world. There is no such thing as a dumb question.

Bright, curious children are a national and world resource. They need to be cared for, cherished, and encouraged. But mere encouragement isn’t enough. We must also give them the essential tools to think with.

I won’t be giving the lesson for a while yet, so I’d be happy to hear any comments, criticisms and suggestions.