November 3, 2010

Science education

Wanted: BS Detectors

What science ed should really teach.

This column is about science education, but teachers and curriculum designers should click away now rather than risk apoplexy. Instead of making the usual boring plea for more resources for K–12 science (or, as it is now trendily called, STEM, for science, technology, engineering, and math), I hereby make the heretical argument that it is time to stop cramming kids’ heads with the Krebs cycle, Ohm’s law, and the myriad other facts that constitute today’s science curricula. Instead, what we need to teach is the ability to detect Bad Science—BS, if you will.

The reason we do science in the first place is so that “our own atomized experiences and prejudices” don’t mislead us, as Ben Goldacre of the London School of Hygiene & Tropical Medicine puts it in his new book, Bad Science: Quacks, Hacks, and Big Pharma Flacks. Understanding what counts as evidence should therefore trump memorizing the structural formulas for alkanes.

“People can be wrong in so many ways,” Goldacre told me—and by “people,” he includes scientists. All too many put too much credence in observational studies, in which people who happen to behave one way (eating a lot of olive oil, drinking in moderation) have one health outcome, while people who choose to behave the opposite way have a different health outcome. This is a reasonable way to generate a hypothesis, but not something to guide your life by, and certainly no basis for the health advice such scientists peddle. Take the claim that moderate drinkers have less heart disease or obesity than teetotalers. Eye roll, please. Unless people are randomly assigned to drink or not drink, those health outcomes are just as likely to reflect something inherent in the drinkers and teetotalers as to reflect the drinking itself. Maybe teetotalers belong to ethnic groups that both disdain booze and have a genetic risk for obesity and heart disease. Maybe they substitute even less healthy vices for alcohol. There’s no way you can tell from an observational study. (Observational studies underpinned the advice that menopausal women should take hormone therapy to avoid heart disease. A randomized study disproved that. There was something different—healthier—about women in the observational studies who chose to take hormones.)
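
To see how a hidden trait can manufacture a "benefit" of drinking out of nothing, here is a minimal simulation sketch. It is my own illustration, not from the column, and every number in it is invented; by construction, drinking has zero effect on disease.

```python
# Sketch: a hidden confounder makes an observational comparison misleading.
# All numbers are invented for illustration; drinking has NO causal effect here.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hidden trait (say, a genetic or lifestyle risk) that both discourages
# drinking and raises disease risk.
risk_trait = rng.random(n) < 0.3

# People with the trait are less likely to be moderate drinkers.
drinks = rng.random(n) < np.where(risk_trait, 0.2, 0.6)

# Disease depends only on the hidden trait, never on drinking.
disease = rng.random(n) < np.where(risk_trait, 0.20, 0.05)

print(f"Observational: drinkers {disease[drinks].mean():.3f} "
      f"vs teetotalers {disease[~drinks].mean():.3f}")

# Randomized assignment breaks the link between the trait and drinking,
# so the apparent 'benefit' vanishes.
assigned = rng.random(n) < 0.5
disease_rct = rng.random(n) < np.where(risk_trait, 0.20, 0.05)
print(f"Randomized:    assigned {disease_rct[assigned].mean():.3f} "
      f"vs control {disease_rct[~assigned].mean():.3f}")
```

The observational gap comes entirely from who chooses to drink, which is exactly Goldacre's point: the comparison tells you about the people, not the behavior.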

Few things mislead us more than failure to grasp simple statistical principles. A harmless example is the Sports Illustrated jinx, in which being on the cover of that magazine supposedly causes an athlete to, say, tear his ACL in his next game (quarterback Tom Brady, 2008) or otherwise founder. Fun as it may be to believe the media have spooky powers, the reason for the “jinx” is that athletes make the SI cover when they are at the top of their game. Thanks to the statistical principle called regression to the mean—basically, returning to average—peak performance, which always involves some luck or random chance, eventually falls to earth. Hence pitcher Stephen Strasburg’s first loss the week he made the SI cover last June.
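
Regression to the mean is easy to reproduce with a toy model of my own (assumed numbers, not from the article): treat each performance as stable skill plus game-to-game luck, crown whoever just had the best game as the "cover athlete," and watch the next game fall back toward average.

```python
# Sketch: regression to the mean with invented numbers.
# Performance = stable skill + random luck; the 'cover athlete' is whoever
# just had the best game, so part of that peak was luck that won't repeat.
import numpy as np

rng = np.random.default_rng(1)
n_athletes, n_trials = 500, 10_000

skill = rng.normal(0, 1, n_athletes)
drop = []
for _ in range(n_trials):
    game1 = skill + rng.normal(0, 1, n_athletes)   # the peak week
    game2 = skill + rng.normal(0, 1, n_athletes)   # the following week
    star = game1.argmax()                          # who makes the "cover"
    drop.append(game1[star] - game2[star])

print(f"Average decline after a career-best game: {np.mean(drop):.2f}")
# The decline is large and consistent even though nothing 'jinxed' anyone.
```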


Ignorance about regression to the mean can fool us about why we recover from illness. Many ailments, including colds and pain, have a natural cycle. People tend to seek treatment when their symptoms are worst, which is also when, thanks to regression to the mean, they are most likely to get better. Falsely attributing recovery to the treatment is far from harmless. When the treatment is a quack remedy or a surgery for which there is no evidence of efficacy, crediting the treatment reinforces wasteful medical spending; when the treatment is antibiotics for a cold, doing so reinforces a practice that breeds drug-resistant bacteria and kills people. “People die at a biblical scale” because of such stupidity, Goldacre says.
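
The same arithmetic drives the illness example. In this small sketch (again my own, with made-up numbers), symptoms follow a noisy natural cycle, an inert "remedy" is taken only on the worst days, and symptoms still improve afterward.

```python
# Sketch: why a useless remedy seems to 'work' when taken at the worst
# point of a naturally cycling illness. Numbers are invented.
import numpy as np

rng = np.random.default_rng(2)
days = 365
symptoms = 5 + 3 * np.sin(np.arange(days) / 4.0) + rng.normal(0, 1, days)

# The patient takes the (inert) remedy only when symptoms are unusually bad.
treated = np.where(symptoms > 7)[0]
treated = treated[treated < days - 3]          # keep days with a follow-up

before = symptoms[treated].mean()
after = symptoms[treated + 3].mean()
print(f"Symptoms when remedy taken: {before:.1f}; three days later: {after:.1f}")
# Symptoms fall after treatment on average, yet the remedy did nothing.
```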

The most useful skill we could teach is the habit of asking oneself and others, how do you know? If knowledge comes from intuition or anecdote, it is likely wrong. For one thing, the brain stinks at distinguishing patterns from randomness (no wonder people can’t tell that the climate change now underway is not just another turn in the weather cycle). For another, the brain overestimates causality. In one neat experiment, participants rewarded students’ punctuality or punished tardiness for 15 days, then evaluated whether carrots or sticks worked better. Verdict: punishment. Unbeknownst to the “teachers,” the exercise had been rigged: students arrived at random times (generated by computer) unrelated to what teachers did. Yet the teachers believed their intervention had an effect. (A nickel says parents fall for the same illusion.) Science is not a collection of facts but a way of interrogating the world. Let’s teach kids to ask smarter questions.
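
The rigged punctuality experiment described above takes only a few lines to imitate. This is a hedged sketch with my own invented setup, not the original study's data: arrival times are pure noise, yet "punishment" appears to work and "reward" appears to backfire.

```python
# Sketch: illusory causality in a rigged punctuality experiment.
# Arrival times are pure noise; 'teachers' punish late arrivals and reward
# early ones, then see the pattern they expect. Numbers are invented.
import numpy as np

rng = np.random.default_rng(3)
arrivals = rng.normal(0, 5, 10_000)          # minutes late, purely random

punished = arrivals > 2                      # scold anyone noticeably late
rewarded = arrivals < -2                     # praise anyone noticeably early
next_arrivals = rng.normal(0, 5, 10_000)     # next day: still pure noise

print(f"Change after punishment: "
      f"{np.mean(next_arrivals[punished] - arrivals[punished]):+.1f} min")
print(f"Change after reward:     "
      f"{np.mean(next_arrivals[rewarded] - arrivals[rewarded]):+.1f} min")
# Punished students 'improve' and rewarded ones 'slip' purely by chance,
# the very pattern the teachers credited to their intervention.
```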
