Exam Practice 10: “Why Do Many Reasonable People Doubt Science?” by Joel Achenbach

Original text:

We live in an age when all manner of scientific knowledge—from climate change to vaccinations—faces furious opposition. Some even have doubts about the moon landing.

There’s a scene in Stanley Kubrick’s comic masterpiece Dr. Strangelove in which Jack D. Ripper, an American general who has ordered a nuclear attack on the Soviet Union, confesses his paranoid worldview — that “fluoridation is the most monstrously conceived and dangerous communist plot we have ever had to face”.

The movie came out in 1964, by which time the health benefits of fluoridation had been thoroughly established, and antifluoridation conspiracy theories could be the stuff of comedy. So, you might be surprised to learn that, half a century later, fluoridation continues to incite fear and paranoia. In 2013, citizens in Portland, Oregon, one of only a few major American cities that don’t fluoridate their water, blocked a plan by local officials to do so. Opponents didn’t like the idea of the government adding “chemicals” to their water. They claimed that fluoride could be harmful to human health.

Actually, fluoride is a natural mineral that, in the weak concentrations used in public drinking water systems, hardens tooth enamel, and prevents tooth decay—a cheap and safe way to improve dental health for everyone, rich or poor. That’s the scientific and medical consensus. To which some people in Portland, echoing antifluoridation activists around the world, reply: “We don’t believe you.”

We live in an age when all manner of scientific knowledge—from the safety of fluoride and vaccines to the reality of climate change—faces organized and often furious opposition. Empowered by their own sources of information and their own interpretations of research, doubters have declared war on the consensus of experts. There are so many of these controversies these days, you’d think a diabolical agency had put something in the water to make people argumentative. And there’s so much talk about the trend these days—in books, articles, and academic conferences—that science doubt itself has become a pop-culture meme.

We’re asked to accept, for example, that it’s safe to eat food containing genetically modified organisms (GMOs) because, the experts point out, there’s no evidence that it isn’t and no reason to believe that altering genes precisely in a lab is more dangerous than altering them wholesale through traditional breeding. But to some people the very idea of transferring genes between species conjures up mad scientists running amok—and so, two centuries after Mary Shelley wrote Frankenstein, they talk about Frankenfood.

In this bewildering world we must decide what to believe and how to act on that. In principle that’s what science is for. “Science is not a body of facts,” says geophysicist Marcia McNutt, who once headed the U.S. Geological Survey and is now editor of Science, the prestigious journal. “Science is a method for deciding whether what we choose to believe has a basis in the laws of nature or not.” But that method doesn’t come naturally to most of us. And so, we run into trouble, again and again.

Even when we intellectually accept precepts of science, we subconsciously cling to our intuitions—what researchers call our naive beliefs. As we become scientifically literate, we repress our naive beliefs but never eliminate them entirely. They lurk in our brains, chirping at us as we try to make sense of the world.

Most of us do that by relying on personal experience and anecdotes, on stories rather than statistics. We might hear about a cluster of cancer cases in a town with a hazardous waste dump, and we assume pollution caused the cancers. Yet just because two things happened together doesn’t mean one caused the other, and just because events are clustered doesn’t mean they’re not still random. We have trouble digesting randomness; our brains crave pattern and meaning.

Even for scientists, the scientific method is a hard discipline. Like the rest of us, they’re vulnerable to what they call confirmation bias—the tendency to look for and see only evidence that confirms what they already believe. But unlike the rest of us, they submit their ideas to formal peer review before publishing them. Once their results are published, if they’re important enough, other scientists will try to reproduce them—and, being congenitally sceptical and competitive, will be very happy to announce that they don’t hold up. Scientific results are always provisional, susceptible to being overturned by some future experiment or observation. Scientists rarely proclaim an absolute truth or absolute certainty. Uncertainty is inevitable at the frontiers of knowledge.

Last fall the Intergovernmental Panel on Climate Change, which consists of hundreds of scientists operating under the auspices of the United Nations, released its fifth report in the past 25 years. This one repeated louder and clearer than ever the consensus of the world’s scientists: The planet’s surface temperature has risen by about 1.5 degrees Fahrenheit in the past 130 years, and human actions, including the burning of fossil fuels, are extremely likely to have been the dominant cause of the warming since the mid-20th century. Many people in the United States—a far greater percentage than in other countries—retain doubts about that consensus or believe that climate activists are using the threat of global warming to attack the free market and industrial society generally. Senator James Inhofe of Oklahoma, one of the most powerful Republican voices on environmental matters, has long declared global warming a hoax.

The idea that hundreds of scientists from all over the world would collaborate on such a vast hoax is laughable—scientists love to debunk one another. It’s very clear, however, that organizations funded in part by the fossil fuel industry have deliberately tried to undermine the public’s understanding of the scientific consensus by promoting a few sceptics.

But industry PR, however misleading, isn’t enough to explain why only 40 percent of Americans, according to the most recent poll from the Pew Research Center, accept that human activity is the dominant cause of global warming. Science appeals to our rational brain, but our beliefs are motivated largely by emotion, and the biggest motivation is remaining tight with our peers. “People still have a need to fit in,” says Marcia McNutt, “and that need to fit in is so strong that local values and local opinions are always trumping science”.

Meanwhile, the Internet makes it easier than ever for climate sceptics and doubters of all kinds to find their own information and experts. Gone are the days when a small number of powerful institutions—elite universities, encyclopaedias, major news organizations, even National Geographic—served as gatekeepers of scientific information. The Internet has democratized information, which is a good thing. But along with cable TV, it has made it possible to live in a “filter bubble” that lets in only the information with which you already agree.

Doubting science also has consequences. The people who believe vaccines cause autism—often well-educated and affluent, by the way—are undermining “herd immunity” to such diseases as whooping cough and measles. The anti-vaccine movement has been going strong since the prestigious British medical journal the Lancet published a study in 1998 linking a common vaccine to autism. The journal later retracted the study, which was thoroughly discredited. But the notion of a vaccine-autism connection has been endorsed by celebrities and reinforced through the usual Internet filters.

In the climate debate the consequences of doubt are likely global and enduring. In the U.S., climate change sceptics have achieved their fundamental goal of halting legislative action to combat global warming. They haven’t had to win the debate on the merits; they’ve merely had to fog the room enough to keep laws governing greenhouse gas emissions from being enacted.

Our science has made us the dominant organisms, with all due respect to ants and blue-green algae, and we’re changing the whole planet. Of course, we’re right to ask questions about some of the things science and technology allow us to do. “Everybody should be questioning,” says McNutt. “That’s a hallmark of a scientist. But then they should use the scientific method, or trust people using the scientific method, to decide which way they fall on those questions.” We need to get a lot better at finding answers, because it’s certain the questions won’t be getting any simpler.

Reading notes:

  • Achenbach opens with Stanley Kubrick’s film Dr. Strangelove, in which General Jack D. Ripper confesses his paranoid belief that fluoridation is “the most monstrously conceived and dangerous communist plot” America has ever had to face;
  • Although the film was released in 1964, by which time the benefits of water fluoridation had been thoroughly established, some people today still regard fluoridation as a conspiracy: in 2013, citizens of Portland, Oregon, blocked a plan to fluoridate their water;
  • The scientific and medical consensus is that fluoride, a natural mineral, is a cheap and safe way to improve dental health for everyone, yet some people simply reply, “We don’t believe you”;
  • Achenbach states that all manner of scientific knowledge now faces organized and often furious opposition; doubters, empowered by their own sources of information and their own interpretations of research, have declared war on the consensus of experts; science doubt has itself become a pop-culture meme;
  • GMOs are another example of science scepticism: the idea of transferring genes between species in a lab scares people, even though experts point out there is no evidence GMO food is unsafe and no reason to believe that altering genes precisely in a lab is more dangerous than altering them wholesale through traditional breeding; sceptics call it “Frankenfood”;
  • Achenbach quotes geophysicist Marcia McNutt’s definition: science is not a body of facts but a method for deciding whether what we choose to believe has a basis in the laws of nature; however, that method doesn’t come naturally to most of us, and this causes trouble again and again;
  • People struggle with the scientific method because they cling to their intuitions and naive beliefs; becoming scientifically literate lets us repress those naive beliefs, but we never eliminate them entirely;
  • People prefer anecdotes over statistics, and because of this they fall prey to errors of reasoning, such as mistaking correlation for causation (the example of the cancer cases near a hazardous waste dump) and seeing meaning in clusters of events that may simply be random;
  • The scientific method is a hard discipline even for scientists; they too are vulnerable to confirmation bias (the tendency to look for and see only evidence that confirms what they already believe), but this is checked by formal peer review and by other scientists trying to reproduce published results; scientific results are therefore always provisional;
  • Another example of science scepticism concerns climate change: although the scientific consensus holds that human activity is the dominant cause of recent warming, many people, especially in the United States, still believe it is a hoax used to attack the free market;
  • Achenbach notes that organizations funded in part by the fossil fuel industry have deliberately promoted a few sceptics in order to undermine the public’s understanding of the scientific consensus;
  • However, industry PR bears only part of the blame; science appeals to our rational brain, but our beliefs are motivated largely by emotion, and the strongest motivation is the need to fit in with our peers, so local values and local opinions often trump science;
  • The Internet and cable television have also played a part: they have democratised information, but they have also made it possible for individuals to live in a “filter bubble” that lets in only the information they already agree with;
  • Doubting science has consequences: 1) anti-vaxxers, relying on a retracted and discredited study linking a vaccine to autism, undermine “herd immunity” to diseases such as whooping cough and measles; 2) doubt creates obstacles to public policy: climate sceptics have succeeded in halting legislative action against global warming;
  • Achenbach concludes that questioning is essential and is a hallmark of a scientist; yet, to answer those questions, people need to apply the scientific method, or trust those who do, because the questions are not going to get any simpler;

Summary of Joel Achenbach’s article “Why Do Many Reasonable People Doubt Science?”:

In his article “Why Do Many Reasonable People Doubt Science?”, published in National Geographic in March 2015, Joel Achenbach examines why so many otherwise reasonable people struggle to accept well-established scientific findings and the consensus of experts.

Achenbach mentions four areas of scientific knowledge that continue to provoke opposition. First, adding the mineral fluoride to tap water is a cheap and safe way of improving dental health, yet opponents see it as the government adding harmful chemicals. Second, many parents refuse to vaccinate their children because of a study linking a vaccine to autism, even though that study was retracted and thoroughly discredited. Third, people fear genetically modified food despite scientists’ assurances that there is no evidence it is unsafe. Finally, many people, including a majority of Americans, doubt or reject the idea that human activity is the dominant cause of global warming, ignoring the overwhelming consensus of the scientific community.

Achenbach, quoting geophysicist Marcia McNutt, defines science as a method for deciding whether what people choose to believe has a basis in the laws of nature. He explains that scientific knowledge is constantly tested, held to account, and even overturned, so he rejects the idea of any conspiracy to hoodwink the public. Humans, however, often find it difficult to accept new ideas; they cling to their “naive beliefs”, often the product of intuition. These beliefs are reinforced by “confirmation bias”, the tendency to notice only evidence that confirms what one already thinks. Science speaks to people’s “rational brain”, but their beliefs are driven largely by emotion: they prefer anecdotes to statistics and struggle to accept that clusters of events can simply be random.

The writer then explains why doubt is so prominent today. People like to reflect the ideas of their community, and the Internet, while democratising information, has made it easier to live in a “filter bubble” that screens out any information contradicting what they already believe. Achenbach explains that this can be dangerous: refusing to immunise children, for example, undermines herd immunity against diseases such as whooping cough and measles. He concludes that people should use the scientific method to test their beliefs, or trust people who do.