Evolution, Climate and Vaccines: Why Americans Deny Science
The U.S. has a science problem. Around half of the country's citizens reject the facts of evolution; fewer than a third agree there is a scientific consensus on human-caused climate change; and the number who accept the importance of vaccines is ticking downward.
Those numbers, all gleaned from recent Pew and Gallup polls, might suggest that Americans are an anti-science bunch. And yet, Americans love science. Even as many in the U.S. reject certain scientific conclusions, National Science Foundation surveys have found that public support for science is high, with more than 75 percent of Americans saying they are in favor of taxpayer-funded basic research.
“The whole discussion around scientific denial has become very, very simplified,” said Troy Campbell, a psychologist at the University of Oregon. Campbell and other psychologists presented findings from polls and other research that they say reveal Americans’ complex relationship with science at the annual meeting of the Society for Personality and Social Psychology (SPSP) in San Antonio.
Science denial, whether it takes the form of dismissing fact-based evidence as untrue or of accepting notions that are not factual as true, is not typically rooted in blanket anti-science attitudes, the research showed. But the facts aren’t always paramount, either. Often, people’s denial of scientific evidence is driven by motivations other than finding truth, such as protecting their social identity.
One key thing to understand about people who engage in science denial is that very few people deny science as a whole, according to research by Yale University psychologist Dan Kahan, also presenting at SPSP. For example, the more liberal a person is, the more likely he or she is to agree that humans are causing global warming; a conservative is far more likely to blame natural climate variation or say scientists are making the whole thing up.
But that same conservative may be just fine with the evidence for the efficacy of vaccines, and there is virtually no partisan split on issues like the safety of nanotechnology, the use of artificial sweeteners in drinks or the health impacts of living near high-voltage power lines, Kahan wrote in a book chapter published in the Oxford Handbook of the Science of Science Communication.
Kahan’s research has also shown that the more science-literate people are, the more strongly they hold to their beliefs – even if those beliefs are totally wrong.
In other words, it’s not about hating science or misunderstanding the facts. It’s about motivation.
“Beliefs are difficult to budge, because people don’t act like scientists, weighing up evidence in an even-handed way,” Matthew Hornsey, a psychologist at the University of Queensland, wrote in an email to Live Science. “When someone wants to believe something, then they act more like lawyers trying to prosecute what they already want to be true. And they cherry-pick the evidence to be able to do that.”
The real question, Hornsey said, is why people want to believe something that flies in the face of scientific evidence. In some cases, the reason can be political: Solving the problems created by climate change would mean standing in the way of the free market, something conservatives tend to oppose.
In other cases, people might have some other vested interest in their beliefs, Hornsey said. A smoker may not want to believe his or her habit is really going to cause lung cancer, because that would mean having to quit. Social identity can also be an important driver of beliefs, Hornsey said. Studies of teens in Midwestern towns have found that these individuals typically go along with the crowd, he said, believing in evolution if the majority of their friends do and believing in creationism if that’s what the people around them believe.
“For someone living in a creationist community to express belief in evolution might be seen as a distancing act, as a signal that one was defiantly assuming an outsider status,” Hornsey said.
When someone’s self-image or social acceptance is at stake, badgering them with facts isn’t likely to change their minds, research has shown.
In fact, a 2010 study found that when people were shown incorrect information alongside a correction, the update failed to reverse their initial belief in the misinformation. Even worse, partisans who were motivated to believe the original incorrect information became even more firm in their belief in that information after reading a correction, the researchers found. For example, conservatives who were told that Saddam Hussein had weapons of mass destruction before the Iraq war believed that claim more firmly after reading a correction.
So researchers are suggesting subtler ways to shift people’s attitudes toward accepting scientific facts. Hornsey said he and his colleagues call this “psychological jiujitsu,” in reference to the martial art that teaches people to use their opponent’s own weight against them.
In this approach, people who accept scientific facts might try to get at the root of the disbeliefs held by those who don’t, and then address that basis, rather than addressing the surface denial. Campbell and his colleagues have found, for example, that if free-market solutions to climate change are presented as an option, self-identified Republicans become less likely to deny climate science.
Using this jiujitsu approach is challenging, Hornsey and his colleagues wrote in an article soon to be published in the journal American Psychologist, because people’s underlying motivations are not always clear. Sometimes, the people themselves may not know why they think the way they do. And no single message will fit all possible reasons for disbelief, the researchers warned. “A two-tiered strategy would be optimal: messages about evidence and scientific consensus that should be sufficient for the majority, and a jiujitsu approach for the unconvinced minority,” the authors wrote.
There’s another trap to watch out for, though, Campbell warned: smugness. If a message from a science-accepting person comes across to a denier as being holier-than-thou, or as judgmental of a person’s whole character, it’s likely to backfire, he said. “I like to say, ‘Tell people they already are the people you want them to be’,” Campbell continued. For example, “don’t go to somebody and say, ‘You don’t care about the environment enough.’ Point out all the ways they do care about the environment.”
From there, Campbell said, there is common ground to work from. Successful persuasion, he said, finds shared values without triggering people’s self-protective instincts. “The general thing I think is important to say is ‘I like and care about you’,” Campbell explained. Once respect is established, he said, “any criticism is very much tapered, and is not a holistic admonishment of who you are.”
October 21, 2017