Summary by James R. Martin, Ph.D., CMA
Professor Emeritus, University of South Florida
This article addresses the conflict between scientific consensus and popular opinion. Although the principle of explication assumes evidence, data, and argument drive attitudes, there is considerable evidence that people engage in a biased search for information that reinforces their preexisting attitudes. The purpose of the paper is to describe the motives that lead people to reject science, and to develop a model of science communication based on attitude roots. Attitude roots are the beliefs, ideologies, fears, and identity issues that motivate people to reject scientific evidence.
Where do counterscientific beliefs come from? Attitude Roots
Hornsey and Fielding distinguish between surface attitudes and attitude roots. Surface attitudes such as creationism, climate skepticism, and anti-vaccination are supported by what lies below the surface, i.e., the underlying fears, ideologies, worldviews, and vested interests that sustain and motivate those attitudes. The authors classify attitude roots into six themes:
1. Ideologies, Values, and World Views:
Hierarchical versus Egalitarian world views,
Individualistic versus Communitarian world views,
Free-market ideology,
System Justification and Belief in a Just World,
2. Conspiratorial Ideation,
3. Vested Interest,
4. Personal Identity Expression,
5. Social Identity Needs, and
6. Fears and Phobias.
Although only one attitude root is needed to motivate people to reject science, the roots may combine to create stronger, more complex beliefs.
Ideologies, Values, and World Views
Hierarchical versus Egalitarian Worldviews
The term hierarchical versus egalitarian worldviews refers to a person's view of how society should be structured. A person with a highly hierarchical view tends to accept a social order defined by class, gender, ethnicity, or other characteristics, i.e., the view that social hierarchies are natural and valuable. People with an egalitarian view tend to feel that social class should be irrelevant to the distribution of entitlements and other resources. A person with a hierarchical worldview is more likely to reject scientific evidence that threatens elites, e.g., climate change mitigation imposes a cost on industry (an elite group).
Individualistic versus Communitarian Worldviews
A highly individualistic person tends to be self-reliant, independent, and more inclined to deny risks whose acceptance would justify market restrictions or limits on civil liberties. Therefore, highly individualistic people are predisposed to deny anthropogenic climate change because it implies government regulation that diminishes corporate freedoms. Those with communitarian views tend to place the interests of the collective above self-interest.
Free-market Ideology
A meta-analysis of 30 studies indicated a negative correlation between free-market ideology (i.e., the view that markets should be unconstrained) and belief in anthropogenic climate change.
System Justification / Belief in a Just World
Evidence from system justification theory supports the view that some people have existential and relational motivations to protect the prevailing social, political, and economic status quo. This motivation tends to cause people to deny environmental costs because acknowledging them would require moving to a more sustainable model. Those who view the world as fair and just tend to show more climate change skepticism.
Conspiratorial Ideation
Conspiracy theories provide another category of attitude root, where people come to believe that vast networks of people have executed sinister plots. One example is the view that a network of scientists, governments, and pharmaceutical companies has conspired to withhold evidence that the measles, mumps, and rubella (MMR) vaccine causes autism. A related example is the view that fluoride is poisonous. Other examples include the variations of the conspiracy theories related to climate change: that climate scientists are motivated by their own self-importance and power, that they exaggerate threats to gain research funding, that they have underlying motivations to curb capitalism and dirty energy companies, and that climate science was created to promote nuclear energy.
Vested Interest
Research provides a convergence of evidence that people evaluate evidence differently depending on whether it is personally convenient. The authors use the term motivated skepticism to describe this type of behavior. When accepting scientific consensus creates a high personal cost, it is natural for people to embrace evidence suggesting that change is unnecessary or futile.
Personal Identity Expression
Attitudes, when communicated to others, can be used to define a person's identity. For example, communicating antiscience beliefs may be viewed as a way to show that a person is not sheeplike in accepting the prevailing beliefs. Defining oneself against scientific consensus can provide a way to show that a person is a nonconformist, or not gullible. Attitudes might be developed and embraced to help express what kind of person the holder is, rather than through the principle of explication, i.e., evidence, data, and argument driving attitude.
Social Identity Needs
Social identity theory supports the view that people derive meaning and self-definition from membership in social groups. The prototype of a group might allow for considerable variability (e.g., Republican or Democrat), but disowning a relevant group may require a threatening process of reassembling one's sense of self. Through biased selection and processing of information it is possible to convince oneself that most of the group's beliefs are right. For example, when partisan Democrats and Republicans were presented with welfare policies, they were more likely to view a policy as fair and moral when told it was initiated by their own party. Attitudes are motivated not only by in-group identification, but also by opposition to the views of a disliked out-group. The combined result is that messages are accepted or rejected on the basis of allegiance to the messenger rather than the quality of the message. The authors also note that some antiscience beliefs tend to cluster in geographic regions, e.g., creationist vs. evolutionist communities. In such an environment, expressing an opposing belief might be seen as a signal of outsider status. Creationists make up a fairly large antiscience group. A 2014 Gallup poll indicated that 42% of Americans believe that humans were created by God in their present form.
Fears and Phobias
The authors discuss two conditions in this section: blood-injection-injury (BII) phobia and obsessive-compulsive disorder (OCD). BII relates to fear of anything associated with blood, e.g., hospitals, operations, wounds, and injections. Although there is little if any research connecting BII to anti-immunization attitudes, the logical leap is short. OCD is a similar clinical disorder in which a person fears contamination, e.g., from radiation, chemotherapy, immunization, or nanotechnology. The percentage of the population with one of these conditions is relatively small (research studies indicated 4.9% for BII and 2.3% for OCD), but they represent another type of root cause for antiscience beliefs.
When and Why Explication Fails
One might assume that resistance to evidence-based messages results from ignorance or a failure to understand the information, but studies have indicated that there is no reliable correlation between understanding evolutionary theory and accepting it as fact. The failure of explication reflects the social psychological principle of attitude polarization: people assimilate information in a biased way that reinforces what they already think. This is the principle of motivated reasoning. If a person is motivated to reject a scientific message, then it is unlikely that more explanation of the evidence will change their attitude. The graphic below represents my interpretation of the difference between the two concepts.
A Solution: Jiu Jitsu Persuasion
Jiu jitsu is an old martial art based on the idea of using an opponent's strength against them. The goal of the jiu jitsu persuasion metaphor is to identify a person's underlying motivations and use them to develop messages that are compatible with those motivations. The recommended approach is similar to other methods such as social marketing models, psychotherapy, dialectical behavior therapy, mindfulness-based cognitive therapy, and acceptance and commitment therapy. For example, focusing on the green jobs created by climate change mitigation efforts is a more effective way of promoting proenvironmental behavior among skeptics than evidence-based messages about anthropogenic climate change. Another example involved a message that framed environmental action as patriotic and system-preserving. This framing reversed the previously established finding for those whose anti-climate change attitudes were based on an underlying system justification motive.
Other studies have shown that messages are more effective when they are presented by in-groups, or when the behavior is described as normative for the group, e.g., the majority of an in-group conserves energy, recycles, reuses hotel towels, etc. Another example for climate deniers involves focusing on the effects of inaction on future generations. For immunization skeptics it might be useful to focus on the role of immunizations in reducing the need for injections and medical procedures when a person becomes ill. My sketch of the jiu jitsu model appears below.
A problem with the jiu jitsu persuasion approach is that people have limited insight into their own motivations. When communicating with a science denier, one should resist the temptation to debate the ideas presented by the target, and instead focus on the underlying shadow argument that remains unspoken. This presents a tremendous communication challenge, particularly for those armed with scientific evidence.
Caveats and Conclusions
Science deniers are not a small, insignificant minority. Polls have indicated that those who hold antiscience attitudes (anti-evolution, anti-anthropogenic climate change, anti-vaccination) form relatively large groups. The argument in this paper is that when a person has already developed an attitude, explication is likely to be an ineffective way of changing that attitude. Although messages based on evidence work for most people, the jiu jitsu approach may be more effective for science deniers.
The authors make a few additional points in this section. First, the six categories of attitude roots are not presented as discrete, impermeable categories. They may well be interrelated and combine in unpredictable ways. Second, the principle of attitude roots applies to everyone, but it is more dangerous in some communities than in others. For example, the scientific community may also suffer from motivated reasoning, but it has mechanisms for self-correction. Individual receivers of information do not have these mechanisms. Third, the principle of attitude roots and jiu jitsu persuasion may be applied outside the domain of science, but beliefs about science are the focus of this paper because it is such a high-stakes domain. Finally, skepticism about scientific consensus is high stakes and dangerous because data and scientific evidence represent the default path to progress.
__________________________________________
Related summaries:
Coutu, D. L. 2002. The anxiety of learning. Harvard Business Review (March): 100-107. (Summary).
David, A. 2015. Non-positional thinking. Presentation at the In2:In Thinking 2015 conference. (Note).
Dawkins, R. 2008. The God Delusion. A Mariner Book, Houghton Mifflin Company. (Summary).
Esquire. 2015. America: These are your choices. Esquire (December/January): 149-153, 160-161, 164, 168. (Summary - This is a summary of ten questions related to the most critical choices for America based on information from the Brookings Institution).
Gladwell, M. 2002. The Tipping Point: How Little Things Can Make a Big Difference. Back Bay Books. (Summary).
Handy, C. 2002. What's a business for? Harvard Business Review (December): 49-55. (Summary).
Kenrick, D. T., A. B. Cohen, S. L. Neuberg and R. B. Cialdini. 2018. The science of antiscience thinking. Scientific American (July): 36-41. (Summary).
Martin, J. R. Not dated. Religious Affiliations Surveys 1972-2018 (Summary).
Martin, J. R., W. K. Schelb, R. C. Snyder, and J. C. Sparling. 1992. Comparing the practices of U.S. and Japanese companies: The implications for management accounting. Journal of Cost Management (Spring): 6-14. (Summary).
Martin, M. and K. Augustine. 2015. The Myth of an Afterlife: The Case against Life After Death. Rowman & Littlefield Publishers. (Note and Contents).
McKeon, A. and G. Ranney. 2013. Ongoing Discussion "Thought Piece". Thinking about management from a climate change perspective. Presentation at Aerojet Rocketdyne's InThinking Network. (September): 1-29. (Note).
Paine, T. 2006. The Age of Reason. The Echo Library. (Rebuff of church dogma originally published in 1796. Summary).
Prothero, S. 2007. Religious Literacy: What Every American Needs to Know - And Doesn't. Harper San Francisco. (Summary).
Shermer, M. 2015. The Moral Arc: How Science and Reason Lead Humanity Toward Truth, Justice, and Freedom. Henry Holt and Co.
Shermer, M. and P. Linse. 2001. The Baloney Detection Kit. Altadena, CA: Millennium Press. Ten questions to ask when examining a claim: 1. How reliable is the source of the claim? 2. Does the source make similar claims? 3. Have the claims been verified by someone else? 4. Does this fit with the way the world works? 5. Has anyone tried to disprove the claim? 6. Where does the preponderance of evidence point? 7. Is the claimant playing by the rules of science? 8. Is the claimant providing positive evidence? 9. Does the new theory account for as many phenomena as the old theory? 10. Are personal beliefs driving the claim?