

Your Ideological Immune System (Metacognition Part 3)

It's important to consider that some of our more automatic thought processes, which we explored in Part 2 of this module, are set up to preserve our preexisting views about the world. For example, we often look for information that supports our beliefs and broader ideologies, and reject viewpoints that seem to conflict with what we believe to be true.

Scott Lilienfeld and his colleagues (2009) outline several ways the human mind is set up to preserve preexisting beliefs and ideological structures. They, along with Jay Snelson, group these mental processes together under the label ideological immune system. In short, our minds are equipped to fend off or shield us from information suggesting we're wrong. By analogy, contrary ideas are like infectious organisms out there in the world: just as the body's immune system fights off disease-causing agents, the mind takes protective measures against ideas that threaten our beliefs.

 

For instance, if you're someone who believes that childhood measles, mumps, and rubella (MMR) vaccines cause autism (it's important to note that they don't; see Hviid, Hansen, Frisch, & Melbye, 2019), your mind is equipped with several ways of shielding that belief from information that could undermine it.

These defences may offer some comfort, particularly if you're averse to uncertainty, but when they're working to preserve false beliefs, they present a significant barrier to getting closer to an accurate understanding of the world.

If you're hoping to get closer to the truth about a particular issue (like vaccine safety), understanding, identifying, and working to surmount these mental processes is an important step toward this goal. Below, you'll encounter some key features of our thinking to reflect on as you consume information and discuss important issues with the people around you.

 

Confirmation bias

 

When we're seeking to discover what's real about the world around us, understanding the role of confirmation bias in our thinking is essential. Confirmation bias is “the tendency to seek out evidence consistent with one’s views, and to ignore, dismiss, or selectively reinterpret evidence that contradicts them” (Lilienfeld et al., 2009, p. 391). Confirmation bias is often at play when you’re skimming headlines and you see one suggesting that, upon clicking the link, you’ll find content that conflicts with one of your core or strongly held beliefs. For instance, a person who believes MMR vaccines cause autism might come across an article stating that there's no scientific evidence of a link to autism (like the one linked here). Confirmation bias is active when you want to ignore the link. If you do happen to click through and read further, confirmation bias is active when you dismiss what you read as nonsense or twist it to be less in conflict with your belief.

 

Similarly, if you're a person who believes MMR vaccines are perfectly safe, you might at some point come across an article that challenges that view. Whether your view is correct or not, confirmation bias may still be at play, and usually we are not aware of it. It might sound like confirmation bias can be a good thing, allowing us to hold onto our true beliefs or beliefs we cherish. However, it's important to note that it isn't applied selectively to preserve high-quality or rational beliefs. It should concern us that we have a tendency to hold onto views solely because they're ours. Doing so can prevent important learning or even put us in harm's way.

Confirmation bias can have quite subtle effects that are not easy to identify through casual reflection on how we process information. For instance, you might decide to go on reading the article that conflicts with your belief, but get more adversarial or nit-picky than usual with its good arguments and evidence. Although this careful analysis could be part of good critical thinking, if it's practiced only, or to a greater extent, on arguments and evidence with which you disagree, there's a problem. A specific instance of apparently effective critical thinking may be couched in a larger pattern of bias, rendering one's information processing relatively ineffective overall. We want to strive toward effective critical thinking regardless of whether we agree or disagree with what we read, watch, and hear.

 

Importantly, none of this is deliberate—we're not thinking, "I better read this hyper-critically because it could be damaging to my beliefs about vaccines." Rather, we tend to go on believing that we're diligent truth seekers. It's valuable to question that assumption we have about ourselves. 

Desirability bias

Desirability bias is closely related to confirmation bias. In fact, many people just lump the two together. However, desirability bias is a little different. As it turns out, we don’t just seek out evidence that's in line with our actual views; we’re also biased toward finding support for how we want things to be. That’s desirability bias.

 

Sometimes what we currently believe to be true and what we want to be true are very much in line with each other. In such cases, it would be hard to disentangle confirmation and desirability bias. If you believe the Earth is flat and you’re a member of a social group for which this is a core belief—say, the Flat Earth Society—you’re also probably going to have a strong desire for this belief to be true and, as a result, you’ll likely seek evidence and interpret information in line with this prior and desired belief.

 

At other times, however, we may hold beliefs about the world that diverge from the way we want things to be. That's when desirability bias may move us in a different direction than confirmation bias would. Yup, our minds are complicated.

 

For instance, you may be a strong supporter of a political candidate you strongly believe will lose an upcoming election. Confirmation bias, if active, might cause you to seek evidence in support of your belief that your preferred candidate will lose. If so, you might, for example, better remember news stories that confirm this belief compared to news stories that do not. By contrast, if desirability bias is active, you might better remember news stories suggesting your preferred candidate will win, despite your actual belief that your preferred candidate will lose.

 

So, will one trump the other, or will their competing effects cancel each other out? Recent research by Tappin and colleagues (2017) suggests that desirability trumps confirmation bias, at least in the context of the lead-up to the 2016 U.S. presidential election. Unfortunately, that's only one study, so it's hard to say how these biases would play out in this or any other context. It's possible that emotional attachment to a given outcome will drive things. Perhaps desirability wins out in contexts where wanting something to be true is a more powerful driver than feeling like you're right. Alternatively, if you're someone who's strongly invested in getting things right, maybe confirmation bias will tend to carry the day or cancel out desirability effects. In any case, research looking at the effects of both confirmation and desirability bias is relatively recent, and more work needs to be done to better understand the independent effects of each.

 

What matters for our purposes is that what we currently believe to be true and what we want to be true may both bias our evidence-seeking cognition and behaviour. So, jointly reflecting on what you believe to be true and on your desire for something to be true is likely a fruitful route to gauging what might be biasing your evidence-seeking thought and behaviour.

 

Naïve realism

 

Naïve realism involves the view that the world truly is as we perceive it to be (Ross & Ward, 1996). This sense that “seeing is believing” can distort our thinking in several ways as we consume media. For instance, if you believe you tend to see the world as it truly is, you may be prone to seeing people who disseminate contrary positions as ignorant, irrational, or deceptive. That is, if you believe that you see the world accurately, you might think that those with contrary views must have things totally wrong (and are perhaps unintelligent) or are willfully misleading their audience (and are, therefore, perhaps evil!). As a result, you might dismiss any arguments for their position without further consideration, or dismiss anything they might say on other topics.

Believing that our raw perceptions reflect the true nature of the world can also lead us to view our personal experiences as evidence that is of greater value than the evidence that has accumulated in the scientific literature. For instance, naïve realism could be guiding your thinking if you argue that “the world just seems flat to me so it probably is flat—science is wrong!” or “it just doesn’t feel right to me that humans evolved from a common ancestor with chimps, so it can't be true!” This tendency can lead us to dismiss good evidence and sound arguments that do not align with our own perceptions, beliefs, attitudes, and past experiences.

Of course, sometimes you'll encounter such a wacky perspective that perhaps it should be dismissed without further consideration. But have you given much thought to when such dismissal is sensible and when it might be unwarranted? When might the world not be as we see it? Think politics, for example: to what extent do you trust that your perspective on a political or social issue accurately reflects something real? Where is that trust in your perspective coming from? Could it be that others who see things differently hold reasonable views as well?

 

Hot Cognition

 

It's impossible to disconnect emotions and motivation from our thinking, no matter how cool and calculated we think we're being. And you wouldn't want to! Without the capacity for emotion, much of our thinking, like decision making and moral judgment, just wouldn't work (see Descartes' Error by Antonio Damasio). Sometimes, though, our emotions and motives can lead us astray. When our thought is heavily influenced by feelings and desires, we can call this hot cognition. Imagine, for instance, that your favourite NBA team loses an important playoff game. If you're a big fan, emotions run high, and they can make you jump to unwarranted views about the causes of the loss (e.g., “the refs always have it out for us!”).

 

Similarly, many of us identify with, and feel strong emotional attachments to, political teams. We can be motivated to evaluate political arguments, debates, and election results with bias toward parties and candidates we're attached to or invested in. In the realm of sports fandom it's often all in fun, but elsewhere—like in politics or when we're drawing conclusions about whether to vaccinate our children—judgments and decisions based heavily or solely on feelings like anger and fear, or even admiration, can get us into serious trouble. 

In the domain of digital media, it's a good idea to consider whether you're experiencing a particular emotional state as you consume content. Could your feelings be altering your capacity for critical thinking in one way or another? How do you feel, for instance, when you're reading a political Facebook rant, whether it's supportive of or in competition with your views? Is your emotion (e.g., moral outrage about a word the author used, or a little joy if the author has expressed a view with which you agree) making it harder to objectively assess the arguments that are set forth? Is your emotional state motivating you to act in a way you might later regret? For example, do you feel compelled to immediately express indignation before fully processing the apparently unsavoury or false information a friend has shared online? Consider the following too:

  • How do you feel when you are faced with arguments counter to your core beliefs or ideas that contradict your own? Do you feel tense or angry? On the other hand, how do you feel when you are faced with beliefs or ideas that support your own? Do you feel good? Does it seem like this could be helping or hindering your capacity to think critically? What can you do to work around that issue?

  • How do you feel about those authors or speakers whose politics you find disagreeable or who are affiliated with a controversial organization? Does it seem like your feelings about the writer and their affiliations could be undermining your capacity to accurately assess things that they say? (see People & Context module)

Bias Blind Spot

As you scratch the surface of some biases and other mental processes that serve as barriers to critical thinking, there's good reason to question whether you're applying these ideas to yourself. You may have found yourself thinking that, although other people think and act according to biased mental processes, you're personally relatively free from bias. How wonderful if that were true! Unfortunately, your personal insight into your own mental life is also fallible. You harbour these biasing tendencies whether or not you believe it to be the case.

 

Here, a new concept could help. It's called the bias blind spot: for various reasons, while we can sometimes do a reasonable job of identifying biased tendencies in others, seeing biases in ourselves can be very hard (this is a bias about biases; take a minute to wrap your head around that!). This can be a huge barrier to effective reflection on one's thinking, even after attaining extensive understanding of human cognition. That is, people can fully understand some of the fallible mental processes we've explored above while utterly failing to see their relevance to their own reasoning.

Social Pressures

 

As social beings, we obviously don't think independently of what's going on around us. Thus, it's not just our own ideas that we're liable to protect: there are beliefs and systems of belief floating around in the social world that we may wish to preserve. Perhaps you want to align yourself with what seems to be a common belief among your peers or those you respect (or fear!) on social media. Perhaps you feel pressure from authority figures, like higher-ups at work, or from the norms you perceive in a social group, like your classmates. We can even end up working to sustain particular viewpoints found in our social world when we don't personally endorse them at all.

 

Are there pressures in your social world guiding what you read, watch, and hear online or your interpretation of the info you consume? In general, are the people, groups, and organizations with which you affiliate supportive of a particular ideology (e.g., activist friends or groups)? How do you feel about potentially discovering and sharing evidence that might rub your affiliates or group-mates the wrong way? Does it feel bad to hold beliefs against the grain of your affiliations? It might sometimes even feel wrong to privately debate those ideas. Does it feel like you hold a view that you might have to keep hidden? These are signs that social forces are affecting how you explore and think about information, and ultimately what you do with it.

Here are a few sources of social pressure to consider:

  • Authority figures (e.g., educators, professional or community organizations, certain family members and peers with higher social status) who are invested in a particular viewpoint. For example, the professor of one of your courses may have a political ideology that you, as a student, feel you must adopt, or at least not contradict in class or assignments, for the sake of your grade or approval (hopefully this isn't the case, but it's not uncommon!).
     

  • Group norms that prescribe a certain way of thinking or set of beliefs to hold. What groups are you a part of (activist, family, academic, etc.), and what are their ideological norms? For instance, do you fear reprisal from your social group or an external activist group if you express an opinion that contradicts the normal or expected view within the group?

  • Inconsistency. We don't often enjoy feeling like, or being perceived as though, we are being inconsistent with ourselves. Are other people aware of your previous commitment to an ideology or belief? Are you worried that someone will find out you've changed your mind? Would it feel bad if they knew you were exploring ideas or evidence that challenge your prior beliefs, or that you're in the process of potentially changing your mind?

It's important to note that the above sources of bias are a small sampling of the social and cognitive forces that can mess with our capacity to reason effectively. It's a foundation to get you started, but in no way a complete portrait of what we have to cope with and surmount to get closer to good critical thinking about the information we're processing.

Key Terms & Ideas

Bias blind spot: We tend to find it harder to identify biased judgments in ourselves than in other people.

 

Confirmation bias: “The tendency to seek out evidence consistent with one’s views, and to ignore, dismiss, or selectively reinterpret evidence that contradicts them” (Lilienfeld et al., 2009, p. 391).

 

Hot Cognition: Thought that is heavily influenced by feelings and desires.

Ideological immune system: In many ways, our minds are set up to preserve our preexisting beliefs and ideological structures. The ideological immune system is a practical label for the collection of mental processes that shield our beliefs from information that could undermine them.

Naïve realism: The view that the world truly is the way we perceive or believe it to be.

Social Pressures: Various aspects of the social world may guide what you read, watch, and hear online, as well as your interpretation of the info you consume. Think about social norms, authority figures, etc. Think about what might happen if you express competing views. Are you compelled to keep quiet or express the opposite of what you feel? Do you feel compelled by those around you to follow particular social media influencers or read certain material? When might this undermine competent reasoning?

Applying It

Briefly write down your beliefs about whether there is a link between childhood vaccines and autism. You might also consider the relative strength of your beliefs: does it feel like there's some uncertainty there, or does it feel firm and unshakeable?

Read the following very short summary of a recent research project investigating the link between MMR vaccines and autism: Measles, Mumps, Rubella Vaccination and Autism. How do the findings of this study relate to your belief? That is, do they align with or counter what you believed before you read it?

What kinds of things were going on mentally as you read? For instance, did reading the article make you feel vaguely positive or negative? Did you find yourself reading closely or quickening your pace as you read? Were you coming up with counterarguments or alternative explanations?

Now imagine that the article had reached the opposite conclusion. How would this have changed how you read it, what you thought, and how you felt? If possible, bring in confirmation bias to help you explain.

Further Reading

Social Cognition and Attitudes | Yanine D. Hess and Cynthia L. Pickett | Noba

This chapter from Noba will help you expand your thinking about cognitive processes like biases and mental shortcuts, beyond just those that preserve what we believe or what others in our social world believe to be true. There's a lot to explore here about your own mind.

“Reality” is constructed by your brain. Here’s what that means, and why it matters | Brian Resnick | Vox

This is a very cool article that may help you understand the importance of naïve realism via a look at optical illusions. If what we see directly in front of us with our own eyes doesn't always reflect reality, then what about the more abstract opinions about the world that vary from person to person? If we sometimes must question whether what we see in front of us is real, we certainly must question our more abstract beliefs.

Learning Check

© Darcy Dupuis 2024

Contact

To provide feedback or to learn about using Fallible Fox content for personal, educational or organizational purposes, contact Darcy at dupuisdarcy@gmail.com
