
New Tools for the Fallible Mind (Metacognition Part 5)

Having now explored a few of the mind's quirks, you might feel more confident in your ability to consume media without bias, or at least in your ability to correct for some key mental errors. If anything, however, the previous sections in this module should make us less confident in our critical thinking. First, we've only scratched the surface of what psychologists have discovered about mental shortcuts and biases. Second, even if it were possible to have a complete understanding of fallible mental processes, you would still lack access to most of your own ongoing mental operations (recall Type 1 thought). Finally, even if you could somehow gain full access to your thought processes (you can't!), you would still need to know what to do when your thinking turned out to be undermining your reasoning.

 

A good place to sit for now, then, is to adopt an orientation of intellectual humility and to recognize that effective self-reflection is an ongoing project that can never be complete—you'll always be missing a great deal of understanding and will always hold some inaccurate beliefs about your own thinking.

 

Next, we need to do more than cultivate a basic understanding of human mental life. We need ideas for how to take corrective action.

 

Research doesn't tend to support the idea that the mere knowledge of our cognitive tendencies helps us to reason better—some researchers suggest that learning about biases, by itself, is useless in this regard. Others are somewhat more optimistic, but most reasonable scholars don't believe that a basic education in cognitive biases is all it takes to surmount those biases. 

 

One issue is that education about biases is often domain-specific and doesn't generalize well to novel contexts. So, for example, a physician who learns how biases can affect diagnosis may not see how those same biases shape what they share about conspiracy theories on social media.

 

As we've seen in part 3, another problem is that we have a bias blind spot—a bias about bias: compared to our ability to judge biases in others (an ability that is itself extremely limited), seeing biases in ourselves is incredibly hard. As such, it can be difficult to know when biases are distorting one's thinking. In short, you can't stage a personal cognitive intervention when you have trouble telling whether there's even a problem.

The awareness of biases and other cognitive shortcomings is best thought of, then, as just a starting point, one that can be supplemented by adopting debiasing strategies and other tools for improving your thinking. Below are some approaches to try out. Some require a small shift in thinking. Others are larger personal interventions to boost judgment and decision making, going so far as to require deep changes to our social world (for readers interested in some of the research on debiasing strategies, have a look at this thesis, starting at p. 64).

 

As you peruse the strategies below, consider which ones may be feasible for you to use and which may be a bit of a stretch to implement.

Consider the opposite / consider an alternative

 

A consider the opposite strategy can help us to escape from biased views, misinterpretations, and incomplete perspectives. This approach is just what you might think—it involves considering a perspective that is at odds with, or just different from, your initial perception or belief (see brief intro video here). Depending on context, this could involve reflection on opposite perceptions, attitudes, beliefs, decisions, or outcomes. It can help us to question our views and decisions, arrive at more nuanced perspectives, and better understand our conversation partners and adversaries.

 

This can be extremely difficult when our views are wrapped up in strong attitudes and moral positions. For instance, if you've read a tweet that your gut tells you was intentionally hateful to a particular group of people, that position is hard to step out of and consider—particularly if you harbour a negative attitude toward the person who posted it. It could be even harder to temporarily adopt and consider the opposite of your judgment. However, doing that hard work could be what gives you the most accurate view of the instance at issue.

 

This is because your initial gut reaction might be incorrect. Our initial gut reactions—no matter how correct or moral they feel—are often wrong. For example, although you might have a gut response to what an author wrote, that intuition cannot tell you much about the intentions behind the writing. Your initial assessment could also very well neglect the context in which the tweet was written. Perhaps the apparently hateful remark you read was simply a reiteration of what someone else had said and appears only as part of a longer thread condemning that hateful position. What if it's your politics that are causing you to shine the most negative possible light on the remark? Maybe, alternatively, the author's communication was unclear, causing you to misunderstand the intended point. Could it also be possible that you're misinterpreting sarcasm or satire? It's never out of the question that you simply misread some aspect of the tweet or that you didn't have access to the larger context.

 

In instances like this, questioning your initial judgment could save you from arriving at hasty judgments, feeling unwarranted outrage, and making bad decisions. You could consider the opposite—that this person is not expressing hate, but actually opposing hate—and look for possible arguments for this alternative interpretation.

Considering alternatives is just as useful when your initial view turns out to be correct. Looking at things from an alternative viewpoint allows you to see your own ideas through a different lens. Perhaps you were largely correct, but more nuance could be injected to boost the precision of your position. Perhaps you were entirely correct but can now see that correctness more clearly because you’ve considered counterarguments.

 

It can also help you predict arguments that opposing parties might make against your position—thinking about other viewpoints will save you from being caught off guard with poor arguments or explanations for your own views. In other words, considering alternatives prepares you for counterarguments.

An additional application of consider an alternative bears mentioning. Once you have accepted a position on some issue, you may want to act on it. We are, in fact, often biased to see action as better than non-action. This is called action bias. As a result, we tend to jump to action rather than practice restraint.

It's easier for us to come up with reasons to act on what we believe. However, that in itself doesn't justify taking action. Our behaviours can have negative as well as positive consequences. As such, if you take a moment to consider the consequences of both (a) action and (b) non-action, you may sometimes find that non-action could produce better results (or save you and others from negative consequences).

 

For example, social media mobs are composed of many people taking small actions. These mobs have sometimes achieved positive change, but they've unquestionably also produced negative outcomes. Maybe your outrage about someone's tweet is totally justified. However, dozens or hundreds of other people may have already expressed similar outrage online. Is it better to contribute by piling on with the mob or by restraining your response (and possibly channelling your motivation to act in other ways)? This is a hard question—in any case, we're better off considering both action and non-action and their respective possible consequences to inform our decisions.

Delayed decision-making

 

Taking time before making a final decision allows for fuller and deeper reflection and openness to alternative explanations (i.e., it can be used in combination with consider the opposite). Some research has shown that delayed decision-making can increase diagnostic accuracy among clinicians in a psychological setting. Of course, clinical decision-making is a context in which decisions are of great importance, so convincing a clinician to take some time to breathe on an assessment may be relatively easy. By contrast, consuming media online is a relatively low-stakes activity in a context designed for speedy action. How can we increase the motivation to adopt delayed decision-making there too? How do we use a “slow” strategy in the very fast online world? How long a wait is long enough to help us make good decisions about the information we’ve gathered?

 

These are hard questions, but they're important. When we race to express outrage about a tweet that angers us without considering alternatives (see above), or without giving ourselves time to cool off, we're liable to make mistakes—whether that involves misreading another person's view or treating someone poorly because of the perspective they've articulated. Sometimes we're blocked from considering alternatives in the short term by our limited mental capacities, and so we must let time pass and allow alternate takes to gradually seep into our thinking.

I have found that explicitly delaying decisions is extremely useful. There have been times when I’ve reacted too fast and chosen the wrong action, but there have been other times when I’ve paused on a decision for a while before doing anything at all. This has saved me from bad decisions more than once—both major and minor.

 

For instance, in the past I sometimes felt compelled to engage after reading an outrage-inducing social media post. However, feeling dissatisfied with my capacity to articulate precisely what I wanted to say in the moment, I would write out a rough response in a word processor and then leave it for a bit. What I almost always found was that there was something more important at play than deciding what to say—the key thing for me was considering whether I should say anything at all. After leaving a response to sit for a few hours, responding often felt more trivial. At other times, it even felt like responding would have left me and others worse off.

Upon delaying a decision, you might also find that the initial reaction was the right one. Sometimes there's still time to respond, but the delay could also mean that the moment to respond has passed. It's up to the user of this tool to determine when it is good to delay decisions and when doing so could impede appropriate engagement. There are pros and cons to this approach in different situations and for different people.

 

Perspective taking

 

Some research suggests that perspective taking can help reduce stereotyped reactions and favouritism toward members of one’s ingroup. I think perspective taking can be useful more broadly to help us understand our own and others' thinking. It's a strategy for considering alternative views—by putting yourself in the shoes of the other, you may be better able to see why a person voicing a perspective different from your own might hold that view.

The psychologist Lisa Feldman Barrett offers this bit of advice in an article on getting more comfortable with social differences:

Spend five minutes a day deliberately considering the issue from the perspective of people you disagree with – not to argue with them in your head, but to understand how someone who’s just as smart as you can believe the opposite of what you do. I’m not asking you to change your mind, just to truly embody someone else’s point of view. If you can honestly say, “I absolutely disagree with that view, but I understand why people might believe it,” then you’re actively helping to create a less polarised world.

How else might we cultivate perspective taking? One of the reasons reading fiction can be so good for us is that it demands that we look at things from multiple perspectives, including those of people from other social groups, different parts of the world, and opposing political camps. We don’t have to—and in many cases shouldn’t—adopt others' views, but thinking about them helps us to understand what they're about and to question our own beliefs.

 

Active open-mindedness

 

The general idea of open-mindedness has garnered significantly more attention than the approaches discussed above. It's less a strategy and more of a broader disposition that harnesses several other mental tools (e.g., intellectual humility; a scientific approach). It's about using your capacity to integrate new information with what you already believe, adjusting your beliefs as the preponderance of evidence shifts (Sturm, 2018). Some people are predisposed to an open-minded orientation toward life, while others tend to be relatively closed off to new ideas and evidence. Everyone, however, can hone this disposition and find useful times to use it. It's just a question of time and effort.

 

You can find it discussed in this Medium article, which offers perspective on how to be “ready, willing, and able to change your mind about things by clearly defining what evidence would prove you wrong and actively looking for it” (Sturm, 2018). An excellent orientation, but not an easy one to shift to!

 

Harnessing the social world

Acquiring knowledge is a conversation, not a destination. It is a process, a journey—a journey we take together, not alone. Others are always involved.

– Jonathan Rauch, The Constitution of Knowledge: A Defense of Truth

The gaps and inconsistencies in our individual knowledge are partially ameliorated by the social distribution of knowledge and the exchange of information in communication.

– Oaksford & Chater, 2020, p. 309

None of us, thinking alone, is rational enough to consistently come to sound conclusions: rationality emerges from a community of reasoners who spot each other's fallacies.

– Steven Pinker, 2021, p. xvi

In Thinking in Bets: Making Smarter Decisions When You Don’t Have All the Facts, Annie Duke, a former professional poker player, reflects on her adoption of a social strategy that I think is awesome but that most of us would find hard to use. In short, while people on their own are biased and relatively ignorant of their own biases, having the right people around you to call you out on your mistakes can be an extremely effective intervention.

Ideally, this is one of the most important features of working on group projects for school or committees at work (though it doesn't always end up that way—particularly, perhaps, when we're overly concerned with our own and others' feelings). It's also why even the most brilliant of authors preface their books with long lists of people who've worked with them to develop, edit and critique earlier iterations of the work. 

 

In daily life, though, we tend not to wind up in these sorts of groups. Rather, our groups often consist of people who are like-minded (sharing political views or religious beliefs, for example) or who otherwise prefer not to step on each other's attitudes or beliefs.

 

However, if we want to—if we deem it important, for instance—we certainly can seek out other people to work with us on our reasoning and accuracy. Instead of, or in addition to, having a social group to support our beliefs, we can cultivate groups of critical thinkers to get us closer to an accurate view of the world where it counts. While some of us already have these people around us, many of us don’t and are not motivated or otherwise equipped to construct a social world that is critical of our thinking (it can be annoying!). That makes sense. It feels strange to engineer our social worlds in this way and it's not particularly fun to constantly worry about clear thinking when you're trying to enjoy time with friends.

 

Regardless of its difficulty or strangeness, you might be interested in thinking about harnessing the social world in this or other ways to boost your thinking. Check out Annie Duke's approach in this episode of Shane Parrish’s Knowledge Project Podcast at around the 25:40 mark.

Key Terms & Ideas

Active open-mindedness: Actively seeking to integrate new information with what you already believe, with ongoing movement of your thinking in the direction of the preponderance of evidence.

Consider the opposite/consider an alternative: Considering perspectives that are different from your initial view. Depending on context, this could involve reflection on opposite or diverse perceptions, attitudes, beliefs, decisions, or outcomes.

Delayed decision-making: Extending the amount of time taken before arriving at your final decision.

Perspective taking: Putting yourself in the position of the other and trying to understand their viewpoint or actions by considering their experiences, personality, and ideology. You may be better able to see how a person voicing a perspective different from your own could hold such a view.

Harnessing the social world: Though there are significant barriers to doing so, it's possible to structure part of your social world to function as a check on the accuracy of your views and the strength of your decisions. While this may not be feasible or desirable in everyday social contexts like friend groups, there could be huge benefits in contexts, such as study groups, in which you want to develop your thinking rather than only receive social support.

Applying It

1. Open and read the following article: "If Canada is serious about confronting systemic racism, we must abolish prisons"

2. What is the main point (overall conclusion), and what reasons (i.e., premises) do the authors offer in support of this conclusion? What is your view—is it the same as the view the authors provide, or does it differ slightly or drastically?

3. For each of the five tools discussed in this section of the module (e.g., consider the opposite; perspective taking), think of and describe at least one way you could use it to reflect on the article. For example, if you disagree with the authors, how could you use perspective taking to get a better idea of the authors' position and why they might hold it? Think deeply about this (and each of the five)—what can you find out about the authors; what do you know about the context in which the piece was written? There are many ways to use these five tools and they can be applied regardless of your level of agreement with the authors. Give it a try.

4. Which of the five tools do you think you'd benefit most from as you reflect on the article and your own thinking about the article? Why? Which tool seems least useful? Why?

Further Reading

How to Practice Active (not Passive) Open-Mindedness | Mike Sturm | Medium

Variation is the stuff of life. So why can it make us uncomfortable? | Lisa Feldman Barrett

 

Learning Check
