

The Limits of Your Understanding (Metacognition Part 1)

When you encounter new information, what kinds of things can you do to determine whether you should, at least tentatively, accept the validity of the information and update your beliefs? 

There's a lot to think about in answering this question. The thing is, we don't usually bother reflecting self-critically on what we're mentally doing when we encounter new information and whether it differs from what we ought to be doing. 

 

A sensible place to begin is to consider your prior understanding about the subject, how confident you should be in that understanding, and what might be missing from your understanding.

Why? When done effectively, assessing your prior understanding can prevent overconfidence, facilitate a more careful reading of the information, and perhaps tell you when you need to fill gaps in your knowledge with further research. It can allow you to more effectively consider claims with which you initially disagree. Take, for example, the role of genetics in making people who they are (e.g., intelligence, personality, & life success). Perhaps you believe that genes have little impact on success or intelligence and you feel pretty confident in that view. When you come across a bit of media about behavioural genetics suggesting genes have quite a strong impact—an article, for example—you may think you're already fully informed and that it won't be of much value to you.

 

But hang on a minute. If you take the extra step to think—to really think—about what you know (and don't know) about genetics and how you know what you suppose you know, you may find that there are significant gaps in your understanding or that the reasons behind that understanding are tenuous. Perhaps your "knowledge" is little more than an opinion you've formed from anecdotes and experiences that are largely untethered from the science. Maybe it's conventional wisdom that's been passed around in one of your social groups with little basis in solid evidence. That is, you may not have good reasons for your beliefs—you've just passively received those beliefs from people around you.

If you're being honest with yourself, you might even find that you know almost nothing at all about the subject or you have no idea where your loose opinions come from. After some reflection, it may become apparent that you could learn quite a lot from spending a bit of time with a high quality source or two.

The genetics-and-success example is extraordinarily complex, and I suspect you already know that you don't know much about that topic. The extent of our impoverished understanding of the world may be better illustrated by something more basic. For some reason, it's become conventional wisdom that, when you flush a toilet in the Southern Hemisphere, the water will rotate in the opposite direction from its rotation in the North. I first encountered this as a kid in the Simpsons episode in which the family visits Australia, and it stuck in my brain as a fun little fact to share. This "fact" also shows up as a plot device in other media. The problem is, it's not true. Our location on Earth does not determine how our toilet water flows! We think we know, but we don't.

 

Ask yourself where your factual knowledge comes from. You may find that it's based on nothing more than a joke from a TV show.

 

Of course, the same goes for other, far more complex topics—the ones we fight about online and at the Thanksgiving dinner table. How confident should you really be in your opinions on politics or vaccine efficacy, for example? Even simple received wisdom—that toilet water rotates differently in the Southern Hemisphere, that we use only 10% of our brains (we don't), that humans have exactly five senses (we have more)—often turns out to be false. So, how confident should any of us really be in our views about complex social and scientific issues? Probably not very.

As we'll find throughout this chapter, a big problem remains: our intuitive understanding of the extent of our knowledge can be wrong. To become effective at assessing our prior understanding, we need to go beyond intuition. Discovering some new mental tools can help us get there. Below, we'll explore a set of useful concepts that can help us reflect on the state of our understanding. These are tools you can add to the metacognition compartment of your critical thinking toolbox.

 

As you encounter some of these mental tools, it may be a good idea to maintain a running context in which you can try some of them out. You might consider, for instance, a piece of media like a podcast episode, article, or blog post. You can use the example article on police training linked here (it's also the focus of the "Applying It" section below), or find your own target article on a topic that interests you.

Things are often less certain than we think

To set the stage, most of our beliefs and most claims we encounter, whether in the media or during our education, should not be taken as absolute truth (no matter how appealing it is to feel certain). Experts in particular fields of science—biologists or psychologists, for example—must maintain the view that there's room for error and uncertainty in what they believe to be true. In other words, they need to acknowledge that they might be wrong.

 

Why? Well, if a scientist's beliefs are too entrenched, there's little chance that their thinking will be altered by good evidence. Thus, it will be hard for them to move closer to beliefs that accurately capture the world around them. The same goes for the general population—if we want to learn about the world around us, our beliefs should be moved by good evidence. Accepting that much of what we believe doesn't perfectly reflect reality will equip us to go where the evidence takes us.

Keep in mind, too, that you're generally less knowledgeable than the experts (e.g., the scientists in a particular field). If the experts are uncertain or if there's considerable disagreement among those experts, it's probably really important to question your own certainty. 

 

Let's add some complexity. It may seem paradoxical, but this acknowledgement of uncertainty shouldn't prevent us from tentatively accepting claims when it's reasonable to do so. This is about adopting the principle of fallibilism, which is the view that we often "have to draw conclusions, but that we may regard none of our conclusions as being beyond any further scrutiny or change" (Rauch, 2013, p. 45).

It may be stating things a little too strongly to say all conclusions should be questioned—there are some things that we can accept without any further consideration. Some obvious examples: squares always have four sides; vision is one of our senses; 2 + 2 = 4; the brain includes a band of nerve fibres (which we call the corpus callosum) that connects the two hemispheres of the brain. We don't need to waste our time and effort questioning these sorts of facts.

 

Likewise, we can also sit comfortably with many long-established scientific conclusions without bothering to continuously critique them. Stephen Jay Gould suggests that what we consider to be a scientific fact is something that's "confirmed to such a degree that it would be perverse to withhold provisional assent." Perhaps it's not 100% confirmed (i.e., it's not perfectly certain), but it's close enough that we can make better use of our time by moving forward and questioning other things. Fallibilism may still apply (i.e., we'll leave a tiny bit of room for doubt), but we don't want to make it our life's work to question every single little thing—when science has moved on, generally speaking, so should we. The truly difficult thing for members of the general public, though, is to gauge how confident the experts are about a particular finding or theory.

Fallibilism is therefore a mental tool for you to pick up but, like any other tool, it comes with caveats and limitations. It's a helpful thing to keep in mind to reduce a tendency toward unwarranted certainty. Like a hammer or a butter knife, it's a useful tool but not meant to be used on everything and should not be overused.

The key message here is that it's helpful for scientists, and anyone else who's trying to reason about the world, to be relatively comfortable with uncertainty. Based on the above, there are at least three good reasons for adopting an orientation of uncertainty. First, for most topics, there are always gaps or errors in our own personal understanding, and the reasons backing up our views are often tenuous (e.g., received wisdom, overweighting personal experiences). Second, given the limitations of our means of knowledge production, absolute truth often cannot be achieved by us or by anyone else. Third, some things are so complex, unobservable, or subjective that we can never know what's "true" (e.g., pizza with pineapple is not "good"—it's good to me; you may believe in a higher power, but we do not know whether your belief is right or wrong).

Uncertainty is difficult—being okay with feeling uncertain can be hard for scientists and students alike. For example, do you dislike it when a movie ends with key plot points intentionally left unresolved? Do you prefer instead to know what happens to the characters you've invested in for the past two hours? Similarly, do you prefer song lyrics and art for which the meaning is obvious? There's nothing wrong with any of this (and you probably don't have much control over such preferences). In these contexts, it's mostly just entertainment, and we all have our own tastes and proclivities. Some of us simply prefer clear answers in the media we consume.

 

However, problems arise when that inclination to prefer certainty moves beyond entertainment and into contexts where we're trying to find out what's real about the world. Discomfort with uncertainty can firm up the sense that our beliefs perfectly reflect reality and that the information we encounter is either entirely true or entirely false.

 

That can be a significant hurdle to critical thinking, and not everyone will have to deal with it in equal measure—as with any personality trait, we vary in the degree to which we're okay with ambiguity.

 

So, it's time to reflect: how comfortable with uncertainty are you? Think about it! Working on that self-awareness will give you a sense of how much effort you'll have to put in if you hope to make good use of the principle of fallibilism discussed above.

What do you know?

Whether you're reading an article or sitting in on a lecture, where do you stand in terms of your topic-relevant knowledge? It's okay if you feel like you know little to nothing—for most subjects, that's the case for most people! What's important is that you're honest with yourself—you want to sit with an accurate view of your background knowledge and understanding.

 

Here are some questions to keep in mind as we encounter new information:

  • Do you have prior knowledge that can help you understand the information being conveyed? What's missing—is a lack of knowledge limiting an accurate and complete understanding? For instance, you may not be familiar with terms the author is using or the frameworks and procedures commonly used by experts in the field of study. To what extent might gaps in your understanding affect your ability to evaluate the information?

  • Might you have a false sense of understanding that interferes with an accurate and complete evaluation of the claim? You may, for instance, have unwittingly relied on a poor information source (e.g., friends' social media posts; untrustworthy podcast guests). Alternatively, you might overestimate your proficiency due to good but limited experience with a topic. For example, having taken an undergraduate course in psychology doesn't make you an expert in psychology, but it can sometimes make you feel like one. Even reading a book or listening to a lengthy podcast episode can sometimes lend a false sense of understanding.

 

Intellectual humility

Time for another new concept: a disposition of intellectual humility can be of great value as we encounter new information and try to answer the questions above about our understanding. According to Whitcomb and colleagues (2015), intellectual humility "consists in proper attentiveness to, and owning of, one’s intellectual limitations."

 

It's an ideal toward which to strive as we encounter new information. Why? Because it can help us get a more accurate read on how confident we ought to be in our beliefs and where there are gaps in our knowledge. It allows us to see when we are not experts and when we should open our minds to the perspectives of those who have a better understanding.

For more, check out an extended look at intellectual humility in this Vox article. Porter and colleagues (2022) have also written an excellent, albeit more academic, summary of the current state of research on the topic.

 

It's important to note that acknowledging your intellectual limitations is not simply about considering how smart or knowledgeable you are in a general sense—that's an extremely difficult task. What's more valuable is carrying with you the awareness that, in most areas, your knowledge is seriously limited; you may be smart, but you're probably not an expert on the subject. You may even be an expert, but there always remain gaps in your understanding. That's okay. You'll learn more if you have a good idea about where your limitations lie.

Unfortunately, it's not always so easy to accurately gauge our intellectual limitations. One thing to be wary of is that, in domains in which we know little or are poor performers, we're often unaware of just how little we know or how poorly we might perform.

 

Take, for example, astrophysics. You may be well aware that you know very little about this field of study. Think more closely, though—do you even know what it means to be knowledgeable in astrophysics? Would you know an expert in astrophysics if you saw one? What kinds of courses or readings would you need for even a basic understanding of the field? You probably have no idea! Having little clue what a proper education in astrophysics would look like means that you don't even have a reference point to gauge how you stack up to others in terms of your knowledge. All you know is that you know nothing.

 

But you're aware that you're not particularly knowledgeable. That's a good place to start.

 

We're less likely to have problems seeing our lack of competence in astrophysics than we are in the realms of politics, psychology, or vaccine science. In areas that we encounter on a regular basis, we might think we're fairly knowledgeable when we're not. That's where we get ourselves into trouble. For example, because we often read and talk about politics, many of us are quite confident in our political beliefs when, in fact, we know very little. Similarly, because we deal with human beings and their relationships every day, we may feel like experts in psychology. Nevertheless, if quizzed on our basic understanding, most of us would be found to know very little. That's a false sense of proficiency.

What's important here is that deficiencies in a particular domain result not only in mistakes but also in the inability to recognize that we are even making mistakes. If we lack expertise, we also often lack awareness of what knowledge and abilities would be required for expertise.

Moving forward, as a starting place, carry with you the ideas of fallibilism, (dis)comfort with uncertainty, and intellectual humility. As you encounter new information, continually ask yourself questions about the extent of your understanding and your confidence in that understanding.

 

There's a lot more to effective metacognition than reflecting on your relative knowledge or competence. The next thing we want to do is think about key features of how humans think and some important limitations of our thinking (Metacognition parts 2 & 3). We'll then check out some mental tools we might use to try to correct for fallible mental processes, and make our engagement with digital media more fruitful (Metacognition parts 4 & 5).

Key Terms & Ideas

Comfort with uncertainty: Few of us are completely fine with not knowing what's true or what's going to happen. Some of us are better equipped for dealing with ambiguity than others. When it comes to assessing the state of your knowledge on a topic, adjusting in the direction of being okay with a lack of knowledge and the possibility of being wrong is key. It's hard though!

Fallibilism: The view that we often "have to draw conclusions, but that we may regard none of our conclusions as being beyond any further scrutiny or change" (Rauch, 2013, p. 45).

Intellectual Humility: "Proper attentiveness to, and owning of, one’s intellectual limitations" (Whitcomb et al., 2015).

Applying It

Check out the linked article, "'Implicit Bias' Trainings Don't Actually Change Police Behavior."

You don't have to read the whole thing right now, but it's an interesting and important topic to spend some time with. As you read the title and the bit of summary text below it, you'll encounter the general focus, some specific concepts, and a claim.

Reflect on the following:

1. What prior knowledge do you have that helps you understand the claim? In what ways might your lack of knowledge limit an accurate and complete understanding of the claim? In other words, what knowledge are you missing? For instance, relative to true experts, to what extent do you understand "implicit bias" and "anti-racism courses"? Do you have an idea of what it would take to assess the effectiveness of training in these domains?

 

2. Consider whether you might be overestimating your prior knowledge—perhaps because you have some personal, work, or academic experience in a relevant domain. Keep in mind that this article deals with a difficult area of study that even researchers are uncertain about and have been vigorously debating. If that's not something you're aware of, you're missing critical knowledge!

3. How might adopting a stance of intellectual humility help you to understand, learn from, and ask good questions about this article?

Additional Resources

Quiz: How well do you know your own reasoning strengths and weaknesses? | Clearer Thinking

This is a nice little quiz to self-assess how you see your reasoning ability compared to others (e.g., how gullible you are relative to other people; how good a critical thinker you believe yourself to be relative to others). It will give you a series of figures illustrating your reported self-confidence in general reasoning relative to others. What sticks out to you about your own and other people's confidence? What makes you think you're better or worse at these things than others?

Article: Intellectual humility: The importance of knowing you might be wrong | Brian Resnick | Vox

Video: The joy of being wrong (on intellectual humility) | Freethink

Podcast: Intellectual humility (a 6-part series) | Philosophy Talk

Learning Check
