
How to Be a Better Critical Thinker


In this article I interview Kevin deLaplante, PhD, host of the Argument Ninja Podcast and founder of The Critical Thinker Academy.

Kevin worked for sixteen years as an academic philosopher, teaching courses in philosophy of science, logic, critical thinking and ethics. He has a double degree in philosophy and physics, and his mission is to combine the “light arts” of philosophy and physics with the “dark arts” of persuasion, influence and manipulation.

Let’s begin:

What is critical thinking?

Michael Frank: I like to start off with a clear definition of terms. What is critical thinking to you?

Kevin deLaplante: My approach to critical thinking focuses on the goals of critical thinking. So what are we aiming for? What’s the target?

I think there are three parts to that answer:

  1. You want your beliefs to be true rather than false
  2. You want your decisions to be rational or wise, rather than irrational or unwise
  3. You want to think for yourself. You want to be an independent, critical reasoner who can take responsibility for your beliefs, values and choices instead of just parroting the views of your family, your peers, your church or your community, so that when you assert “I believe X,” those are your reasons, you’ve made that choice, and you own that choice

So those are the goals.

In addition we need to understand the various psychological cognitive biases that we’re all prone to, and other facts about human nature that can interfere with our ability to think critically.

We also need to learn all of the ways that various social forces are trying to persuade and influence what you believe, what you value, and what you think, in ways that aren’t necessarily in your best interest. And then it’s part of the skill to be able to identify when that happens and to find some ways of protecting yourself from those influences.

Michael Frank: I think of critical thinking as a way to sort fact from fiction, truth from lies, reality from fantasy. It’s like an intellectual self-defense, or a firewall for your mind. It doesn’t stop the truth from getting in. It just stops BS, fake news and false information from getting in.

How to become a better critical thinker

Michael Frank: How does one become a better critical thinker?

Kevin deLaplante: Well, if you take a critical thinking course in a college classroom, the standard textbook approach is that you’ll learn a set of topics that are part of the history of argumentation theory:

  • How to use definitions well
  • What an argument is
  • Principles of argumentative reasoning
  • Basic principles of logic
  • Logical fallacies
  • Good reasoning versus bad reasoning

And all of that falls under the heading of how we ought to reason.

Unfortunately, in those classes you almost never learn anything about how we actually reason. You never learn about the psychology, the neuroscience, or the sociology of human belief, reasoning, and judgment, which frankly is a shocking omission, one that is only slowly being rectified now.

Critical thinking requires an understanding of the obstacles that get in the way of our ability to think, which means that you better pay attention to what we’ve learned about human nature as it pertains to how we reason. For example: You have to learn about cognitive biases, especially confirmation bias, and you have to learn about all of the ways in which we systematically mislead and deceive ourselves, and the ways that those biases make us vulnerable to exploitation and manipulation.

I think you should start from the very beginning studying both the “dark arts,” as I call them, and the “light arts.” The dark arts are the ways in which human beings have learned to persuade one another over the course of human history: everything from how marketing and sales work, to how political persuasion works, to how social media persuasion works and so on. The light arts are about how we can use reason to pursue the higher goals of discerning fact from fiction, discerning wise choices from unwise choices, pursuing virtue rather than vice, etc.

The problem is that in school, you might get a course in critical thinking and logic in a philosophy department, but you wouldn’t get the psychology, or you might get the psychology, but you wouldn’t get the logic and argumentation. Or you might learn something about reasoning with chance and uncertainty in the math or statistics department, but you wouldn’t learn it anywhere else.

So all of these topics are distributed across different departments, and students only ever get little fragments of them. I think you need to adopt a broad, comprehensive framework and commit to it as a goal.

And a lot of this process is going to involve you growing your self-awareness. Self-awareness is going to be fundamental to this process. You need to be aware of how you respond to certain situations, and why you’re attracted to this and not that. The more self-aware you become about how prone you are to these sorts of distortions in thinking, the more you are able to make changes that can compensate for them.

The illusion of explanatory depth

Michael Frank: You mentioned self-awareness, can you unpack that a little bit as to how it pertains to critical thinking?

Kevin deLaplante: Sure. Self-awareness is vital for understanding, for example, the fallibility of one’s own beliefs. We are prone to overestimate how much we know about any given topic. This is called the Knowledge Illusion, and it’s one of various forms of optimism bias.

For example: If you were to ask someone, “Do you understand the greenhouse effect?” they might say yes, and if you asked them to rate their understanding on a scale from 1 to 10 they might give you a high rating. But when you ask them to actually draw a little picture and explain the greenhouse effect, all of a sudden they’re stuck and can’t do it. They have only the vaguest, sketchiest idea, and they come to realize that they didn’t understand it as well as they initially thought they did.

This is an extremely common occurrence. It’s called the Illusion of Explanatory Depth in the cognitive science literature, and it’s true for all kinds of things: explaining how a zipper works, how a toilet works, how a bicycle works, as well as more complex things like explaining what an atom is, or the signs and symptoms of depression.

There are all kinds of common concepts like these that we routinely overestimate our knowledge of, and if you just prod someone to give you an explanation in even a little bit of depth, it often reveals just how little we really know.

How to improve your critical thinking skills

Michael Frank: In addition to having a basic understanding of cognitive biases and logical fallacies, what are some other ways that we can improve our critical thinking skills?

Kevin deLaplante: If you want to improve your ability to reason well, most of the time it involves changing your environment and surrounding yourself with different people that have different backgrounds.

If you had to rank environments from least to most conducive to good critical thinking, the least conducive is one where you’re an individual reasoning alone in your basement, maybe with the web and some books, instead of interacting with people who have different points of view. No matter how smart you are, that’s the worst-case scenario.

The next worst scenario is interacting with a group of very similar people, where your peer group is very homogeneous and you all share the same ideology and background assumptions, because then you’ll amplify whatever distortions are going on around you. And it’s even worse if it’s an ideologically polarized group, because then you have a sort of ideological tunnel vision and the whole group reinforces it.

So changing your environment and being exposed to people outside those groups, with a broader and different range of ideological diversity, can have a huge impact on the quality of your thinking, and it’s an important first step.

Now if you’re used to a kind of homogeneous environment where there’s a lot of agreement and consensus, then it’s usually not very comfortable when you move into an environment where it’s less so, but that’s one of the prices you pay for improving the quality of your judgment and your reasoning.

The secret sauce

Kevin deLaplante: One of the key mistakes that people make about human reasoning is thinking that the secret sauce is our unique individual brains. But that’s not the case at all. The real secret sauce is in the dialogue and the quality of the interactions you have with other people.

This is important because over time the group’s knowledge is much better than any individual’s knowledge, provided you allow for argumentative discourse between members of the group. We need to use each other to maximize our ability to think well. Think of it as a social activity that we do together: not an individual sport, but a team sport that you need training partners for.

This is the thing about tribalism: yes, tribalism has its problems, but the secret is to create tribes where rational discourse and disagreement are not only tolerated but encouraged.

We’ve got to create these social networks and social norms of communication with people where it’s okay to disagree, but they don’t just happen by themselves, they have to be developed and nurtured.

One of the reasons it’s important to reason with other people is confirmation bias: I’ll selectively seek out and remember information and facts that support my idea, whilst selectively ignoring or forgetting arguments and facts that go against it, so I’ll direct this one-sided stream of argumentation towards you.

You, on the other hand, as the recipient and target of my reasoning, are actually much better at identifying weaknesses in it than I am. And similarly, if you start trying to persuade me, you become weakest at identifying the problems in your own reasoning, while I become much better at identifying them.

So there’s an asymmetry in our ability to identify good versus bad reasoning between when we’re the one who’s giving it versus when we’re the recipient of it. And that means that communicative dialogue where we’re both trying to spot the weaknesses in each other’s views is critical for the whole system to improve over time.

In a nutshell: The real test of your knowledge and skill is when you exercise it in environments where you’re communicating with other people. It’s not just you thinking alone inside of your room, or inside of your own head.

Michael Frank: Agreed. I want to recap some of the points so far in this interview:

  • You need to learn not only critical thinking skills, cognitive biases, and logical fallacies, but also the ability to persuade people in different environments
  • Most people think they’re smarter than average
  • Most people think they know more than they do
  • Change your environment. Don’t reason alone. Get out of your bubble, get out of your echo chamber. Listen to and talk to people who think differently than you, who have different perspectives, and who ideally are smarter than you. Argumentation within the group, and exposure to different perspectives, is absolutely crucial

Critical thinking questions

Michael Frank: Let’s talk critical thinking questions. You asked some really good ones in episode 29 of your Argument Ninja Podcast:

“Does it matter to you whether or not what you believe is true?”

“Does it matter to you whether or not your choices are rational or irrational?”

“Does it matter to you whether or not your thoughts, beliefs and values are your own?”

“Does it matter to you whether or not you think for yourself?”

I think these are very good questions to determine, first of all, are you a truth seeker? Because if you’re not a truth seeker, then who cares about critical thinking? And I personally believe that most people aren’t truth seekers. I think most people are intellectually lazy and dishonest, and I think most people believe whatever they want to believe regardless of the evidence. I think most people don’t give a damn about truth.

However: If someone was a truth seeker, what are some good questions that we should be asking ourselves when we’re reading a book, listening to a podcast or watching the news?

Kevin deLaplante: Well although people commonly ask questions like:

“How do we evaluate the truthfulness or the accuracy of what’s being said?”

“How do we detect a bias in what’s being said?”

I think that these are the wrong questions to ask. And the reason I say that is because I don’t think that we as individuals are capable of answering these questions on our own, or at least our ability to evaluate information is very limited, simply because our own personal background and knowledge is limited.

The more we know about the subject matter the more we’re able to make critical judgments about the quality and the accuracy and the reliability of what’s being said. But to be a reliable judge of that really does require almost expert level knowledge. And we’re not experts at very much, in fact our background knowledge is really quite limited for the various reasons that we’ve talked about. We often think we know more than we do, and we lack the information to even recognize that we don’t know enough. So we’re prone to suffer from a kind of false confidence in our ability to critically evaluate the arguments and the information that we’re consuming.

So knowing that about ourselves, what are the questions that we should be asking ourselves?

I think the questions initially have to do with self-awareness:

“How do our group affiliations influence our psychological responses to information and our judgments about what sources are trustworthy or untrustworthy?”

So if I’m a liberal and I’m listening to a bunch of liberal bros talking politics on a podcast where it’s anti-Trump, I need to be aware of the fact that I’m going to have positive responses to a bunch of things they say, not because their arguments and reasons are necessarily persuasive, but because I’m psychologically programmed to respond positively to views that I identify with.

So having that level of self-awareness is the first stage, the next question to ask is:

“Is the source of the information coming from a standpoint which is deep within a kind of tribal bubble? Or is it coming from a standpoint where it’s actively engaging with more diverse sources and arguments?”

And that you should be able to assess.

The challenge is that everyone thinks that their viewpoints are reasoned and based on a weighing of all the evidence.

The truth is that it’s very hard to get outside of your bubble, and it requires a very strong level of mental and intellectual discipline to be honest and say: my judgments here are probably not reliable, and the fact that I agree with them so strongly might itself be evidence that they’re not reliable.

You know, we often have a warm feeling that comes when someone says exactly what we want to hear, but that warm feeling is bypassing our higher cognitive functions.

Usually when we engage our higher cognitive functions in critical evaluation it’s an uncomfortable feeling, a stressful feeling, there’s a hint of discomfort to it.

However if it’s a smooth, positive, warm experience, it’s usually a sign that you’re in the pocket of a set of views that have been channeled, and you’re only operating within this channel. You’re not exploring alternatives to it.

Michael Frank: So question the source:

Is it predominantly liberal or conservative?

How does it make you feel? Does it make you feel good?

Are you being told what you want to hear?

Or what you already believe?

If so, you’re probably operating in a bubble, an echo chamber.

Logic isn’t enough, you need persuasion skills

Michael Frank: I like the way you talk about the need for persuasion skills in addition to critical thinking skills. It often amazes me how many “smart people” tend to bash people with facts and figures as if that was enough to change the average person’s mind.

The truth is that when you attack someone with facts and figures, even if you’re 100% right, instead of changing their mind most people tend to double down on what they believe and dig their heels in. And so when you talk about persuasion techniques and the way that people are actually persuaded, as opposed to the way you’d like to imagine they’re persuaded, I think that’s extremely important.

Kevin deLaplante: I agree. It’s one of the blind spots in traditional critical thinking training and education actually.

If you think that all that matters is learning logical fallacies or the structure of a good argument, then you’re going to be in for a world of trouble when you go out into these different social environments and try to persuade people where the norms are different, because what you say in those environments is not going to have any impact, or it won’t have the impact you want.

You’ve got to be able to speak the language of the environment you’re in, otherwise you won’t be speaking persuasively. And what might be a persuasive argument in one social group or setting may not be a persuasive argument in another social group or setting.

What you’ll find when you move from one social circle to another is that there are different social rules for what counts as persuasive argumentation, and they differ from environment to environment, from the judicial court system, to how working scientists write a persuasive scientific paper, to the political arena.

Michael Frank: I couldn’t agree more. You make a damn good point. You need to understand these persuasion techniques and they do differ from person to person, environment to environment, from finance to legal to science to politics. Everyone has a different buying criteria whether it’s an individual or a group and there is no one size fits all.

How to approach sacred cows

Michael Frank: One thing that I don’t think is spoken about enough in critical thinking circles is Sacred Cows

Examples of Sacred Cows

  • Anything considered “politically incorrect”
  • Cultural traditions
  • LGBTI community
  • “Me too” – Tony Robbins recently found this out the hard way
  • Jesus, Buddha, Muhammad, and any other religious leaders or Gurus
  • Holy books – The Bible, Quran, Bhagavad Gita etc.
  • Religions – especially Islam
  • Race – especially in America

These things (and many others) seem to be off limits for many people, and they won’t allow any questions or scrutiny.

A lot of religious critical thinkers for example, are willing to think critically about everything except for their religion.

What are your thoughts on sacred cows?

Kevin deLaplante: I think the idea that you’re pointing to is that everyone has a set of beliefs which are close to their identity. It may not matter to you whether or not you’re wrong about other things, but when you get close to beliefs that are central to your core, then to be wrong about those things becomes a challenge to your conception of who you are as a person and who you want to be, and then you naturally want to defend against that.

So these beliefs are sacred in the sense that to challenge these core beliefs is to challenge your identity. I think that’s what this comes down to, and it’s a defensive response because it threatens the coherence and integrity of one’s internal psychology.

So in that sense it’s perfectly natural, perfectly human, and we’re all prone to it to some degree or other.

I think from a critical thinking standpoint, there’s one part that is just diagnostic: Do you know, when you’re engaging in a conversation with someone, that you’ve hit a sacred cow?

Obviously there are common responses that you can look for such as a kind of reflexive defensiveness, or an attempt to shut down the conversation once you’ve crossed a line, right?

Sacred cows are a barrier, no doubt, to critical thinking. So when you know that you’re touching on something that is close to someone’s identity, you need a different strategy to engage with them so as not to trigger defensive reactions.

For example, when you’re talking about someone’s life choices, or when the issue is really close to someone’s identity, it’s helpful not to talk about the rightness or wrongness of an issue as it pertains to them. Instead I recommend talking about it in terms of how you would tell an autobiographical story: “When this happened to me” or “When I heard about this for the first time, this is how I responded to it”. That way you’re talking about how you responded to it, rather than making a claim about how they ought to respond to it, and that’s a way to broach an issue without it triggering these kinds of defensive patterns.

But from a larger political standpoint this is harder because you’ve got all of these different polarized social identity groups.

One thing you have to realize is that the psychology is similar to the psychology of threat or the psychology of war, where they believe that they’re defending something, and the other group is a serious, almost existential threat to something that they care about. And they will act defensively. They’re not interested in having a rational conversation about this issue because that’s not how you respond to existential threats, right? Their goal is to overcome the threat, to resist the threat, not to understand it better. And if you don’t understand that psychology, you’re going to run into roadblocks all the time.

Michael Frank: Agreed. So if you do speak to someone about their sacred cow, you need the emotional intelligence to recognize when you’re doing so, and you also need to touch upon it gently and indirectly so they don’t feel threatened and under attack.

This interview has been edited and condensed for clarity.

Kevin de Laplante

Kevin deLaplante studied physics and philosophy as an undergraduate student, and earned a PhD in philosophy at Western University. He worked for sixteen years as an academic philosopher teaching courses in philosophy of science, logic, critical thinking and ethics. From 2008 to 2012 he served as Chair of the Department of Philosophy and Religious Studies at Iowa State University.

In 2015 Kevin gave up the security of a tenured academic position and a steady paycheck to move his family back to his hometown of Ottawa, Canada, to work full-time as an independent educator, creating resources that help people develop their critical thinking, decision-making and persuasive communication skills. Kevin is the host of the Argument Ninja Podcast and creator of The Critical Thinker Academy video tutorial site.


Copyright © 2019 lifelessons.co All Rights Reserved.