Beyond The Hype: How To Sharpen Your 'Bullshit' Radar

Cognitive scientist Shane Littrell explains why people who think they are immune to misinformation may actually be most susceptible to it.

By Mark Travers, Ph.D. | July 11, 2023

A new study published in Thinking & Reasoning has revealed that people who are overconfident in their ability to detect bullshit are often the ones who fall for it.

To understand the mechanics of why we sometimes believe ridiculous things, I spoke to the lead author of the study, Shane Littrell of the University of Miami. Here is our conversation.

Can you define what you mean by "pseudo-profound bullshit" and how it was measured in the study?

A lot of people in the general public have their own definitions of "bullshit," and those definitions can sometimes vary wildly. To be able to study any aspect of human behavior, you have to define it in a very specific way that allows you to measure it.

So, for our studies, we define bullshit (in general) as information constructed with a carefree indifference to conveying truth, accuracy, clarity, or meaning, and that is often used to impress, persuade, or otherwise manipulate the opinions and attitudes of others.

With this definition, we can examine different types of bullshit across different contexts while ensuring that they all share core characteristics that allow us to identify something as bullshit.

Pseudo-profound bullshit is a specific type of bullshit that sounds similar to the stuff you might encounter from New Age, inspirational self-help books and speakers. It's full of spiritual-sounding terminology (e.g., consciousness, awareness, self-transcendence) often mixed with just enough scientific buzzwords (e.g., quantum, potentiality, superposition) to make it sound more legitimate than it arguably is.

Pseudo-profound bullshit was constructed by a computer algorithm that combines random New Age buzzwords into sentences that are syntactically correct (i.e., all the nouns, verbs, etc. are in the right places) but semantically meaningless (i.e., the sentences aren't intended to make any sense). In this way, the pseudo-profound bullshit meets two criteria from our definition because it was created without an intention to convey meaning or truth.

So, two examples of what might be considered pseudo-profound bullshit are:

  1. Attention and intention are the mechanics of manifestation
  2. Perceptual reality transcends subtle truth

The key difference between these two quotes is that the second quote was randomly generated from a bullshit generator while the first one is an actual quote from Deepak Chopra.
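To make that template-filling idea concrete, here is a minimal, hypothetical sketch in Python of how such a generator could work: random buzzwords are slotted into grammatically valid sentence frames, so the output is syntactically correct but assembled with no intention of meaning anything. The word lists and templates below are invented for illustration; they are not the actual generator used to produce the study's statements.

```python
import random

# Hypothetical word lists and sentence frames, invented for illustration only.
NOUNS = ["consciousness", "awareness", "self-transcendence", "potentiality",
         "intention", "perception"]
VERBS = ["transforms", "transcends", "gives rise to", "is entangled with"]
ADJECTIVES = ["hidden", "unparalleled", "subtle", "infinite", "abstract"]

TEMPLATES = [
    "{adj1} {noun1} {verb} {adj2} {noun2}",
    "{noun1} and {noun2} are the mechanics of {noun3}",
]

def generate_statement() -> str:
    """Fill a random template with random buzzwords: syntactically valid,
    but constructed with no intention of conveying meaning."""
    sentence = random.choice(TEMPLATES).format(
        adj1=random.choice(ADJECTIVES),
        adj2=random.choice(ADJECTIVES),
        noun1=random.choice(NOUNS),
        noun2=random.choice(NOUNS),
        noun3=random.choice(NOUNS),
        verb=random.choice(VERBS),
    )
    return sentence[0].upper() + sentence[1:]

if __name__ == "__main__":
    for _ in range(3):
        print(generate_statement())
```

Run a few times, this produces statements in the flavor of "Subtle awareness transcends infinite potentiality," which is exactly the kind of item participants were asked to judge.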

For our study, we gave people a list of 10 pseudo-profound bullshit statements ("Hidden meaning transforms unparalleled abstract beauty") and 10 real inspirational quotes ("A river cuts through a rock, not because of its power but its persistence"), and asked them to rate them as either 'Profound' or 'Not Profound.' From these ratings we were able to calculate a score representing each person's ability to successfully discern bullshit from non-bullshit.
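The exact scoring procedure is detailed in the paper; as a simplified sketch, if the detection score were just the proportion of the 20 items a participant classified correctly (generated statements rated 'Not Profound,' genuine quotes rated 'Profound'), it could be computed like this:

```python
def detection_score(ratings, is_bullshit):
    """ratings: list of 'Profound' / 'Not Profound' judgments, one per statement.
    is_bullshit: list of booleans, True for the generated statements.
    Returns the proportion of items classified correctly (an illustrative
    scoring rule, not necessarily the exact metric used in the study)."""
    correct = sum(
        (rating == "Not Profound") == bs   # reject bullshit, accept real quotes
        for rating, bs in zip(ratings, is_bullshit)
    )
    return correct / len(ratings)

# Example: a participant who correctly rejects 7 of the 10 bullshit statements
# and rates 8 of the 10 genuine quotes as profound scores (7 + 8) / 20 = 0.75.
```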

What factors did you consider when gauging participants' confidence in their bullshit detection abilities?

After participants had completed the bullshit detection task, we asked them three questions:

  1. How many do you think you got right?
  2. On average, what percentage of items do you think everyone else got right?
  3. On a scale of 0 (much worse than everyone else) to 100 (much better than everyone else), how would you rate your ability to detect bullshit in your everyday life?

From people's answers to these questions, we were able to calculate their overconfidence in their bullshit detection abilities (called 'overestimation'), as well as a measure of how much better they think they are at detecting bullshit than everyone else (called the 'better-than-average bias,' or 'overplacement'). We also compared this to their ratings of their overall bullshit detection ability.
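In rough terms, and with the caveat that the exact formulas are in the paper, 'overestimation' compares a person's predicted accuracy with their actual accuracy, while 'overplacement' compares how well they think they did with how well they think everyone else did. A hypothetical sketch:

```python
def overestimation(predicted_correct, actual_correct, n_items=20):
    """Predicted minus actual accuracy (question 1 vs. task performance),
    as proportions of the 20 items. Positive values indicate overconfidence.
    Illustrative formula, not necessarily the paper's exact computation."""
    return predicted_correct / n_items - actual_correct / n_items

def overplacement(own_predicted_pct, others_predicted_pct):
    """Better-than-average bias: one's predicted accuracy minus the accuracy
    attributed to the average person (question 1 as a percentage vs.
    question 2). Positive values mean 'I'm better than everyone else.'"""
    return own_predicted_pct - others_predicted_pct
```

Question 3, the 0-to-100 self-rating of everyday bullshit detection, is the separate overall self-assessment mentioned above.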

Can you elaborate on the concepts of "intuitive" and "reflective" thinking processes that people use when evaluating misleading information?

Cognitive scientists have identified two ways that people think about and process information:

  1. Intuitive thinking (sometimes called Type I thinking) is a very fast and automatic type of thinking. It's the type of thinking we use when we have to act fast, and it's generally what we mean when we say we're 'going with our gut.'
  2. Reflective thinking (Type II) is relatively slower, more methodical, and more analytic. It's the type of thinking we use when we're solving tough problems, imagining hypothetical situations, or thinking very deeply about an issue.

One type of thinking is not necessarily 'better' than the other because they are each useful in different contexts. It's when we use the wrong type of thinking for a particular situation that we can get into trouble.

Early research in this area hypothesized that people were more likely to fall for misinformation like bullshit and fake news mainly because they used intuitive thinking when evaluating the misinformation when they should've used reflective thinking instead.

Our results from Study 2 of this paper showed that this isn't necessarily the case. Some people actually do engage in reflective thinking when evaluating misinformation but then commit various reasoning errors when doing so.

In essence, while some people might fall for (or reject) misinformation immediately using intuitive thinking, others might inadvertently 'talk themselves into' believing it by using a type of error-prone reflective thinking (while others are able to reject it using error-free reflection).

How does your study's finding – that people with the lowest and highest bullshit detection abilities misjudge their capabilities – contribute to our understanding of misinformation susceptibility?

Our main finding was that the people who are the most susceptible to falling for bullshit are not only very overconfident in their ability to detect it, but they also think that they're better at detecting it than the average person.

This applied whether they evaluated the bullshit quickly/intuitively or spent more time reflecting on it. This is kind of a double whammy in terms of bullshit susceptibility that we call the bullshit blind spot.

The opposite side of that coin is that the people who are best at detecting bullshit are actually underconfident in their detection skills and think they're worse at detecting it than the average person (i.e., they have a bullshit blindsight).

These findings shed more light on the cognitive mechanisms that cause us to fall for misinformation.

We published a previous study showing that overconfidence (including narcissism) can disrupt or otherwise prevent people from engaging in reflective thinking when they really need to or from realizing that they're not as good at it as they think they are.

These new findings suggest that this partly explains why some people are more prone to falling for misinformation. That is, the overconfidence they have in their ability to spot misinformation actually makes them even more likely to fall for it because it prevents them from engaging in the type of thinking they'd need to do to spot it in the first place.

Given these findings, what strategies or interventions do you suggest to improve people's ability to detect and resist misleading information?

My main interest is figuring out why reasonable people believe dumb things, which happens to all of us more times in our lives than we'd like to admit. And, it might go without saying, but it's objectively worse to experience a bullshit blind spot than to experience bullshit blindsight.

That is, a person who's not only bad at spotting bullshit but thinks they're awesome at it will likely be worse off in a lot of ways than a person who is good at spotting bullshit but underconfident at it.

I think one of the more effective things that people could do right now to improve their bullshit detection ability is to practice intellectual humility (i.e., accept the fact that you could be wrong) and make a habit of 'pumping your mental brakes' when you encounter new information.

I often encourage people to practice what I like to call 'productive doubt,' which is a type of skeptical mindset that leads you to check sources, do some factual digging, and verify the veracity of claims rather than just blindly accepting them. This is tough for most people, because we all like to believe that we're smart, that we're in complete control of what we think and believe, and that we aren't easily fooled. Unfortunately, and as our findings suggest, the people who are the most easily fooled are the ones who believe they aren't easily fooled.

This is important to keep in mind when you're on social media, and especially when the consequences for believing a claim would directly impact your health (or your family's), your money, or your vote.

These are some of the most important and impactful areas of our lives, which makes them prime targets for bullshitters and grifters. When I encounter claims that seem a bit fishy to me, I kind of jokingly like to imagine that John Cena meme in the back of my head saying, "Are you sure about that?"