
Can this article change your mind about how minds change?




Whether you know it or not, you have a scale of believability for conspiracy theories in your mind.


You might think you don’t believe in any conspiracy theories at all, but if you were asked whether you’re 100% certain the heads of the seven major tobacco companies weren’t colluding when they swore under oath that nicotine isn’t addictive, you might admit that perhaps there is some room for doubt. Not least because we have definitive proof of actual conspiracies occurring, such as Watergate, MKUltra, and COINTELPRO.


Or perhaps you’re a committed conspiracy truther? Hopefully, though, you’d concede that maybe not all political leaders are lizard people, and that birds are real.

The point is not to exalt pedantry, but rather to recognize something important that might help us avoid a cultural apocalypse:


There’s not much we can know for certain, and that’s a very good thing. 


Understanding that not a single one of us has special access to absolute truth provides us with a vast common ground of doubt. This common ground exists in optimistic defiance of the increasingly polarized epistemic crisis we find ourselves in, which is marked by a sense of moral righteousness and intolerance on all sides.


Not only that, but this understanding also gives us a much more useful framework. When we recognize that we don’t know anything for certain, we are forced to think in terms of probabilities rather than true-or-false binaries. Far from non-committal nihilism, probabilistic thinking is characterized by degrees of confidence. 
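

To make “degrees of confidence” a little more concrete, here is a minimal sketch of one way the idea can be formalized: using Bayes’ rule to update a probability as new evidence comes in. This is purely an illustration rather than anything prescribed in this article, and the claim, evidence, and numbers in it are entirely hypothetical.

```python
def update_confidence(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: revise confidence in a claim after seeing new evidence.

    prior: current degree of confidence that the claim is true (0..1)
    p_evidence_if_true: how likely this evidence would be if the claim were true
    p_evidence_if_false: how likely this evidence would be if the claim were false
    """
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator


# Hypothetical example: start out 70% confident in a claim, then encounter
# evidence that is twice as likely to appear if the claim is false.
confidence = 0.70
confidence = update_confidence(confidence, p_evidence_if_true=0.3, p_evidence_if_false=0.6)
print(f"Updated confidence: {confidence:.0%}")  # ~54%: still leaning true, but with more doubt
```

Notice that the confidence shifts by degrees rather than flipping from “true” to “false”; that gradual shifting, repeated as evidence accumulates, is what probabilistic thinking looks like in practice.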


Of course, no one thinks of themselves as thinking in rigid binaries, but we naturally slide into that way of thinking when we feel strongly about something, because we can’t help but be biased by that feeling. When we experience something as a moral imperative, we are predisposed to various forms of motivated reasoning and bias that serve to reinforce our beliefs and discredit those with whom we disagree. We might be able to control for at least some of our biases using reason and rationality, but it’s a process of offsetting rather than nullification. Ultimately, it doesn’t matter whether our beliefs are justified or not; our brains simply work this way regardless.


This is not to say that we shouldn’t view anything as a moral imperative, but when we do, we also have an imperative to be especially skeptical of our own motivations biasing our thoughts and behavior.


However, something quite paradoxical occurs when we admit our shared uncertainty and try in earnest to think probabilistically: it is far easier to arrive at something closer to consensus. Not the blind, tribalistic consensus of conformity, but rather a tentative consensus of shared doubt tempered by reason. 


In such a good-faith crucible of truth-seeking endeavor, we can and ought to still disagree, but the intent and tenor are transformed entirely; and so too the consequences. When we’re willing to demonstrate our own uncertainty and openness to reason, it provides a compelling social cue for reciprocation. Intellectual humility isn’t just good for us, it also helps foster good-faith engagement by others.


I founded The School of Thought with the mission of teaching the next generation how to think critically, so that we might have some hope of not regressing back into the darkness that Carl Sagan warned us against:


“I have a foreboding of an America … with our critical faculties in decline, unable to distinguish between what feels good and what's true, we slide, almost without noticing, back into superstition and darkness.”


Sagan was right about a lot of things, it seems, but he was wrong that we wouldn’t notice it happening. Like a lot of other people, we noticed quite a while ago that things were falling apart and becoming more polarized and irrational, and every subsequent year has only served to affirm our fears in an alarmingly clear way (for a lucid and even-handed analysis of just how things are falling apart, see Jon Ronson’s excellent documentary series Things Fell Apart).


I was wrong too, because I no longer think teaching the next generation how to think independently is going to be effective enough to help us. I still believe that teaching philosophy, critical thinking, and in particular the philosophy of science (how and why a scientific way of thinking works) is critically important; it’s just not going to be nearly enough, soon enough.


More importantly, what I’ve come to understand over the past 12 years of working in this space is that human beings are social and emotional creatures first, and rational creatures very much second. The sequence is important. Any strategy that fails to understand and incorporate this reality is bound to either fail or at least be profoundly sub-optimal.


This might seem like an additional challenge, but I think it actually presents more of an opportunity than would otherwise be the case. We don’t have to teach all 100+ known cognitive biases, or play whack-a-mole with every irrational belief; instead, all we really need to do is change people’s mindset. Once we have a truth-seeking, rationally motivated mindset, learning the tools and techniques of critical thinking is a natural consequence.


Changing people’s minds and mindsets might seem unrealistic at this point in history, but there’s an emerging wealth of research showing that the vast majority of people are far more willing to update their beliefs and change their minds if the conditions are conducive to it, even about subjects where we’d expect them to have strong and intractable moral convictions, such as abortion and marriage equality.


The important part to understand is “if the conditions are conducive to it.”


Our colleague David McRaney wrote a book called How Minds Change, which engagingly presents the wildly optimistic science of how people change their minds.

If you’re at all interested in understanding the mess we find ourselves in, and how we might get out of it, I strongly recommend reading it in its entirety, but here are some of the key takeaways:


  • Listen, and build rapport through common values. When people feel like their point of view has been understood, they’re much more receptive to alternative, and even contradictory, points of view.

  • Introduce a scale. We tend to think in binaries such as true/false, but when you ask people to consider a percentage probability, or a scale between the extremes on either side of an argument, it prompts them to think in a more rational and malleable way.

  • Revisit belief formation. When we’re asked to think about when and how a particular belief was formed, it allows us to re-examine and reconsider its validity.

  • Start with the heart. We are emotional creatures first, and our emotions drive our reasoning, not the other way around. If you understand the affective drivers of someone’s beliefs and demonstrate genuine sympathy for their perspective, they will be much more likely to listen to a reasoned argument.

  • Social connection fosters cognitive connection. We are deeply motivated to see things the way people on our side see them. An adversarial approach that makes someone feel like you’re not on their side is likely to be counterproductive. This doesn’t mean you can’t question someone’s ideas; they just need to feel like you’re doing so shoulder-to-shoulder rather than nose-to-nose. For example, rather than poking holes in someone’s argument, ask them questions that help them identify the limitations of, or qualifiers to, their point.

  • Give it time. Most beliefs change slowly over time, so it’s much better to plant seeds of doubt and give them the fertilizer of kindness than to try and force someone to agree with you now.

  • Pull, don’t push. Providing ways for people to change their minds on their own terms and via their own reflective insights is far more valuable than any attempt to persuade them will ever be.

  • Model good faith. None of this works if you don’t commit yourself to the same openness. The effectiveness of these techniques is proportional to the authenticity of your own good-faith engagement. 


You can see a shareable, presentation-format version of these key insights here, or hear David on the Clearer Thinking Podcast.


We collaborated with David, Professor Sander van der Linden at the University of Cambridge (who is one of the world’s foremost experts on conspiracy thinking and misinformation), and our friends Prof. Deb Brown and Dr. Peter Ellerton at the University of Queensland’s Critical Thinking Project to create a gamified way to put many of these insights into practice. It’s called The Conspiracy Test.



The test provides a free and interactive way to see whether a conspiracy theory you think might be true can stand up to the scrutiny of your own self-directed critical thinking.


A deep-state alien Illuminati lizard named Captain Zardulu guides you through a series of steps, and you can update your baseline skepticism at each step to increase your critical thinking score.


So far, results show an average increase in skepticism of 30 percentage points, moving most participants from a state of relative ambivalence (53% skepticism) to a mostly skeptical position (83%).


The School of Thought is seeking philanthropic support to run a formal research study to determine the effectiveness of techniques for increasing healthy skepticism, so that these techniques and insights can be implemented more broadly, for example on the major social media platforms.


We also recently launched The Critical Thinking Alliance as a way to galvanize the efforts of everyone who is taking an evidence-based approach to increase rationality and fight misinformation.


If you’d like to join the Alliance, along with Spencer Greenberg and Clearer Thinking, please visit criticalthinkingalliance.org
