Minds have infrastructure for spotting and removing bad ideas. Critical thinking skills, for example. Curiosity, honesty, and fair-mindedness are also key infrastructure. Together, traits like these work to keep the mind relatively free of mind-parasites. (We don’t often talk this way, but bad ideas are mind-parasites.) We can do much more to prevent outbreaks of unreason. The first step, though, is to see mental immune systems for what they are: a critical factor in human wellbeing.
Yes! Every culture and subculture has infrastructure for slowing the spread of bad ideas. Fact-checkers, investigative journalism, the norms of critical discourse—these are all elements of cultural immune systems. Some subcultures have high-performing immune systems (think science). Other subcultures have deeply compromised immune systems (think QAnon). The question we must ask is this: How can we make our new Internet-connected global culture highly resistant to cognitive infections?
“Bad idea” doesn’t have to mean “idea I dislike.” Instead, it can mean “idea with troublesome properties.” Falsehood is a troublesome property. When an idea incites violence or creates unnecessary division, it’s bad in another way. An idea can also be harmful, baseless, confusing, dispiriting, etc., and each bad-making quality counts against the idea. The trick is to overcome the cultural conditioning that says, “Who am I to judge?” We’re all qualified to make such judgments. In fact, it’s negligent not to.
Just look past what you want to see, and attend to the idea’s actual properties. Listen with an open mind to other people’s observations, and give them credit for calling attention to relevant details. Filtering out bad ideas is a team sport—one we must play in a collaborative spirit.
A belief is a tendency to rely on an idea. When we rely on the wrong ideas, we end up harming ourselves or others. Believe that Covid is a giant conspiracy, for example, and you’re more likely to go maskless and spread a deadly disease. Responsible thinkers have always tried to avoid irresponsible beliefs, and philosophers have long sought the true test of responsible belief. The short answer is this: if a belief can’t withstand questioning, it’s not a responsible belief.
Reasoning done right illuminates both the good-making and the bad-making properties of ideas. When we reason together and help each other see the problematic features of the ideas we rely on, we become less susceptible to those ideas. This is the best-known way to strengthen mental immune systems.
A biological vaccine teaches the body’s immune system how to recognize a parasite. A mind vaccine teaches the mind how to recognize a mind-parasite. For example: when you learn that two things can correlate without one causing the other, you can free your mind from jumping to certain mistaken conclusions. When Socrates showed that we should let go of ideas that can’t withstand questioning, he was prototyping a general-purpose mind inoculant. In Mental Immunity, I take this inoculant and develop it into a mind vaccine.
There are many ways to bolster mental immunity! We can uphold standards of accountable talk. We can ask for evidence. We can gently and supportively encourage people to examine cherished convictions. We can help people understand that an idea’s “feeling right” is no guarantee that it’s true. Or good for you. Or good for others. Cognitive immunology points to treatments we’ve long overlooked: we can learn to spot mind-parasites; we can shed immune-disruptive ideas; we can understand the different types of mental immune disorder, and learn how to avoid them. We can inoculate minds against misinformation and propaganda. We can vaccinate against conspiracy thinking. The sky’s the limit!