The Skeptics’ Guide to the Universe, by Steven Novella and his co-authors, is one of the best books on critical thinking and skepticism since Carl Sagan’s The Demon-Haunted World. You would hope that, in the 21st century, no one would need to be told why treating eczema with turmeric infusions is a bad idea, but susceptibility to pseudoscience is a recurring feature of human psychology, and pseudoscience is in constant need of debunking.
The running theme throughout the book is fallibilism: the recognition that we are all wired to engage in biased and logically fallacious thinking, even self-proclaimed skeptics and critical thinkers. As the authors constantly remind us, this is a tendency we must all perpetually work to overcome; no one is immune to bias simply because they identify as a skeptic.
With that in mind, here are five concepts and tools for becoming a better critical thinker.
1. Fallibilism
Fallibilism is the epistemological stance that no belief or theory can ever be conclusively justified or rationally supported beyond all doubt. Some beliefs are more or less likely to be true, based on evidence and careful reasoning, but none can be established with complete certainty. Fallibilism ultimately recognizes the limitations of human perception, memory, and cognition.
Consider that Wikipedia’s list of cognitive biases currently catalogs 192 entries, and its list of fallacies more than 140. The number of ways you can be wrong (and be deluded about your very ability to be wrong) is staggering. The world is complex, our cognition is faulty, and the pull of overconfidence is strong. And so the first rule of critical thinking is humility.
There are several ways to develop intellectual humility. The first is to learn and read more, as a way of discovering how little you actually know relative to the volume of knowledge available; those least knowledgeable about an area tend to be the most confident in their beliefs, a tendency known as the Dunning–Kruger effect. The second is to study the lists of biases and fallacies to see specifically how your thinking can go wrong, and to recognize that you are not immune to them, especially if you are very intelligent or highly educated.
2. De-confirmation Bias
Given the complexity of most problems, and the number of ways thinking can go wrong, it is nothing short of delusional to suppose that all of your beliefs are accurate and all of your thinking is sound. It is as close to certain as anything can be that some of what you believe is wrong and some of your reasoning is faulty. But because you assume all of your beliefs are true (otherwise you wouldn’t hold them), you face an awkward situation: you know some of your beliefs must be false or misguided, but you don’t know which ones.
To begin the process of weeding out erroneous beliefs, you must embrace the de-confirmation bias, the opposite of the confirmation bias. The confirmation bias begins with wishful thinking: you believe something to be true because you want it to be true. Because you are emotionally attached to the belief, you search only for confirmatory evidence that can convince you that you are right, while ignoring all evidence to the contrary.
The solution is simple, if hard to implement in practice: employ the de-confirmation bias by actively searching for information to disprove your beliefs. Find the strongest and best arguments against your position, and, if your belief can withstand the scrutiny, the probability that your original belief is true dramatically increases. If you find the counter-arguments to be more persuasive, then you’ve uncovered one of your erroneous beliefs and displayed the virtue of intellectual courage and maturity by changing your mind.
3. The Steel Man
If you want not only to hold true beliefs but also to be more persuasive to others, you should develop the habit of “steel manning” your opposition. This is a technique reportedly favored by Karl Popper: before a debate, Popper would strengthen his opponent’s argument before refuting it, so that when he did refute it, he did so conclusively and completely.
This is, of course, the opposite of the straw man fallacy, in which you misrepresent or weaken your opponent’s argument to make it easier to refute. Straw manning is hardly ever persuasive to anyone who doesn’t already agree with you, and you run the risk of being exposed as not really understanding the counter-argument.
It also deprives you of the chance to replace faulty beliefs. Instead, try this: identify a belief that you honestly haven’t considered from the opposition’s side. Now, instead of immediately rejecting the opposing idea or looking for its weaknesses, pretend you believe the opposite and construct the most persuasive argument you can for the counter-position. If you do this, one of two things will happen: you will realize that you were originally mistaken, or you will come to understand the opposing position better (perhaps better than your opponents do), which strengthens your original convictions.
4. Occam’s Razor
Occam’s razor is a conceptual tool reminding us that, all else being equal, the explanation with the fewest assumptions is more likely to be true. Occam’s razor does not, as is often said, prefer the “simplest” explanation; rather, it prefers the explanation that contains the fewest assumptions and takes the least for granted.
For example, we can ask the question, who built the ancient Egyptian pyramids? One answer is that humans did, which only requires the assumptions that humans existed and that they often cooperated in large numbers to erect massive monuments, using existing technology and conscripted labor, as documented in historical records.
The other answer is to posit that aliens built the pyramids. This explanation assumes that aliens exist, that they visited Earth, that they were motivated to build pyramids, and that they covered their tracks by making it appear as if the pyramids were built by humans.
Since the alien hypothesis requires more unwarranted or weakly supported assumptions, taking into consideration all of the facts, Occam’s razor demands that we accept the first explanation of human design and construction. Likewise, Occam’s razor can be used to slice through all far-fetched conspiracy theories by pointing out all of the unwarranted assumptions required to make the theories plausible.
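The assumption-counting logic above can be read probabilistically: if each independent assumption has some probability of being true, an explanation’s overall plausibility shrinks with every assumption it stacks up. The toy sketch below (not from the book; all numbers are invented for illustration) makes that arithmetic concrete.

```python
from math import prod

# Toy model: treat each explanation as a set of independent assumptions,
# each with an invented probability of being true.
human_built = {
    "humans existed at the time": 0.999,
    "large labor forces could be organized": 0.95,
    "existing technology sufficed": 0.90,
}

alien_built = {
    "aliens exist": 0.05,
    "they visited Earth": 0.05,
    "they wanted to build pyramids": 0.05,
    "they disguised their work as human": 0.05,
}

def plausibility(assumptions: dict[str, float]) -> float:
    """Joint probability of an explanation, assuming independence."""
    return prod(assumptions.values())

print(f"human explanation: {plausibility(human_built):.4f}")
print(f"alien explanation: {plausibility(alien_built):.8f}")
```

Even with these made-up numbers, the point survives any reasonable choice of values: because every extra assumption multiplies in a factor below one, the hypothesis that takes more for granted ends up far less plausible, which is exactly what the razor tells us informally.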
5. Hitchens’s Razor
Last, we have Hitchens’s razor, named after Christopher Hitchens, who stated that what can be asserted without evidence can be dismissed without evidence. In essence, Hitchens’s razor reminds us that the burden of proof lies with the person making the claim. For example, if I insist that unicorns actually exist, it is on me to provide the evidence, not on you to prove me wrong. In fact, it would be impossible to prove me wrong, because one cannot prove the nonexistence of something that doesn’t exist. It is not anyone’s responsibility to disprove every preposterous theory that can be imagined; it is the responsibility of the person making the claim to provide sufficient evidence for it.
And remember, to believe something with weak evidence, or with no evidence, or in the face of overwhelming evidence to the contrary, is called a delusion. To others, it goes by the name of faith.
Further Reading in Critical Thinking
The Art of Thinking Clearly by Rolf Dobelli
Thinking, Fast and Slow by Daniel Kahneman
Being Logical: A Guide to Good Thinking by D.Q. McInerny