The danger of isolated beliefs
The internet has made it easier than ever to acquire patchy, incoherent knowledge that doesn’t connect to anything stable. How do we avoid being wrong?
Many of the beliefs we hold and the things we learn come as part of a package. We don’t learn them in isolation, nor do we deduce them from first principles: they’re part of what the philosopher Willard Van Orman Quine called a “web of belief”, a broader set of interconnected beliefs and knowledge that underpins everything we think.
This web of belief serves a useful purpose. It provides mutual reinforcement and justification for our knowledge: individual facts support and reinforce others, coalescing into broader ideas and informing our understanding of the world as a whole. It also provides social proof, because it’s never entirely of our own creation. We inherit our beliefs from our communities, our schooling, our work. In doing so, we effectively outsource some of our thinking: if we come to the same conclusions as a lot of other people who’ve thought about the same issues, that means something for our chances of being correct (and of being understood). If we inherit a belief system and an understanding of the world that’s stood the test of time, that’s much better than reinventing our own view of the world from first principles.
Sometimes, though – and particularly in the age of the internet – we learn something about a subject where our web of belief is less strongly developed. That leaves us in danger of holding “isolated beliefs” – beliefs that aren’t woven into our broader web of belief, and therefore stand more of a chance of being incoherent and incorrect. Sometimes this is harmless: we get facts wrong, we develop theories that are a little askew, we hold two ideas that are, if we examine them, contradictory. But sometimes it’s genuinely dangerous: people believe conspiracy theories, fall for cults and develop extremist views that are entirely unmoored from reality and the wider community.
We should respond to this idea of isolated beliefs in two ways. First, we can audit our own beliefs and try to figure out where there are tensions. One example of this is in Julian Baggini and Jeremy Stangroom’s book Do You Think What You Think You Think?, which opens with a “philosophical health check”, a test that examines your beliefs on big issues and reveals where there might be tensions. (You can do the test online if you’re interested.) By revealing and resolving these tensions, we can better connect the disparate strands of our web of belief.
But we can only go so far by examining our existing beliefs. We also need to develop some kind of defence mechanism, to try to prevent ourselves from acquiring new isolated beliefs.
One way to do that is by adopting a conservative approach to acquiring knowledge and drawing conclusions. Quine uses the example of your friend showing you a magic trick:
“There could be such a case when our friend the amateur magician tells us what card we have drawn. How did he do it? Perhaps by luck, one chance in fifty-two; but this conflicts with our reasonable belief, if all unstated, that he would not have volunteered a performance that depended on that kind of luck. Perhaps the cards were marked; but this conflicts with our belief that he had no access to them, they being ours. Perhaps he peeked or pushed, with the help of a sleight-of-hand; but this conflicts with our belief in our perceptiveness. Perhaps he resorted to telepathy or clairvoyance; but this would wreak havoc with our whole web of belief.”
Quine’s conclusion is that the principle of conservatism, and the strength of our web of belief, should suggest to us that the likeliest option is that we were tricked: “The counsel of conservatism is the sleight-of-hand.” Believing it requires us to reject the fewest other elements of our web of belief; it’s the smallest step forward from where we were previously.
Quine’s conservatism doesn’t advocate never changing your mind, but rather that you should prefer to change it progressively and iteratively. It’s not that small steps make you more likely to be right, per se; it’s that a rash leap makes you far more likely to end up out on a limb, having made a huge error:
“The truth may indeed be radically remote from our present system of beliefs, so that we may need a long series of conservative steps to attain what might have been attained in one rash leap. The longer the leap, however, the more serious an angular error in the direction. For a leap in the dark the likelihood of a happy landing is severely limited. Conservatism holds out the advantages of limited liability and a maximum of live options for each next move.”
Sometimes, this conservatism might extend to not trying to evaluate the arguments for ourselves at all. Scott Alexander wrote about an idea called “epistemic learned helplessness”. This is a solution to the problem of encountering ideas expressed by very persuasive but very wrong people: how do we stop ourselves from being drawn into their alluring nonsense?
As an example, Alexander describes reading the crackpot pseudohistorian (and persuasive writer) Immanuel Velikovsky:
“I read it and it seemed so obviously correct, so perfect, that I could barely bring myself to bother to search out rebuttals. And then I read the rebuttals, and they were so obviously correct, so devastating, that I couldn’t believe I had ever been so dumb as to believe Velikovsky. And then I read the rebuttals to the rebuttals, and they were so obviously correct that I felt silly for ever doubting. And so on for several more iterations, until the labyrinth of doubt seemed inescapable.”
The answer is to acknowledge your own lack of knowledge, and instead to trust the consensus of experts:
“Given a total lack of independent intellectual steering power and no desire to spend thirty years building an independent knowledge base of Near Eastern history, I choose to just accept the ideas of the prestigious people with professorships in Archaeology, rather than those of the universally reviled crackpots who write books about Venus being a comet.”
The main thing is recognising that our beliefs are probably less coherent and correct than we’d wish them to be. That will always be the case, but being aware of the existence of isolated beliefs – and trying to counter them – is a good discipline.
Further reading
Willard Van Orman Quine and J. S. Ullian. “The Web of Belief”. McGraw-Hill Education, 1978
Julian Baggini and Jeremy Stangroom. “Do You Think What You Think You Think?”. Granta Books, 2011
Scott Alexander. “Epistemic Learned Helplessness”. Slate Star Codex, 2019