A field guide to bullshit
- 13 June 2011 by Alison George
- Magazine issue 2816
Intellectual black holes are belief systems that draw people in and hold them captive so they become willing slaves of claptrap. Belief in homeopathy, psychic powers, alien abductions – these are examples of intellectual black holes. As you approach them, you need to be on your guard because if you get sucked in, it can be extremely difficult to think your way clear again.
There’s a belief system about water to which we all sign up: it freezes at 0 °C and boils at 100 °C. We are powerfully wedded to this, but that doesn’t make it an intellectual black hole. That’s because these beliefs are genuinely reasonable. Beliefs at the core of intellectual black holes, however, aren’t reasonable. They merely appear so to those trapped inside.
One common strategy is to appeal to mystery to get out of intellectual hot water when propounding, say, paranormal beliefs. The believer might say something like: “Ah, but this is beyond the ability of science and reason to decide. You, Mr Clever Dick Scientist, are guilty of scientism, of assuming science can answer every question.” This is often followed by that quote from Shakespeare’s Hamlet: “There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy”. When you hear that, alarm bells should go off.
There probably are questions that science cannot answer. But what some people do to protect their beliefs is to draw a veil across reality and say, “you scientists can go up to the veil and apply your empirical methods this far, but no further”. Behind the veil they will put angels, aliens, psychic powers, God, ghosts and so on. Then they insist that there are special people who can see – if only dimly – through this veil. But the fact is that many of the claims made about things behind this veil have empirically observable consequences and that makes them scientifically testable.
Psychologist Christopher French at Goldsmiths, University of London, ran an experiment to test the claim that holding “real” crystals from a New Age shop while meditating has a more powerful effect on the psyche than holding “fake” ones. French found no difference between participants given real and fake crystals. This was good evidence that the effect people report is down to the power of suggestion, not the crystals.
Some things may be beyond our understanding, and sometimes it’s reasonable to appeal to mystery. If you have excellent evidence that water boils at 100 °C, but on one occasion it appeared it didn’t, it’s reasonable to attribute that to some mysterious, unknown factor. It’s also reasonable, when we have a theory that works but we don’t know how it works, to say that this is currently a mystery. But the more we rely on mystery to get us out of intellectual trouble, or the more we use it as a carpet under which to sweep inconvenient facts, the more vulnerable we are to deceit, by others and by ourselves.
When someone is cornered in an argument, they may decide to get sceptical about reason. They might say: “Ah, but reason is just another faith position.” I call this “going nuclear” because it lays waste to every position. It brings every belief – that milk can make you fly or that George Bush was Elvis Presley in disguise – down to the same level so they all appear equally “reasonable” or “unreasonable”. Of course, you can be sure that the moment this person has left the room, they will continue to use reason to support their case if they can, and will even trust their life to reason: trusting that the brakes on their car will work or that a particular drug is going to cure them.
There is a classic philosophical puzzle about how to justify reason: to do so, it seems you have to use reason. So the justification is circular – a bit like trusting a second-hand car salesman because he says he’s trustworthy. But the person who “goes nuclear” isn’t genuinely sceptical about reason. They are just raising a philosophical problem as a smokescreen, to give them time to leave with their head held high, saying: “So my belief is as reasonable as yours.” That’s intellectually dishonest.
Any theory, no matter how ludicrous, can be squared with the evidence, given enough ingenuity. Every last anomaly can be explained away. There is a popular myth about science that if you can make your theory consistent with the evidence, then that shows it is confirmed by that evidence – as confirmed as any other theory. Lots of dodgy belief systems exploit this myth. Young Earth creationism – the view that the whole universe is less than 10,000 years old – is a good example. Given enough shoehorning and reinterpretation, you can make whatever turns up “fit” what the Bible says.
Another warning sign is the bare claim “I just know”. Sometimes such appeals are legitimate. Suppose I look out the window and say: “Hey, there’s Ted.” You say: “It can’t be Ted because he’s on holiday.” I reply: “Look, I just know it’s Ted.” Here it might be reasonable for you to take my word for it: people are often reliable at recognising faces even when they can’t articulate how they do it.
You should be suspicious when people pile up anecdotes in favour of their pet theory, or when they practise the art of pseudo-profundity – uttering seemingly profound statements which are in fact trite or nonsensical. They often mix in references to scientific theory to sound authoritative.
Often, belief in this kind of claptrap causes no great harm. But the dangers are obvious when people join extreme cults or use alternative medicines to treat serious diseases. I am particularly concerned by psychological manipulation. For charlatans, the difficulty with using reason to persuade is that it’s a double-edged sword: your opponent may show you are the one who is mistaken. That’s a risk many so-called “educators” aren’t prepared to take. If you try using reason to persuade adults that the Earth’s core is made of cheese, you will struggle. But take a group of kids, apply isolation, control, repetition, emotional manipulation – the tools of brainwashing – and there’s a good chance many will eventually accept what you say.
Stephen Law is senior lecturer in philosophy at Heythrop College, University of London, and editor of the Royal Institute of Philosophy journal, Think. His latest book is Believing Bullshit: How not to get sucked into an intellectual black hole