There’s an old adage that if you repeat a lie often enough, it becomes the truth. Psychologists call this the illusory truth effect. BBC Future digs into how it works, why it works, and when it doesn’t.
As humans, we tend to believe things that sound plausible. When someone makes a ridiculous statement with no scientific backing, but the statement feels like it could be true, we tend to believe it. The more that statement is repeated, the more believable it sounds — unless, of course, we bother to actually research it.
If you have prior knowledge of a subject — say, you've been studying greenhouse gas emissions for 30 years — someone merely repeating that climate change isn't real won't have an effect on you. When you're armed with knowledge, you can fight against the illusory truth effect: repetition might make a statement feel true, but it can't override knowledge we have to the contrary. That doesn't make you immune, though, because you're probably not an expert in everything. Here's BBC Future:
If every time you heard something you assessed it against everything you already knew, you'd still be thinking about breakfast at supper-time. Because we need to make quick judgements, we adopt shortcuts -- heuristics which are right more often than wrong. Relying on how often you've heard something to judge how truthful something feels is just one strategy. Any universe where truth gets repeated more often than lies, even if only 51% vs 49%, will be one where this is a quick and dirty rule for judging facts.
Which is all to say that while you can't convince an expert that their knowledge is wrong, everyone else is still ripe for manipulation. Here's BBC Future on how to fight back:
If repetition was the only thing that influenced what we believed we'd be in trouble, but it isn't. We can all bring to bear more extensive powers of reasoning, but we need to recognise they are a limited resource. Our minds are prey to the illusion of truth effect because our instinct is to use short-cuts in judging how plausible something is. Often this works. Sometimes it is misleading.
Once we know about the effect we can guard against it. Part of this is double-checking why we believe what we do -- if something sounds plausible is it because it really is true, or have we just been told that repeatedly? This is why scholars are so mad about providing references - so we can track the origin on any claim, rather than having to take it on faith.
In short, a person's BS detector will eventually run out of batteries. Keep at it, and eventually you can convince people of all kinds of things. This doesn't even need to be nefarious. It can be something stupid: if you repeat over and over again, to anyone who will listen, that The Core is a good movie, eventually people will believe you, because they've probably never seen it and have no knowledge to the contrary.
How liars create the 'illusion of truth' [BBC Future]
This post is part of our Evil Week series at Lifehacker, where we look at the dark side of getting things done. Sometimes evil is justified, and other times, knowing evil means knowing how to beat it. Want more? Check out our evil week tag page.