‘Should you finish a course of antibiotics?’
‘Drinking a few times a week “reduces diabetes risk”.’
Recently, headlines reporting ground-breaking findings in health research have dominated the news. However, these findings are often later proven wrong. Flaws in how we process information mean these headlines should come with a warning. Ironically, when the media corrects misinformation, this can produce a backfire effect, whereby the original claim is believed more strongly, and continues to influence behaviour.
In 2003, US government officials stated there was no doubt that Saddam Hussein had weapons of mass destruction (WMDs). No weapons were ever found. All original media reports about possible sightings were later corrected, and the absence of weapons was widely reported. Despite this, the false beliefs persisted three years later (Kull et al., 2006). Unfortunately, clinging to implanted misinformation has societal consequences. One such example is a study published in 1998 suggesting a link between MMR vaccines and autism. Since then, the media has reported that there was insufficient evidence to support the original claims, the article was officially retracted because the author had a significant conflict of interest, and the author was found guilty of misconduct and lost his licence to practise medicine. However, the original myth is still endorsed. This year, a centre set up for autism research employs individuals who claim ‘vaccines are toxic’, despite the lack of evidence and the dangerous implications. The misinformation has resulted in fewer children being immunised, which has led to a marked increase in vaccine-preventable disease, preventable hospitalisations and deaths, and substantial costs for the NHS (Larson et al., 2011).
Why is correcting misinformation so difficult? First, let’s debunk a myth: memory is not a video recorder. False memories are terrifyingly easy to implant. Simply adding a childhood story of being lost in a shopping centre to a book of true tales collected from friends and family can lead to it being ‘recalled’ several interviews later (Loftus, 1997). Instead of being exact replications of episodes in our lives, memories are reconstructions based on schemas of previously stored knowledge. Schemas are composed of concepts and the relationships between them. We have these knowledge structures for everything, and they influence our daily behaviour.
To explain what I mean by these terms, let me talk you through a little restaurant scenario. Imagine you’ve booked a table for four at a restaurant you’ve never been to before, and a large 40-year-old man comes over and asks what food you would like to order. Your response is slower than normal, because this man is inconsistent with your schema’s predictions. You probably expected to be served by a young, slim waitress. You learned this stereotype from previous experiences: in your head, the concept of ‘waitress’ was strongly connected to ‘young’ and ‘girl’. If the same situation occurred again, you’d be slightly less surprised to have a male waiter, having built a connection between ‘man’ and ‘restaurant’, and perhaps be faster at responding that you hadn’t yet had time to choose your food. You will have accommodated this new experience into your restaurant schema, leaving you better equipped for future visits. Without such alterations, when the routine is always the same, differences are harder to manage. You probably find restaurant situations filled with awkwardness less frequent now that you have tackled this little feat multiple times. I have sidetracked a lot from newspaper headlines, but hopefully you can now understand how your previous experiences shape your typical behaviour, and how frequently activated schemas are difficult to shift away from. Gradually, though, experiences help us adapt to deal with new situations, and we build up connections which challenge our stereotypes and expectations.
This reconstructive nature of memory sounds like an evolutionary problem, but it saves us time and energy. Rejecting correct information makes our brains sound useless and irrational, but flip-flopping easily would be more useless. We have built up evidence to support our beliefs, which biases us when faced with new information. Being sceptical is usually efficient. But of course, this system is far from perfect: the reconstructive nature of memory caters to the stubbornness of our beliefs. In the Iraq example, Republicans were more likely to continue to believe in the presence of WMDs in Iraq despite the retractions (Kull et al., 2003), because the connection between these concepts was stronger.
Schemas make it more difficult for us to accept new information, but they are not the only mechanism at play. How the media frames a correction affects how influential it can be. Negation claims are usually futile. The innuendo effect (Wegner et al., 1984) means that qualifiers, words such as ‘not’, are processed differently from the key concepts of a claim. So even if a claim contains the word ‘not’, the connection between the concepts ‘WMD’ and ‘Iraq’, for example, is still strengthened, because that schema has been activated. Correcting misinformation requires a lot more than negating claims. What about the credibility of the original source? Unfortunately, the sleeper effect (Hovland and Weiss, 1951) means we tend to lose track of where we acquired our information. Whether it’s The Tab, the Daily Mail, or The Sunday Times, headlines leave their mark.
So how can we overcome backfire effects? Lewandowsky and colleagues identify several recommendations (Lewandowsky et al., 2012). Breaking connections between concepts is difficult, and we prefer an explanation over no explanation. Providing an alternative explanation can therefore help correct misinformation, particularly if the counter-argument is generated by the individual. But this is not always possible. Myths are usually endorsed when we know little about a topic, to fill causal gaps. In these cases, pre-exposure warnings and fostering scepticism have proven effective. It is important to raise suspicions about the rationale behind claims: for example, bear in mind that being able to reduce antibiotic prescriptions would be an ideal outcome of research. Being aware of these effects and understanding the uncertainty of scientific conclusions helps us correct misinformation (Oreskes and Conway, 2010). So, bear in mind our mind’s imperfections when you pick up the paper.
Hovland, C. I., & Weiss, W. (1951). The influence of source credibility on communication effectiveness. Public Opinion Quarterly, 15(4), 635-650.
Kull, S., Ramsay, C., & Lewis, E. (2003). Misperceptions, the media, and the Iraq war. Political Science Quarterly, 118(4), 569-598.
Kull, S., Ramsay, C., Stephens, A., Weber, S., Lewis, E., & Hadfield, J. (2006). Americans on Iraq: Three years on. The WorldPublicOpinion.org/Knowledge Networks poll, 15.
Larson, H. J., Cooper, L. Z., Eskola, J., Katz, S. L., & Ratzan, S. (2011). Addressing the vaccine confidence gap. The Lancet, 378(9790), 526-535.
Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.
Loftus, E. (1997). Creating false memories. Scientific American, 277, 70-75.
Oreskes, N., & Conway, E. M. (2010). Defeating the merchants of doubt. Nature, 465(7299), 686-687.
Wegner, D. M. (1984). Innuendo and damage to reputations. Advances in Consumer Research (ACR North American Advances).