Painted Dog in Campaign Brief
July 8, 2016
Painted Dog’s Jessica Henderson had her article published in June’s edition of Campaign Brief. She looks into the resilience of misinformation and how it can be overcome. Read the full article below:
The truth is out there – but I prefer the misinformation
Every day we are bombarded with information from television, the internet, social media, radio, out of home, and people we interact with. We accept a lot of information from these sources as true unless we have a reason not to. However, occasionally (and perhaps less occasionally during a federal election campaign!) information we have accepted as being true turns out to be inaccurate. This misinformation may be retracted or publicly revealed as false.
But misinformation is stubbornly resilient, persisting even after it has been retracted or proven inaccurate. This phenomenon is known as the ‘Continued Influence Effect’. Research has shown that despite clear retractions, people are reluctant to dismiss the original piece of misinformation from their minds. Think of the claimed link between childhood vaccinations and autism. A medical article in 1998 claimed that a common vaccination administered to young children was linked to the development of autism later in childhood. Despite this study being discredited and the findings retracted, people continue to believe a link exists.
Why do we cling to misinformation despite retractions?
There are many reasons why the Continued Influence Effect occurs. A simple one is repetition. Repeat something enough and it eventually becomes hard to forget.
Secondly, people may believe that the original piece of information must contain a grain of truth. Once embedded in a person’s mind it is hard to erase.
A more closely studied cause is alignment with a person’s current beliefs, known as confirmation bias. Information that fits with one’s belief system is easy to accept; a retraction of that information, however, challenges the belief system and is therefore harder to accept.
This links in with a fourth influencing factor: how widely-held the misinformed belief is. If other people believe something, especially in our reference groups (e.g., our friends, family, colleagues etc.), then we are more likely to believe it too. Social media can also act as an echo chamber for our own beliefs based on the people we choose to associate with. We often think that if other people believe a piece of information, there’s probably something to it.
Added to the mix is the influence of the source providing misleading information. We often resort to mental shortcuts, known as heuristics, in processing information. As social beings, a useful shortcut when assessing the believability of a piece of information is assessing the credibility of the source rather than scrutinising the message in its entirety. Surely if the Australian cricket team says that Weetbix is the breakfast of champions it must be true!
Finally, perhaps the most interesting theory behind why we hold on so tightly to misinformation is the mental models theory. This theory proposes that in our mind’s eye we create ‘mental models’ of events as we know them.
For instance, if you are told that Jim was convicted of stealing from his company, your mental model would consist of Factor A (company property was stolen) caused by Factor B (Jim stole it). If it turns out that Jim was in fact falsely accused and is acquitted of this crime, it is likely that you will still be at least slightly suspicious of him. This is because there is now a gap in your mental model of the event. You still have Factor A (company property was stolen), but you are missing Factor B (who stole the property). Even though you know that Jim has been acquitted, you are likely to prefer a complete model over one with a gap, and so may continue to be suspicious of Jim and his involvement in the crime.
People will often rely on misinformation even if they believe and can recall the retraction. If they are not provided with an alternate piece of information, they will turn to the inaccurate piece rather than having no explanation at all.
Can the damage be undone?
In a technology-driven world where information is rapidly distributed, it is inevitable that incorrect stories are spread. Take Barack Obama’s 2008 run for office: opponents started a malicious rumour claiming he was not born in the U.S. We can retract such information or prove it is unfounded (as President Obama did by publicly presenting his U.S. birth certificate), but how likely are we to successfully counteract the damaging misinformation?
There’s no denying misinformation is incredibly resilient. However, there are a few ways to increase the likelihood of an effective retraction. The simplest is repetition. Just as repetition helps ingrain misleading information in a person’s mind, repeating a retraction makes it more familiar and more effective.
A second technique is to replace the retracted information with new information. This refers back to our mental models of events. If you are told that Jim did not steal from his company but that it was actually Sally, you are more likely to accept that Jim is innocent because you don’t have a gap in your mental model. To make the replacement information even more persuasive, it should also explain why the misinformation was believed to be true (e.g. Sally blamed Jim first). This counteracts our inclination to wonder why the original information was given to us in the first place and to be sceptical of the replacement information.
Finally, source credibility and trustworthiness influence the effectiveness of a retraction. Retractions from people with trustworthy-looking faces are more effective than those from people with untrustworthy-looking faces. In fact, retractions from untrustworthy faces have been found to be completely ineffective.
Interestingly, face trustworthiness is highly correlated with attractiveness and perceived emotion. People who appear to be happy or attractive are deemed more trustworthy than those who appear more aggressive or less attractive.
So, if you are looking to retract a piece of misinformation effectively, you should: 1) have an attractive, happy-looking person provide the retraction; 2) where possible, replace the retracted information with plausible alternative information, along with an explanation of why the original misinformation was believed; and 3) repeat the retraction as often as possible in order to solidify it and make it more familiar.