Facts can make misconceptions stronger

Among the wit and wisdom of Homer Simpson is the dismissive comment that “You can prove anything with facts.” But now it appears that for once, Homer was wrong.

A paper from academics at the University of Michigan and Georgia State University argues that facts not only can fail to correct misconceptions, but in some cases can actually make people believe in them more strongly.

The research was conducted in the context of political reporting, where the line between truth and misconception can be fuzzy. The authors defined a misconception as a case where “beliefs about factual matters are not supported by clear evidence and expert opinion”, a definition that covers both beliefs that are demonstrably false and beliefs that cannot be disproven but are unsubstantiated.

The new studies built on previous research which had found that the less accurate people’s beliefs are, the more strongly they hold them. They also showed that people who hear a factual statement and then a claim which falsely discredits it are not only likely to believe the statement is false, but to be more confident in that belief than somebody who simply hears and believes the statement without any discrediting claim.

The new research consisted of five studies using mocked-up news reports that combined extracts from a variety of real reports. The reports contained both facts and claims about various subjects. In most cases, a report would present the claims followed by the factual statement, with the claim usually taking the form of a suggestion or implication rather than an outright lie. This was designed to mimic the way real reports present opposing arguments, even when an article gives additional weight to one side.

The subjects included misconceptions more likely to be held by conservatives (that Iraq had weapons of mass destruction before the 2003 US-led invasion) and misconceptions more likely to be held by liberals (that President George W Bush banned stem cell research).

(It’s important to note that for the Iraq question, the misconception was treating the weapons’ existence as established fact; while they may indeed have existed, that has never been proven.)

The overall pattern of the findings was that while the reports reinforced the beliefs of those who already held the factual view, those who already believed the claims didn’t just dismiss the inclusion of the fact but wound up believing in the claim even more strongly.

The results also showed that liberals and conservatives were equally affected by this pattern, and that it made no difference which source the researchers attributed the mocked-up article to. Even when participants read an article they were told came from a source sharing their political persuasion, they were still likely to strengthen their misperception after being presented with a claim that supported it and a fact that contradicted it.

The conclusions have ramifications for both political reporting and psychology. They show that the effects of biased reports aren’t necessarily softened by the practice of at least including a token contradictory viewpoint, even when that viewpoint is clearly factual. And they raise questions about how people come to reasoned judgments when confronted with contradictory information.

[Picture credit: MIT OpenCourseWare]