What causes people to believe or accept irrational misinformation and conspiracy theories?
Several biases can affect how we process information.
Bias is a prejudice, an unjustified favouritism, or the “tendency to believe that some people, ideas, etc., are better than others.”
The classic psychology study “They Saw a Game: A Case Study” illustrates the effect of bias on our thinking. Psychologists showed the same film of a particularly rough 1951 college football game between Dartmouth and Princeton to samples of undergraduates at each school. Princeton won a game so marred by penalties that it sparked an uproar in both campus newspapers. Yet students at the two Ivy League schools perceived very different games. “We behave,” concluded the psychologists, “according to what we bring to the occasion.” Subsequent research has replicated these findings.
Stubborn beliefs are difficult to dislodge. Our biases can cloud how we process new information. Social media can reinforce those biases, tailoring our online experience to feed us more of the same information. The algorithms can push us in the direction we are already going.
Proportionality bias is our inclination to believe that significant events must have proportionally big causes. It helps explain some people’s readiness to accept conspiracy theories about major events such as Princess Diana’s death or the 9/11 attack on the World Trade Center in New York.
Algorithmic bias happens when human bias gets baked into data and search engine algorithms. These algorithms, in turn, systematically “disadvantage certain groups of people,” potentially harming them in the health-care, criminal justice and banking systems.
Media bias, according to Sage’s Encyclopedia of Political Communication, happens when “the media exhibits an unjustifiable favouritism as they cover the news,” presenting “viewers with an inaccurate, unbalanced, and unfair view of the world around them.”
Confirmation bias is the selective recall or interpretation of information that confirms prior beliefs and values: our “subconscious tendency to seek and interpret information and other evidence in ways that affirm our existing beliefs, ideas, expectations, and/or hypotheses.” Confirmation bias becomes especially pronounced when we process information that is highly self-relevant or that evokes an emotional response.
Defining Confirmation Bias - Video
While confirmation bias is the natural tendency to pay attention to information that lines up with our preexisting beliefs and to ignore contradictory information, motivated reasoning is the impulse to readily accept new information that aligns with our worldview while scrutinizing or doubting information that clashes with our fundamental beliefs and values. In this case, the “reasoning processes (information selection and evaluation, memory encoding, attitude formation, judgment, and decision-making) are influenced by motivations or goals.” Motivations are desired end-states that individuals want to achieve.
Hearing is often believing. Humans accumulate knowledge in haphazard ways. Daniel Gilbert’s book Stumbling on Happiness summarizes a great deal of research on how our brains come to believe, stressing that “people are credulous creatures who find it very easy to believe and very difficult to doubt.” Gilbert’s experimental research with colleagues found that our default is to believe what we hear and read. This default is, in fact, an evolutionary response: our ancestors believed what they saw because it helped them stay alive.
Quickly consider the following question and, when you are ready, click to reveal the answer:
A baseball bat and a ball together cost $1.10. The bat costs $1.00 more than the ball.
How much does the ball cost?
Answer:
Ball = $0.10, Bat = $1.10 → Total = $1.20 ❌
Ball = $0.05, Bat = $1.05 → Total = $1.10 ✅
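A quick bit of algebra shows why the intuitive answer fails. Let the ball cost x; the bat then costs x + $1.00, so:

x + (x + $1.00) = $1.10
2x = $0.10
x = $0.05

The ball costs five cents, the bat costs $1.05, and together they add up to $1.10. The intuitive answer of ten cents pushes the total to $1.20.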
Many people get the bat-and-ball question wrong, illustrating how much of our thinking is fast thinking. Our quick mental processes can be biased. Nobel Prize-winning scholar Daniel Kahneman’s book Thinking, Fast and Slow summarizes decades of research about the faults and biases baked into our decision-making. We are not always as rational as we like to think, and we can’t always trust our intuition. Kahneman describes two types of thinking:
System one is near-instantaneous. It happens with little effort and is fluent, fast and spontaneous. Our experience and instinct drive it.
System two is slower, requires effort, and is logical and conscious.
Many of us consume social media quickly, skimming and paying little attention. In this autopilot mode, we don’t critically engage, sometimes passing on or sharing misinformation unwittingly. Some research suggests lazy thinking, not ideology or partisan motivations, is responsible for people’s failure to sort fact from fiction in news content. People fall for disinformation when they rely on intuition and emotion rather than reason and logic to determine what’s fact and what’s fake. The tsunami of misinformation on social media compounds the problem. Research also suggests people overestimate their ability to distinguish fake from real: those most confident in their ability to discriminate between fact and fiction are also more likely to get duped by misinformation.
Behavioural economist Dan Ariely offers the Funnel of Misbelief to visualize what causes people to accept irrational beliefs and conspiracy theories. People get sucked down the funnel when they are stressed. Ariely’s book Misbelief: What Makes Rational People Believe Irrational Things highlights how unpredictable stress can draw people into the funnel of misbelief, impairing cognitive functions and decision-making skills. Stress can lead people to feel hard done by. In turn, people seek answers as a means to regain some control. Control and relief can come when people identify a villain they can blame for their situation. This relief, warns Ariely, is only temporary, so people keep looking for short-term relief in the same place.
The emotional elements of the funnel of misbelief
We all feel stress.
Unpredictable stress can leave people feeling out of control.
Stress impairs cognitive functions and decision-making.
Economic inequality can exacerbate the problem, whereas community support can dampen the effect.
Compounding stress leads people to feel a sense of being hard done by.
People seek answers to regain control.
Identifying a villain to blame can provide some relief.
The relief is fleeting, and people keep looking for someone or something to blame for their problems or poor economic circumstances (conspiracy theories about a global elite running the world, for example).
The cognitive elements of the funnel of misbelief
Stress can push people down the funnel of misbelief. They look for easy answers or a villain to blame for problems.
Impaired cognition leads us further down the funnel of misbelief.
Confirmation bias clouds people’s thinking, leading them to dig in their heels for what they perceive to be the truth.
When people believe something (a misbelief, perhaps), they work hard to convince themselves they are right (motivated reasoning).
Conspiracy theories prey on these cognitive biases.
People deep down the funnel of misbelief are often overconfident in their understanding of how things work.
This video explores conspiracy theories in politics.
Engage with the following interactive activity:
Listen to behavioural scientist Dan Ariely explain to CBC Radio One host Matt Galloway “what makes rational people believe irrational things.”
Biases make people vulnerable to misinformation spread by social media
Most Americans think social media sites censor political viewpoints
MIT Sloan study finds strong evidence of political bias in formation of social media ties