The Science Behind Believing Misinformation
Understanding Why We Fall for Misinformation—And How to Stop It
In today's information-saturated world, distinguishing truth from falsehood has become increasingly challenging. Politicians across the political spectrum make claims that contradict established science, from vaccine misinformation to climate change denial to skepticism of evolution. But the problem isn't just dishonest politicians; it's also how our brains are wired to process information.
Your brain has built-in mental shortcuts, known as cognitive biases, that make you vulnerable to believing claims that contradict established science. These shortcuts evolved to help our ancestors make quick decisions for survival, but in the modern information landscape, they can lead us astray. The good news? Once you understand these biases, you can learn to override them with intentional, critical thinking.
The Problem: How Politicians Exploit Cognitive Vulnerabilities
Political figures from all parties have made false claims that contradict well-established scientific evidence:
"Vaccines cause autism" – This claim has been thoroughly debunked by extensive research, yet it continues to circulate
"Evolution isn't real" – Evolutionary theory is supported by overwhelming evidence from multiple scientific disciplines
"Climate change is a hoax" – The scientific consensus on anthropogenic climate change is robust and based on decades of research
These aren't just political talking points—they represent a fundamental challenge to evidence-based decision-making. But why do these false claims gain traction despite contradicting established science?
The Science: Four Cognitive Biases That Make You Vulnerable
1. Confirmation Bias
What it is: Your brain actively seeks information that confirms what you already believe, while dismissing or ignoring contradictory evidence.
How it works: Confirmation bias functions like a mental filter. When you encounter new information, your brain unconsciously evaluates whether it aligns with your existing beliefs. Information that confirms your worldview gets processed more readily and remembered more easily, while contradictory evidence faces higher scrutiny and is often discounted or forgotten. (A toy sketch at the end of this section makes the filter concrete.)
Real-world example: If you believe vaccines are dangerous, you'll remember every side-effect story and ignore the millions of data points demonstrating vaccine safety and efficacy. You might recall a social media post about someone's negative vaccine reaction while overlooking peer-reviewed studies involving hundreds of thousands of participants.
The research: Studies published in Frontiers in Public Health (2024) have demonstrated how confirmation bias significantly increases susceptibility to misinformation, particularly in health-related contexts. Researchers found that individuals with strong prior beliefs about health interventions were more likely to accept false information that aligned with those beliefs, even when presented with contradictory scientific evidence.
Source: Frontiers in Public Health (2024) - "Confirmation bias & misinformation susceptibility"
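To make the filter concrete, here is a deliberately crude toy model in Python. It is not a published cognitive model: the 0.05 step size and the 0.3 discount applied to contrary evidence are invented numbers, chosen only to show how asymmetric weighting can freeze a belief in place.

```python
# A crude toy of the confirmation-bias "mental filter" described above.
# Not a published cognitive model: the 0.05 step size and the 0.3 discount
# on contrary evidence are invented numbers used purely for illustration.
def update_belief(belief: float, confirming: int, contrary: int,
                  discount: float = 0.3) -> float:
    """Nudge a belief (0 = false, 1 = true) by weighted evidence counts;
    evidence against the belief is discounted before it counts."""
    shift = 0.05 * (confirming - discount * contrary)
    return min(1.0, max(0.0, belief + shift))

belief = 0.8  # strong prior, e.g. "vaccines are dangerous"
belief = update_belief(belief, confirming=3, contrary=10)
print(f"After 3 confirming and 10 contrary items: belief = {belief:.2f}")
# Prints 0.80: ten contrary items were exactly neutralized by three
# confirming ones, so the belief did not move at all.
```

With symmetric weighting (discount = 1.0), the same evidence would drop the belief from 0.80 to 0.45; the asymmetry alone is what keeps it frozen.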
2. Emotional Reasoning
What it is: Your brain prioritizes how information feels over whether it's factually true. Emotional responses can override logical analysis.
How it works: Emotional reasoning occurs when feelings become evidence. If something triggers fear, anger, or moral outrage, your brain interprets that emotional response as a signal that the information must be important and true. This bias is particularly powerful because emotions are processed faster than rational thought, meaning your emotional reaction happens before your logical brain has time to evaluate the claim's validity.
Real-world example: The claim "They're putting chemicals in our water!" feels scary and urgent, creating an immediate emotional response. This emotional activation makes the claim stick in your memory—even without any supporting evidence. The fear response can be so powerful that it persists even after you've been presented with data showing water safety standards and testing protocols.
The research: Studies published in Cognitive Research: Principles and Implications (2020, 2021) identified emotional reasoning as a key amplifier of the "illusory truth effect" described in the next section: emotionally charged false information was more likely to be remembered, and subsequently judged as true, than neutral false information.
Source: Cognitive Research: Principles and Implications (2020, 2021) - "Emotional reasoning & illusory truth effect"
3. Illusory Truth Effect (Repetition Effect)
What it is: The more frequently you hear something, the more true it feels—even if it's completely false. Repetition creates a false sense of familiarity that your brain mistakes for accuracy.
How it works: Your brain uses familiarity as a mental shortcut for truth. When you encounter information multiple times, it becomes easier to process (a phenomenon called "processing fluency"). Your brain interprets this ease of processing as a signal that the information is likely to be true. This happens automatically and unconsciously, meaning you can fall victim to this bias even when you're actively trying to be critical.
Real-world example: See the same false claim on social media five times? Your brain unconsciously thinks: "I've heard this before, so it must be true." This is why misinformation campaigns often rely on repetition rather than evidence. A false claim about election fraud, for instance, becomes more believable each time it's shared, regardless of fact-checking efforts.
The research: A comprehensive review in Nature Reviews Psychology (2022) examined the psychological mechanisms underlying misinformation susceptibility. The research demonstrated that repeated exposure to false claims increased their perceived truthfulness, even among participants who were initially skeptical and even when the repetitions were explicitly labeled as misinformation.
Source: Nature Reviews Psychology (2022) - "Psychological drivers of misinformation"
4. Source Confusion
What it is: Your brain remembers the claim but forgets where it came from, treating information from unreliable sources the same as information from credible experts.
How it works: Source confusion (also called "source amnesia") occurs because your brain stores content separately from context. Over time, you remember what was said but not who said it or where you heard it. This means that information from a random wellness blogger can eventually carry the same mental weight as information from a peer-reviewed scientific study. (A toy decay model at the end of this section illustrates the gap.)
Real-world example: You remember the claim "GMOs cause cancer" but can't recall whether it came from a rigorous peer-reviewed study or a wellness blogger with no scientific credentials. Your brain treats both sources equally when the source information has faded from memory. Days or weeks later, you might find yourself sharing this claim as if it were established fact, completely unaware that it originated from an unreliable source.
The research: The Decision Lab has documented source confusion as a critical vulnerability in information processing: people frequently misattribute information to more credible sources than its actual origin, particularly as time passes after initial exposure. This misattribution significantly increases the likelihood of accepting and spreading misinformation.
Source: The Decision Lab - "Source Confusion"
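As a rough illustration of that content-outlives-context idea, the toy below gives the claim and its source tag separate forgetting curves. Both decay rates are invented for illustration and are not fitted to any memory study.

```python
# Toy illustration of source amnesia: the claim and its source are stored
# separately, and the source tag fades faster. Both decay rates are
# invented for illustration, not fitted to real memory data.
import math

def memory_strength(decay_rate: float, days: float) -> float:
    """Exponential forgetting curve, normalized to 1.0 at day 0."""
    return math.exp(-decay_rate * days)

for days in (0, 7, 30):
    claim = memory_strength(0.02, days)   # what was said: slow decay
    source = memory_strength(0.15, days)  # who said it: fast decay
    print(f"day {days:2d}: claim {claim:.2f}, source {source:.2f}")
# By day 30 the claim is still at 0.55 while the source is near zero
# (0.01): the state where "a blogger said it" feels like "a study showed it".
```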
The Solution: Four Checks to Fight Back
Understanding these biases is the first step. The second step is developing practical strategies to counteract them. Here are four evidence-based checks you can use before accepting or sharing information:
1. Confirmation Bias Check
The question: Does this confirm what I already believe?
The action: Actively seek out opposing facts and perspectives. If information aligns perfectly with your existing worldview, that's a red flag—it means confirmation bias might be at work.
How to do it:
Search for "[claim] debunked" or "[claim] fact check" (a short script after this list automates these searches)
Read sources that disagree with your initial position
Ask yourself: "What evidence would change my mind?"
Consult multiple sources across the political spectrum
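As a minimal sketch of the first habit on that list, the script below opens the two disconfirming searches in your default browser. The helper name, query patterns, and choice of search engine are illustrative assumptions, not a standard tool.

```python
# Minimal sketch of the "search for the opposite" habit from the list above.
# The helper name, query patterns, and search engine are illustrative choices.
import webbrowser
from urllib.parse import quote_plus

def search_counterevidence(claim: str) -> None:
    """Open browser tabs that deliberately look for disconfirming coverage."""
    for query in (f'"{claim}" debunked', f'"{claim}" fact check'):
        # quote_plus encodes spaces and quotes so the query survives the URL
        webbrowser.open(f"https://duckduckgo.com/?q={quote_plus(query)}")

search_counterevidence("GMOs cause cancer")
```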
2. Emotional Check
The question: Am I reacting emotionally to this?
The action: Pause for 10 minutes before sharing or acting on emotionally charged information. This brief delay lets your rational brain catch up with your emotional response; the toy timeline after the list below shows the logic of the pause.
How to do it:
Notice physical signs of emotional activation (rapid heartbeat, tension, urgency)
Step away from the screen for a few minutes
Ask: "Why am I having such a strong reaction?"
Return to evaluate the claim more objectively after the initial emotional response has subsided
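The logic of the 10-minute pause can be shown with a toy timeline. The two latencies below are invented placeholders; the only grounded point is their ordering, i.e. that the emotional appraisal arrives long before the deliberate one.

```python
# Toy timeline of "fast emotion, slow deliberation". Both latencies are
# invented placeholders; only their ordering reflects the claim in the text.
EMOTION_LATENCY = 0.1      # seconds until the fear/outrage signal lands
REASONING_LATENCY = 30.0   # seconds until a deliberate evaluation is ready

def verdict_available(seconds_before_acting: float) -> str:
    """Which judgment exists by the time you hit 'share'?"""
    if seconds_before_acting < EMOTION_LATENCY:
        return "none yet"
    if seconds_before_acting < REASONING_LATENCY:
        return "emotional verdict only ('feels urgent, must be true')"
    return "deliberate verdict too ('wait, where is the evidence?')"

print(verdict_available(2))    # instant reshare: only emotion has voted
print(verdict_available(600))  # after the 10-minute pause: reason caught up
```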
3. Repetition Check
The question: Have I seen this repeated a lot?
The reminder: Repetition ≠ truth. Frequency of exposure doesn't validate a claim; the toy simulation after this list shows how repetition alone can make a claim feel true.
How to do it:
Recognize that viral spread is not the same as verification
Ask: "Am I believing this because I've seen it multiple times, or because I've seen evidence?"
Remember that misinformation campaigns deliberately use repetition
Seek original sources rather than relying on repeated claims
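To see why that question matters, here is a deliberately simple toy model of processing fluency. The saturation formula is an illustration rather than a published model; its one honest feature is that evidence appears nowhere in it.

```python
# Deliberately simple toy of the illusory truth effect: "felt truth" is
# pure familiarity that saturates with repeated exposure. The formula is an
# illustration, not a published model; note that evidence is not an input.
import math

def felt_truth(exposures: int) -> float:
    """Fluency-based truth feeling on a 0-1 scale; grows with repetition."""
    return 1 - math.exp(-0.5 * exposures)

print(f"false claim, seen 5 times: {felt_truth(5):.2f}")  # ~0.92
print(f"true claim, seen once:     {felt_truth(1):.2f}")  # ~0.39
# Repetition alone makes the false claim feel more than twice as true.
```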
4. Source Check
The question: Where did this actually come from?
The action: Verify the original source before trusting the claim. Trace information back to its origin.
How to do it:
Click through to the original source, not just the social media post
Evaluate source credibility: Is this a peer-reviewed study, a government agency, a credible news organization, or an anonymous blog?
Check the author's credentials and potential conflicts of interest
Look for whether reputable sources are reporting the same information
Use fact-checking websites like Snopes, FactCheck.org, or PolitiFact (the sketch after this list shows one way to query their published ratings programmatically)
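For readers who want to automate this step, here is a sketch against Google's Fact Check Tools API (the claims:search endpoint), which aggregates ClaimReview ratings from publishers such as Snopes, FactCheck.org, and PolitiFact. It assumes you have an API key with that API enabled; YOUR_API_KEY is a placeholder, and the field names follow the v1alpha1 response format.

```python
# Sketch of a programmatic source check via the Google Fact Check Tools API
# (v1alpha1 claims:search). Assumes `pip install requests` and an API key
# with the Fact Check Tools API enabled; YOUR_API_KEY is a placeholder.
import requests

API_URL = "https://factchecktools.googleapis.com/v1alpha1/claims:search"

def lookup_claim(claim: str, api_key: str) -> None:
    """Print every published fact-check rating found for a claim."""
    resp = requests.get(API_URL, params={"query": claim, "key": api_key},
                        timeout=10)
    resp.raise_for_status()
    for item in resp.json().get("claims", []):
        for review in item.get("claimReview", []):
            publisher = review.get("publisher", {}).get("name", "unknown")
            rating = review.get("textualRating", "no rating")
            print(f"{publisher}: {rating} -> {review.get('url')}")

lookup_claim("GMOs cause cancer", "YOUR_API_KEY")
```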
Your Brain, Your Vote, Your Power
These cognitive shortcuts aren't your fault—they're part of being human. However, you have the power to override them through intentional, critical thinking.
Before You Vote: Do Your Research
Your vote shapes science policy, healthcare, education, environmental protection, and much more. Making informed decisions requires:
✓ Verify sources before trusting claims – Don't take information at face value, even from sources you typically trust
✓ Question possible misinformation – Develop a healthy skepticism, especially for claims that seem too alarming or too good to be true
✓ Use the 4 bias checks from this post – Make them a habit every time you encounter new information
✓ Seek evidence, not just emotion – Base your decisions on data and expert consensus rather than fear or outrage
The Stakes: Why This Matters
When voters fall prey to misinformation, it has real consequences:
Science policy: Funding decisions for research are influenced by public opinion
Healthcare: Public health initiatives succeed or fail based on community acceptance
Education: Curriculum decisions reflect voter priorities and understanding
Environmental protection: Climate policy depends on public recognition of scientific evidence
Technology regulation: Emerging technologies are governed based on public comprehension of risks and benefits
Your ability to distinguish fact from fiction doesn't just affect your personal decisions—it shapes the policies that impact entire communities and future generations.
Take Action
Your brain's shortcuts aren't your fault—but you CAN override them with intentional thinking. By understanding these four cognitive biases and applying the four checks consistently, you can become a more informed voter and citizen.
Make your vote count. Make it evidence-based.
References and Further Reading
Frontiers in Public Health (2024). "Confirmation bias & misinformation susceptibility" - Research on how pre-existing beliefs increase vulnerability to health misinformation.
Cognitive Research: Principles and Implications (2020, 2021). "Emotional reasoning & illusory truth effect" - Studies on how emotional responses override logical evaluation of claims.
Nature Reviews Psychology (2022). "Psychological drivers of misinformation" - Comprehensive review of cognitive mechanisms that make people susceptible to false information.
The Decision Lab. "Source Confusion" - Research on how people misattribute information to more credible sources than the actual origin.
This content is provided for educational purposes and represents current scientific understanding of cognitive biases and misinformation susceptibility. STEM to the Polls is a non-partisan organization dedicated to providing evidence-based information to young voters.
