In an age saturated with information, misinformation has become one of the most destabilizing forces shaping politics, public health, and democratic trust. Governments typically respond with reactive tools such as fact-checking units, content moderation policies, or regulatory reforms aimed at platform transparency. However, these strategies overlook a deeper issue: misinformation does not arise simply from technology or platforms, but from the ways people think, feel, and make sense of the world.
As Jon Roozenbeek and Sander van der Linden argue in their book The Psychology of Misinformation (2024), false narratives spread not simply because people are uninformed, but because such narratives speak to cognitive tendencies, emotional needs, and shared identities. In other words, misinformation exploits the same psychological architecture that allows communities to make meaning together. Understanding how and why misinformation takes hold is therefore essential not only for countering foreign interference or online extremism, but for strengthening the cultural and cognitive foundations of democratic life. This article argues that misinformation cannot be fully understood, or effectively countered, by treating it simply as a failure of information systems.
Instead, it approaches misinformation as a social and psychological phenomenon shaped by three interlocking dimensions: cognition, emotion, and identity. By examining how false narratives become believable and socially binding, the article shows why fact-checking alone is rarely sufficient and why effective responses must speak to the deeper human needs that misinformation fulfills.
Cognition: Why Some Narratives “Feel True”
Humans are not designed to evaluate every claim with forensic scrutiny. We rely on heuristics, mental shortcuts that help us navigate overwhelming information environments. One of the most powerful is familiarity, where repeated information feels truer, regardless of whether it is accurate. This “illusory truth effect” is one reason debunked stories continue to circulate long after fact-checks have been issued.
However, the issue is deeper than cognitive laziness. A growing body of research on the psychology of misinformation shows that people who accept misinformation often reason very differently from those who reject it. Rather than simply being “less critical,” believers in implausible or unverified claims tend to ground their judgments in personal experience, anecdotal stories, and intuitive impressions. Non-believers, those who judge misinformation to be false, generally rely on more conventional standards, such as expert consensus, empirical evidence, and transparent methodological reasoning. What may seem like irrationality to outsiders is, for believers, a coherent and internally consistent way of determining what counts as reliable knowledge and of making sense of the world. This helps explain why factual corrections often fail: if people are not using the same criteria to judge truth in the first place, more information alone cannot close the cognitive gap.
Emotion and Meaning: The Hidden Functions of False Beliefs
Cognition explains how misinformation takes root, but emotion explains why it stays there. False narratives often gain traction in periods of uncertainty because they offer a sense of structure when the world feels chaotic. Consider the wave of misinformation during the COVID-19 pandemic, when people confronting an invisible threat gravitated toward explanations such as the idea that the virus was intentionally engineered or linked to 5G towers. Both narratives supplied clear villains and motives to blame at a moment when little was certain. A similar pattern emerged after the 2020 U.S. election, when claims of a stolen election gave anxious communities a simple storyline for making sense of a complex political moment. These narratives “work” because they present good and evil, cause and effect, agency and intention.
Misinformation, in this sense, is not merely a false statement but a form of storytelling that resonates because it feels emotionally and psychologically true. During periods of uncertainty, these stories become especially powerful because they offer structure and direction, organizing experience into clear patterns of cause, blame, and meaning.
Crucially, misinformation also meets deep social needs. Jetten et al. (2023) find that individuals with stronger conspiracy mentalities feel more socially isolated and rely more heavily on online communities for validation. These groups provide solidarity, belonging, and a shared sense of grievance that traditional institutions often fail to address. This emotional logic helps explain why de-platforming conspiracy groups sometimes backfires: people who already feel excluded simply migrate to more fringe platforms, where their identity as “persecuted truth-seekers” becomes even more salient.
Identity: Misinformation as Group Belonging
A growing body of research suggests that susceptibility to misinformation is shaped as much by social identity as by individual cognition. Narratives that reinforce a group’s status, dignity, or worldview are more readily accepted, regardless of factual accuracy. In this sense, misinformation is not simply “believed”; it is performed as an expression of loyalty. For example, one study of Germans of Russian descent shows that individuals who identified more strongly with their Russian heritage were significantly more susceptible to Kremlin-backed narratives about the war in Ukraine. Exposure to Russian-language media magnified the effect, creating an identity-affirming loop between information and belonging.
This is not unique to any single community. Identity-protective cognition shapes beliefs across the political spectrum: people are more inclined to believe narratives that protect the status of their political, national, or cultural in-groups. The implication is that misinformation should be understood not merely as a failure of knowledge, but as a reflection of who we think we are, and who we think we are not.
Beyond Fact-Checking: Building Reflective Resilience
If misinformation takes shape through the intertwined forces of cognition, emotion, and identity, then responding to it requires more than simply correcting false claims after the fact. One place to begin is with how people are taught to engage with information in everyday life. Traditional media literacy often emphasizes broad skepticism, encouraging audiences to “question everything” or to focus narrowly on assessing sources. Yet this posture of indiscriminate doubt can be counterproductive. As O’Mahony et al. (2024) show, it may slide into a generalized cynicism that undermines confidence in accurate information as much as in falsehood. More promising are approaches that cultivate discernment rather than suspicion. Skills-based methods such as lateral reading, which train individuals to cross-check sources and situate claims within wider evidentiary contexts, have been shown to substantially improve people’s capacity to judge credibility in practice.
Finally, any long-term response must grapple with the emotional and social conditions that make misinformation appealing in the first place. Loneliness, declining trust in institutions and feelings of exclusion all make people more receptive to narratives that promise certainty and recognition. Efforts that strengthen community ties, create space for dialogue, and encourage civic participation may therefore be just as important for long-term resilience as any technical or informational intervention. Overall, these strategies point to a simple but demanding conclusion: countering misinformation is not only about improving the quality of information, but about tending to the human conditions through which people decide what, and whom, to believe.
Collective Consciousness in a Fragmented Information World
The challenge of misinformation is not simply about preventing deception. It is about preserving the conditions under which democratic societies can share a common reality. A society’s “collective consciousness”, its shared understanding of what is real and meaningful, relies on common reference points, trust in institutions, and norms of accuracy. When misinformation fractures those shared foundations, what erodes is not just factual knowledge but the capacity to deliberate, empathize, and govern together. Here lies the deeper threat: not only that people will believe false things, but that they will no longer believe the same things, even when those things are true.
To respond effectively, we must therefore shift the conversation. Misinformation is not simply an information problem but a social one, rooted in belonging, identity, and emotion. Countering it requires building cultures of reflection, humility, and shared responsibility for truth, not only online but in classrooms, communities, and political discourse. If democracies hope to thrive in an era of informational fragmentation, they must invest not only in regulating platforms but also in strengthening the psychological and social foundations of truth itself.
Any views or opinions expressed in articles are solely those of the authors and do not necessarily represent the views of the NATO Association of Canada.
Photo retrieved from GettyImages.