Information warfare is evolving at an unprecedented pace. The digital landscape, marked by rapid shifts such as the acquisition of Twitter (now X), the Cambridge Analytica scandal, and targeted disinformation campaigns, has exposed the growing vulnerability of democracies, particularly in the Global North. NATO distinguishes between “misinformation”—false or inaccurate information spread without malicious intent, though still harmful—and “disinformation,” which is deliberately spread to manipulate opinions and actions. Together, these forces advance rapidly across platforms, eroding the foundations of trust in our institutions, and they demand immediate attention.
For Canada, the stakes are especially high. With the passage of the Online News Act in 2023, a growing chasm has developed in the Canadian media landscape. Corporations like Meta (owner of Facebook, Instagram, and WhatsApp) responded by blocking the sharing of Canadian news content, significantly disrupting how Canadians consume information. The Act was intended to ease the financial pressures facing media outlets by redirecting revenue towards Canadian publishers, but Meta’s response has inadvertently left many Canadians without direct access to reliable news. The Media Ecosystem Observatory reports that in the year after Meta’s news ban, engagement with local Canadian news outlets dropped by more than five million – a 58% decline – across social media platforms. In turn, this exacerbates the information gap, leaving Canadians vulnerable to alternative narratives, often driven by algorithms and mis/disinformation.
The Toxicity of the Social Media Landscape
The toxic nature of the current social media environment cannot be overstated. Terms like “authenticity” (Merriam-Webster’s Word of the Year for 2023) and “polarization” (for 2024) reflect the deepening divide within Canada’s social fabric. Social media platforms, often shaped by algorithms, have created spaces that fuel echo chambers and amplify division. As users increasingly seek alternatives to escape the toxicity, Bluesky has emerged as one such platform. With its promise of an uncensored, algorithm-free experience, it’s drawing users who want to curate their own feeds and engage in more authentic exchanges. However, this shift also raises serious concerns about the fragmentation of discourse.
This intensifies the divide between users on different political sides, making cross-communication and consensus-building more difficult than ever. What should be a shared public platform has instead become a series of fractured, polarized spaces, further undermining the possibility of meaningful conversation. Combined with mis/disinformation campaigns, the risk is that users who retreat into these ideological silos become more entrenched in their views and less willing to engage with those who disagree.
The Shadow of Russian Disinformation: Tenet Media’s Alleged Connections
The dangers of a fragmented and unchecked social media environment are evident in the recent controversy surrounding Tenet Media, a far-right media outlet founded by Canadian influencer Lauren Chen and her husband, Liam Donovan. Allegations surfaced that Tenet Media was involved in a covert Russian propaganda campaign aimed at amplifying pro-Kremlin narratives. The U.S. Justice Department unsealed an indictment accusing two Russian nationals of setting up Tenet Media as a front for Russian propaganda, with connections to high-profile far-right figures like Lauren Southern. Whether it is Russia’s use of social media to spread disinformation or the influence of domestic actors promoting divisive content, the digital landscape has become a battleground for competing narratives. Richard Fadden, former director of CSIS, stated: “If you think naively that the Russians don’t care as much about influencing Canadian thought, penetrating the Canadian government at all levels — I’m here to tell you they care deeply about shaping how you think, how you vote and sowing chaos and discord.” Canadians are left to navigate this complex web of information, much of it perpetuated in spaces where moderation and accountability are minimal or non-existent.
The Way Forward: Evolving Policies and Safeguards
Policies and safeguards against mis/disinformation must adapt as the social media landscape evolves. The EU’s second annual disinformation report reviewed over 750 cases, showing how mis/disinformation targets public trust and fuels division. Josep Borrell, the EU’s chief diplomat, called this a new form of warfare: “It does not involve bombs that kill you, but words and ideas that colonize you.” The weaponization of information threatens global stability.
To combat mis/disinformation in Canada, a multifaceted approach is needed, including stronger social media regulations, improved digital literacy, and proactive policies. Canada could partner with the Privacy Commissioners to launch a nationwide campaign, teaching critical thinking and online safety in schools and universities. Targeted social media ads could promote digital literacy and guide users in identifying and reporting false information. This strategy would hold platforms accountable and empower citizens to navigate the digital landscape responsibly.
Further, Canada should develop its own Digital Services Act (DSA), similar to the EU’s legislation, to hold platforms accountable. The Canadian DSA could enforce transparency in content moderation, require platforms to promote fact-checking and issue warnings about misleading content, and ensure algorithms don’t amplify harmful narratives. Platforms failing to comply could face penalties, incentivizing responsible behavior.
In the era of Trumpism and deregulation, social media is undergoing seismic shifts. Mark Zuckerberg’s introduction of Community Notes, which replaces third-party fact-checking, and Elon Musk’s hands-off moderation on X place the responsibility of identifying mis/disinformation on users. This raises significant concerns about the spread of false information, the erosion of trust, and the potential for further polarization. These moves highlight the growing power of U.S. tech giants in shaping online discourse and policy making, while also underscoring the need for more robust regulation and accountability.
Policy recommendations such as a Canadian DSA could face pushback from influential U.S. tech leaders like Musk, whose growing power in the U.S. government challenges international regulation. Yet nations like Brazil, which temporarily banned X for failing to moderate harmful content, show that decisive action is possible.
Canada must confront these challenges, balancing the fight against mis/disinformation with the realities of a globalized digital ecosystem. By prioritizing transparency, accountability, and collaboration, it can lead by example and safeguard democracy in the digital age.
Photo: “Close-up of a hand holding a smartphone displaying various social media app icons on a dark background” (2019). By Magnus Mueller via Pexels.
Disclaimer: Any views or opinions expressed in articles are solely those of the authors and do not necessarily represent the views of the NATO Association of Canada.