Canada, NATO, and the Global Threat of Disinformation: An Interview with Brigadier-General Jay Janzen

Ryan Atkinson, Research Analyst at the NATO Association of Canada, recently sat down with BGen Jay Janzen to discuss the threat of information warfare on Canada and global affairs.

Ryan Atkinson: Could you describe what your position entails as Director-General of Military Strategic Communications for the Canadian Armed Forces?

BGen Janzen: My main focus is to design and build a new strategic communications capability for the Canadian Armed Forces (CAF). Our desired end-state is the rapid delivery of impactful strategic communications effects in support of military commanders facing a complex and chaotic information domain.  

There are three main components to this effort. First, we need to design and build a Military Strategic Communication capability. We are quite good at communicating on routine issues, but much less effective at engaging in a deliberate, coordinated, and truly national way that helps achieve results aligned with Canada’s national interests. Second, we are working to enhance the culture, training and employment models of military public affairs officers and imagery technicians to be more effective in a rapidly changing and very complex operational environment. Finally, we are working to create deployable teams of military public affairs officers and imagery technicians that will work effectively together during military operations as part of a wider joint capabilities team.

There is a lot of culture change that needs to accompany these efforts. For example, our military planners will need to consider whether, in some cases, sending a cyber message might be as effective as sending a missile. Defeating an insurgent group decisively on the battlefield using weapons and tactics has proven very difficult. Perhaps we should spend more time focusing on how to undermine these groups’ will to fight, or their reasons for fighting.

Part of this may be achieved through kinetic means and weapons. But we also need to consider other tools, such as information. Can we leverage informational tools and techniques to cause insurgents to question their leaders, reject violence, or move on to more peaceful mechanisms to achieve change? I think we can.

RA: From your experience, what are the best methods to counter disinformation targeting the CAF specifically?

BGen Janzen: There are many lessons learned and many factors to consider when confronting disinformation. A good place to start is asking what an adversary or actor hopes to achieve by leveraging disinformation. You need to understand the intent of actors in the information space, and the space itself. For example, an adversary could simply be trying to consume your leadership’s decision time by throwing out twenty information attacks per day; it is easy to come up with conspiracy theories, fake information, fake pictures, and fake allegations. If you throw twenty of those at a government or a military per day, and that government has a process in place by which the senior-most people need to make all the decisions and be aware of every detail of every attack, you can imagine how much time gets used up. If you believe that is the main goal, you need to take steps to deal with these things at a much lower level.

Their goal could be to cause confusion amongst populations. In the case of the shooting down of Flight MH17 over Eastern Ukraine, the Kremlin probably had one goal inside Russia which was to control the viewpoints of their own audiences. Externally, I think they realized they weren’t going to be able to impose a Kremlin view on, say, audiences from Western Europe or North America, so instead they opted for an approach of confusion… trying to sow a belief that it was futile to even try to understand who was behind the attack because there were so many conspiracy theories, so much conflicting information, so much noise that it would cause people to say: “I don’t know what is true.” When you are faced with something like that, and I think this is one of the universal truths about confronting disinformation, often the best approach is not to react to what your adversary or other actors are doing. You need to reserve reactions for times when it is absolutely necessary.

Instead, your normal focus should be on your own narratives, on the elements you think are important to communicate into the strategic information environment. In the case of NATO, we are up against information and hybrid activity in places like Eastern Ukraine, Crimea, Georgia, Moldova, etc. Instead of focusing on everything that potential adversaries are doing, we need to focus on why NATO is important, why our unity is important, why our presence to deter potential hostile activity is important, and why it is important to reassure allies that Article 5 and the Washington Treaty mean something. NATO is the most successful military alliance of all time, and it is not going away, despite the wishes of potential adversaries.

RA: The integration of artificial intelligence with social media and internet bots will enable machine-driven communication tools that create individualized propaganda targeting people personally. How can the problem of disinformation be countered as technology becomes more sophisticated?

BGen Janzen: The main factor that overrides just about everything is that we are really talking about a chaotic environment that is characterized by constant change, evolution, and more information than we have ever seen before. The rate and volume of information available to people and the ability to leverage it in a very deliberate strategic fashion is unprecedented and creates potentially temporary windows of opportunity… like we saw with bots.

Bots in their current form are becoming less effective, but for a time they were quite useful, particularly for elevating the visibility of certain pieces of information, or for overwhelming specific accounts to drown out certain messages while amplifying others. Technology certainly allows you to do new things… for example, it allows you to target information by region, by profile, or directly at specific individuals. But we can’t forget that, at the same time, this is changing how people consume information.

On the one hand, vulnerability to this sort of thing has increased because people are looking at so much information so quickly and are not bothering to validate where it comes from. On the other hand, people are also becoming more skeptical in general. For example, it was much easier to scam somebody online seven years ago than it is today, when people are very cynical because they are aware these techniques are being used. I think that as we go forward, we are going to see this continual back and forth. As new techniques and technologies become available, they may be leveraged for a short period of time, but then the trick will be exposed, and the level of awareness will quickly rise. Actors will then have to look for the next new technique.

There is a lot of discussion and belief out there about certain actors, like the Kremlin for example, leveraging these disinformation tools to great effect. We have to remember that the benefit they are seeing is very short-term. Yes, they are able to mount troll or bot attacks that allow them to temporarily get their message heard and cause confusion over an incident. However, look at the long-term credibility and polling numbers on the views of non-Russian people vis-à-vis the intentions of the Kremlin. What you see is a real erosion of Russia’s international standing, as global citizens increasingly see Russia as an irresponsible actor in the world today.

That carries a significant long-term strategic cost, and it is one of the reasons why NATO and its allies are not following suit: we believe that in the longer term, things like the trust of your population and your credibility are, quite frankly, more important than short-term gains. Not to mention the importance we place on our over-arching values, which prohibit us from engaging in these types of activities.

RA: You mentioned the tactic where threat actors use disinformation to target decision making, which is similar to the Soviet-era theory of Reflexive Control. Are there similarities between the disinformation climate of today and that of the Cold War?

BGen Janzen: Things like Reflexive Control were developed a long time ago and they still apply today. Maskirovka deception techniques were developed years ago. The idea of mirroring, where you simply reflect back any accusations an adversary levels at you, has been used as a tool for a long time. I think the main thing that has changed is the information environment itself: the ability to reach far more people, with far more precision, at very little cost of entry. You don’t need to spend a lot of money to become a significant player in the information domain. That is definitely one thing that has changed.

The second thing that has changed is an increased awareness on the part of state actors that what is really effective is the ability to use levers and tools across the spectrum of state power simultaneously, in a coordinated fashion, to achieve results. I think this was going on in the Cold War as well, but there is a more deliberate effort today. Rather than just displaying economic and military power, states are thinking about how to use information to amplify their other levers.

For example, leveraging culture: in the Russian case, Orthodox religion and even entertainment. There is the phenomenon of the Night Wolves touring around and putting on shows that incorporate these various aspects, touching on Russian military and economic might, Russian history, and the differences between the Russian and Western worldviews. The cultural aspects are brought together to achieve particular effects. I think that was done during the Cold War, but it is being done more effectively today.

RA: During the annexation of Crimea, the question of whether what was observed was a new form of hybrid warfare was debated at the time. Do you think the hybrid warfare utilized by Russian forces in Crimea in 2014 was in fact new? Or are the technological means new, whereas the strategies being used through these means can be traced back to the Cold War?

BGen Janzen: There is nothing new under the sun. A lot of what we saw has been done before, and Western militaries have certainly spent a lot of time in recent years dealing with counter-insurgencies. There are a lot of similarities between how insurgencies attempt to create influence in an operational theatre and what a state actor might do in a hybrid construct. What they both have in common is a recognition of where their true strengths and power lie.

For example, if you are an insurgent group, you do not have the military power to fight the government head-on against all the capabilities it has, including its military, its police, etc. Similarly, if you are a power like Russia, you understand that you don’t have the military capability to take on the United States or the NATO Alliance; you are outmatched in terms of hard military and economic power. So where does the advantage lie? It lies in the pre-conflict phase, where things can be done covertly, where there is plausible deniability and a level of uncertainty. If we consider the invasion of Crimea as an example, the deniability and uncertainty created a certain level of paralysis among Western governments that the Kremlin was able to exploit.

I think actors like Russia, China, North Korea, and Iran recognize that because of the authoritarian nature of their governments, they are far more mobile and flexible, unencumbered by many of the checks and balances in place in the West. Such long-lived governments have the ability to think strategically, and their planning can extend over a decade or more, whereas Western governments, by the nature of democracy, are generally focused on three to five-year cycles. All of this adds up to a set of conditions that favour an authoritarian regime in a pre-conflict, hybrid type of engagement. They are able to maximize their advantage in this so-called ‘grey zone’, and this is the very thing the West and NATO need to come to terms with: how to minimize that advantage and ensure we remain successful, should adversaries continue to leverage such tactics.

RA: The Russian annexation of Crimea and the vast increase in anti-Western propaganda efforts have resulted in the rise of anti-disinformation entities like Stop Fake and Bellingcat. Are these institutions valuable and effective, and do they present any dangers?

BGen Janzen: I think this is a wonderful reaction that speaks to our strengths as NATO and as responsible, values-based democracies that believe in freedom of information and expression and in empowering our citizens. It is very encouraging to see these grassroots movements emerge in response to what ordinary people recognize as a very unpleasant and, in fact, abusive use of information, the Internet, and social media, and to see this visceral reaction against it: this desire to expose it and get the facts out there in spite of overwhelming and, in some cases, state-sponsored efforts to repress them.

I spoke just a few minutes ago about the West coming to terms with this. I really believe this is part of the solution: how do we make our society and our citizens more resilient to these types of attacks, and how do we empower them to be the ones on the front lines, exposing false information, providing what they view to be the facts, and enabling societies in general to access better information and conflicting views, so people can make more educated and informed decisions for themselves?

RA: Reportedly, Russian actors created a fake version of Stop Fake. Could you comment on the effect that setting up a fake organization has on countering disinformation? Does it threaten the influence of these anti-disinformation groups?

BGen Janzen: This speaks to an overall Kremlin tactic in the world of disinformation called “mirroring”, where you look at what your adversary is doing successfully and flip it on its head to use against them. Stop Fake rose as a grassroots movement, much like Bellingcat did, in response to disinformation. It was started as a result of the conflict in Ukraine, particularly the manipulation and falsification of imagery to support pro-Russian narratives using absolutely false information. The Kremlin saw that Stop Fake was quite effective, that being exposed as lying was hurting their campaign, and concluded that the same technique could be turned to their own use.

RA: Could you talk about how the distinction between individual actors and groups pursuing disinformation for different purposes can make attribution difficult? Can the distinct characteristics of more guerilla-style disinformation versus state-sponsored efforts help with attribution?

BGen Janzen: This is an incredible challenge. Social media has levelled the playing field: the barriers to entry are so low that a single individual with the right equipment and training can enter this game, and it is often very difficult to attribute where information is coming from. Not only are there attribution issues for cyber reasons, because it is quite easy to cover the origins of an attack, but the sheer number of players that want to get into this game makes it quite difficult to parse it all out.

There is another layer as well. In the West, you obviously want to understand who the actors in the information space are, what they are up to, and what their motives are, but we have legal obligations governing how much information we can collect on individuals, who is authorized to do that, and under what circumstances. So, at times, folks like me in the public affairs and strategic communications world simply don’t do that. We monitor the information space in general. We want to be aware of trends, but we do not collect information on individuals to understand who they are. That is left to law enforcement or intelligence agencies with the right authorities. We are not in that game. So, attribution is a problem.

RA: Various NATO member states have been taking distinct actions to counter disinformation, including passing bills and creating specific task forces targeting the threat. What specifically is the CAF doing to counter the problem of disinformation?

BGen Janzen: What we are worried about within the Armed Forces is obviously our ability to operate around the globe on future operations, should the government decide to send us. We need to be able to get our message out to our citizens, to global citizens and allies, and to people within the areas where we will operate, to win their trust so that they understand why we are there and what we are doing. We understand that our adversaries will seek to cause us problems in each of those areas: trying to turn our allies against us, eroding our own population’s support for what we are doing, and trying to disenfranchise the people on the ground amongst whom we are operating. We need to be prepared for that. We are not going to use the same tools and techniques our adversaries use. We are going to find ways that fit within our own values and our own ways of doing things, so that we effectively counter them and effectively communicate our intentions.

In terms of what we are going to do here in Canada, our role is much more limited because other government departments have the lead in that area, but we will be mindful, particularly regarding military subjects. We will be looking for disinformation and unusual trends in the information environment, flagging those to the relevant authorities, and perhaps providing some advice, recognizing that within Canada the overall responsibility for communicating with citizens and for information writ large rests with the government. We will communicate within our own mandate, but we are not policing the Canadian information environment.

RA: Do you believe that this year’s upcoming Federal Election in October 2019 will be targeted by threat actors and, if so, what do you think such an incident could look like?

BGen Janzen: It is difficult to say. Certainly, Western elections have been targeted; that is clear. It remains to be seen whether, and to what extent, Canadian elections will be targeted. I talked about how certain tools and opportunities are available for a time and then become less useful because the gig is up and people are onto the tricks that have been used. A whole lot of people are now watching for the types of election interference we have seen in the past, such as the creation of fake personalities and the leveraging of divisions within society. Because so many people will be looking for that, a smart adversary might not do it, or might look for another way. So the short answer is I don’t know, but I do think that going forward we need to be mindful of the potential.

The key thing we need to do as democracies is engage in dialogue with our citizens, with non-government organizations, with the media, and with political parties before there is some kind of crisis or intervention, to talk through how we as a society should deal with these kinds of incidents. I don’t think we can predict when the next attack will be or what it will look like, but I am certain that there will be some form of interference in the future and we need to talk about it so that we are ready.

RA: Would you say that one problem is that it takes being the victim of a cyberattack or disinformation campaign before the correct actions are taken?

BGen Janzen: We could have a very long philosophical discussion about how societies deal with risk. There is a very good book on this by M.V. Rasmussen called The Risk Society at War. It addresses this very problem: governments shouldn’t create policy for national, environmental, and economic security based only on what has happened to them or what has just happened to another country; rather, they have to look at everything in terms of risk, meaning the likelihood that something will happen and, if it did happen, the magnitude of the impact on society. If something is relatively unlikely to happen and has quite a low magnitude, it is probably not worth dealing with. But if something is unlikely yet the magnitude is so extreme that it would be the end of your society, then even though it is unlikely you ought to prepare in some way to deal with it.

Of course, if a threat is both likely and deadly, then you had better be ready to deal with it. I think we need to approach these things this way: not necessarily looking only at what has already happened, but trying to be a little bit predictive. Given the recent security situation and the trends that are out there, what is likely to happen next? How likely is it to be targeted at us versus someone else? What are the implications? Then prioritize our responses accordingly. That is what military planners do, and it is what governments are trying to do, and we need to keep doing it and get better.
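The likelihood-versus-magnitude reasoning the General describes can be sketched as a simple prioritization exercise. The threat names and scores below are hypothetical illustrations, not an actual assessment:

```python
# A minimal sketch of risk-based prioritization: score each threat by
# likelihood times magnitude, then rank. All entries are hypothetical.

def risk_score(likelihood: float, magnitude: float) -> float:
    """Expected-impact style score: likelihood (0-1) times magnitude (0-10)."""
    return likelihood * magnitude

# Hypothetical threats: (likelihood, magnitude)
threats = {
    "routine phishing": (0.9, 2.0),        # likely, but low impact
    "election interference": (0.4, 8.0),   # less likely, high impact
    "minor website defacement": (0.2, 1.0),
}

# Rank by score; note that a low-likelihood, high-magnitude threat
# can outrank a likely-but-minor one.
ranked = sorted(threats, key=lambda t: risk_score(*threats[t]), reverse=True)
print(ranked)
```

The point of the sketch is the ordering: under this scoring, the less likely but high-magnitude threat rises to the top of the response priorities, matching the book's argument that magnitude can justify preparing for unlikely events.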

RA: What preventative measures can the Canadian public take to become more aware of these problems? You mentioned that the more a tactic, like bots for example, is used, the more skeptical the public becomes. What should the Canadian public watch out for, and even avoid, in preparation for October’s election, in order to make these campaigns less effective at the level of the individual?

BGen Janzen: If there is one thing I would like the Canadian public to consider it is information hygiene. What I mean by that is when you see questionable information on your social media feed, don’t just pass it on or let it go if you are not sure. Consider where the information is coming from and how reputable the source of that information is. If you are not sure, then do a little bit of your own checking before you further distribute, act on, or form an impression or idea on a particular piece of information.

I think that because of the overwhelming mass of information we are all exposed to daily, we have become a little lax about doing that compared to the past. Obviously, if it is a video of a dog jumping through a flaming hoop, it is probably not that important to determine whether it is real. But if it is something that could be a major decisive factor in our society, or something with major national security implications for our country as a whole, then before people get too inflamed, react, or pass that information on, they really should stop and do a little bit of checking. If they doubt its authenticity, or are able to determine that it is not real, they should call it out and expose it, much like the grassroots groups are doing in other parts of the world. That is how we are really going to be effective as a society in dealing with this problem.

There is one other thing I would add, a second wish: that Canadians be mindful enough to seek out views opposing their own beliefs and consider them with an open mind. Another trend, one we cannot blame on our adversaries but only on our own society and technology, is the idea of echo chambers: the fact that we are able to customize and filter the information that comes to us so that it only comes from our friends, our idols, celebrities, and the commentators we agree with. These echo chambers are leading to the polarization of society along political, cultural, and other lines. If we let that go unchecked, it will not be helpful, and it creates vulnerabilities that actors can exploit in the information realm. But if we try to remain open-minded… trying to understand people with opposing points of view, and to find compromise by being the types of people who can have intelligent dialogues, we will be much more resilient to information attacks.

Disclaimer: Any views or opinions expressed in articles are solely those of the authors and do not necessarily represent the views of the NATO Association of Canada.

Ryan Atkinson

About Ryan Atkinson

Ryan Atkinson is Program Editor for Cyber Security and Information Warfare at the NATO Association of Canada. Ryan completed a Master of Arts in Political Science from the University of Toronto, where his Major Research Project focused on Russia’s utilization of information and cyber strategies during military operations in the war in Ukraine. He worked as a Research Assistant for a professor at the University of Toronto focusing on the local nature of Canadian electoral politics. He graduated with an Honours Bachelor of Arts in Political Science and Philosophy from the University of Toronto. Ryan conducted research for Political Science faculty that analyzed recruitment methods used by Canadian political parties. Ryan’s research interests include: Cyber Threat Intelligence, Information Security, Financial Vulnerability Management, Disinformation, Information Warfare, and NATO’s Role in Global Affairs.