The case for a ‘disinformation CERN’

May 22, 2021

Democracies around the world are struggling with various forms of disinformation afflictions. But the current suite of policy prescriptions will fail because governments simply don’t know enough about the emerging digital information environment, according to Alicia Wanless, director of the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace.

Speaking in a panel discussion on how democracies can collaborate on disinformation at ASPI’s grey zone and disinformation masterclass last week, Wanless went on to say that what we really need is ‘a disinformation CERN’—a reference to the international particle physics research organisation, where countries pool their resources to operate the Large Hadron Collider, study results and share findings. The scale and reach of the disinformation problem is so huge that only research cooperation of this kind can address the shared global threat to information systems.

Our democratic societies are doomed to decline if we don’t make major efforts to arrest the effects of disinformation, said Wanless. Fellow panel member and resident fellow at the American Enterprise Institute, Elizabeth Braw, agreed that democracies are in the middle of a generalised disinformation crisis.

At the same time, incentives to act may be blunted as democracies become numb to a multitude of cascading political crises driven by disinformation. These are having a global-warming-type effect on our political and cultural ecosystems—disinformation is turning up the temperature and toxicity of public discourse, but it also perpetuates denialism about the problem of disinformation itself.

Wanless explained that there are two major areas of research shortfall that democracies need to address. The first is how disinformation flows around global, national, local and individual information landscapes, for example, among news, social media and private messaging apps.

The second gap is in our understanding of both its short- and long-term impacts. Do disinformation campaigns change election outcomes? What’s the relationship between disinformation and politically motivated violence? And what might be the effects on the health of political systems over months or years of disinformation? Wanless noted that from an academic standpoint, most theories of communication are stronger on accounting for transmission but very weak on effects.

In addition, there are yawning knowledge gaps on the effects of disinformation countermeasures. For example, said Wanless, there are very few credible studies on the effects of de-platforming disinformation spreaders. Does it help in limiting disinformation? Or do the perpetrators just move underground to more niche platforms, where followers can be further radicalised and exhorted to violence? To help answer these questions, Washington DC’s Capitol insurrection of 6 January needs to be examined more closely.

The other problem for research is that private companies hold most of the relevant data and are unwilling to share it widely. The platforms regard their data as valuable proprietary information and to date have only been willing to share small amounts with handpicked research institutions on particular cases.

A well-funded, multinational research effort could help spearhead a broad-based, collaborative approach with the digital information industry that holds the bulk of data on information transmission and user behaviour. The big search engines, social media platforms, television networks, public broadcasters and newspapers of record should all be included.

On the question of how much such research would cost and who would lead it, Wanless said she’s costed a number of models that start from US$10 million per year for basic research and rise from there. Given the cost of disinformation to economies and societies—how much has Covid-19-related disinformation alone cost in terms of loss of life and income?—it seems like a minuscule investment compared with what Western democracies spend on military hardware.

Wanless believes that platforms should in some way be involved in funding this research and that discussions around taxes on them should be taking this into account. But the effort should probably be led by academic institutions and civil society rather than the national security community.

Braw agreed with Wanless that better research is critical, but so is building whole-of-society resilience, starting immediately. If this isn’t done, responses to disinformation crises risk continually exacerbating their initial effects, until societies are caught in a spin-cycle of chaotic reaction.

Democracies need to get out of their defensive postures. Disinformation cannot be beaten with de-platforming and labelling. We need to get better at public messaging and be in constant preparation for crisis communication. When Covid-19 hit, governments should have been ready to go with public communication and planning for food, water, energy and fuel shortages.

A good example of multilateral cooperation and public communication on a grey-zone crisis, said Braw, was the 2018 poisoning of former Russian double agent Sergei Skripal and his daughter Yulia using the chemical weapon Novichok in the UK. The UK was able to quickly stand up an informal alliance of countries that expelled Russian diplomats and censured and sanctioned Moscow.

Companies are on the front line of disinformation and grey-zone operations, and they need to be consistently involved in a whole-of-society response. But it’s important to note, according to Wanless, that the private sector is part of the problem. There’s money and power to be generated by inflaming fear and uncertainty.

Braw waxed nostalgic about the early days of social media—visiting the offices of Twitter when it was just a handful of guys and a few computers. Governments completely failed to see how these platforms would transform politics, change the nature of governance and even threaten democratic institutions.

To add to the challenge, domestic political actors are increasingly getting in on the disinformation action and have no real incentives to neutralise its effects.

In terms of constraints, international law is much too vague on the subject of propaganda and there are no strong agreed guidelines that platforms can implement. So while state regulation may be an old-fashioned, ‘European’ response, said Braw, it’s probably the only effective way forward. Building a multilateral approach to regulating a decentralised, global information space will be the critical factor for success in the fight against disinformation.

This article was published by The Strategist.

One Comment

  1. Alan Stevenson

    May 23, 2021 at 10:42 am

    After leaving school I joined the RAN as an electronic engineer. While there I came across a number of people who, in any discussion, took the negative line. It appeared that whatever was being discussed, they had to take an opposing view. Facts and logic took a back seat. When the flaw in their reasoning was pointed out, they just retreated to another line of attack. It quickly became apparent that neither facts nor logic had any bearing on their opposition.

    Some people prefer to counter popular concepts merely for the sake of opposition—they can neither be convinced nor accept ideas counter to their stance. It is unfortunate that some folk like this are also relatively intelligent and can convince others to their way of thinking. This, I believe, is a psychological problem, possibly associated with their upbringing (the Australian classic film ‘The Castle’ being a great example).

    At parents’ meetings of our local (private) school, some parents came out with ideas and suggestions so far-fetched that the rest of us were unable to politely refute them or suggest alternatives. People take in facts, figures and knowledge in various ways, and then our brains appear to misinterpret them in light of previously accepted information.

    Most people have memories which differ from those of others involved in the same activities at the same time. We store information in ways which are still not completely understood but which seem to involve association rather than separate concepts.

    One of the problems with internet disinformation is that it is difficult to distinguish misinformation from misunderstanding, or even miscommunication, due to the complexities of the language in which the information is conveyed. If we have to rely on the intelligence of the recipients to determine the truth of an idea, then we really are in troubled waters.