Introduction
As we enter 2022 full of hope that the worst of the pandemic is behind us in the UK, policymakers will start to consider what lessons have been learnt and what challenges we may still face. Since March 2020, a team of historians and digital humanities experts, with funding from the AHRC, has been focused on elucidating the role of rumour and conspiracy during epidemics and pandemics. By gathering a large sample of tweets pertaining to Covid-19 and comparing it with a variety of historical sources dating back to the 16th century, this project has yielded timely and important insights that can contribute to the ongoing national conversation and point towards policy solutions.
As increasing numbers of people turn to the internet with their health concerns, the potential impact of the circulation of misinformation and disinformation online has seldom been as serious as during the Covid-19 pandemic. Indeed, polling suggests that around a third of the UK population has encountered false or misleading information about the virus and the measures to prevent its spread. From the perspective of government and public health authorities, with vaccine uptake rates and public health at stake, the circulation of rumours and conspiracies demands legislative attention. The central challenge that policymakers face is distinguishing between the perennial and the historically contingent features of conspiracy generation and circulation. If rumours and conspiracies about the pandemic are primarily a consequence of timeless elements of human psychology, the problem will never be avoided altogether, and policies that reflect a mitigative rather than a preventive approach should be pursued. If, on the other hand, the scale of the problem is contingent upon innovations unique to our time, such as social media, then policies might aim at preventive regulation. The historical record is a powerful means of distinguishing between behaviours that are rooted in human psychology and transcend contemporary circumstances, and those features of the phenomena we are witnessing today that are genuinely new.
Misinformation and Disinformation: The Scale of the Problem
The pandemic was accompanied by a deluge of misinformation (that is to say, misleading information unintentionally presented as fact) that ranged widely, both in its divergence from the orthodoxy and in the degree of credence given to it. At one end of the spectrum, we encounter unverified but plausible speculation, such as the accidental ‘lab leak’ theory, initially treated as misinformation but more recently deemed ‘plausible’ by a US intelligence report. At the other end are the theories that Bill Gates is using the vaccine to implant microchips in people, that the disease is caused by the rollout of 5G, and that lockdowns and Covid vaccines are part of a plot by global elites to depopulate the globe and impose totalitarian control. According to recent research by the University of Oxford and the Oxford NHS Trust, in England around half of the population appears to endorse such conspiracy theories to some extent, with 10% showing ‘very high levels of endorsement’. This research also suggests that endorsement of these beliefs corresponds to reduced levels of compliance with public health measures and vaccination. Misinformation about the pandemic, it seems, is both abundant and consequential.
Disinformation (defined as false information that is spread deliberately) has also posed a serious threat during the Covid pandemic. Although disinformation campaigns launched by foreign regimes are nothing new (they were an important feature of the Second World War, the Cold War and other conflicts), the internet now allows foreign and political actors to evade gatekeepers and insert themselves directly into our national discourse by posing as citizens or genuine online communities. According to a recent report by the European External Action Service, Russia and China have deliberately sought to undermine confidence in Western-made vaccines in order to promote their own state-made alternatives, and to sow doubts about institutions like the European Medicines Agency. Disinformation is particularly pernicious, as it is engineered specifically to maximise distrust and disunity in a population.
This leads us to consider just how grave a threat mis- and disinformation pose, which is crucial to the calibration of an appropriate policy response. While misinformation has been widespread, the public has largely complied with three unprecedented national lockdowns, despite assumptions in government and elsewhere that such measures would not be accepted. The vaccine rollout has also been largely successful, with Britain leading the world in its vaccination efforts for much of 2021. Moreover, studies of misinformation are notoriously difficult, for it is hard to ascertain the spirit in which people share it, whether from deep-seated belief, for satirical purposes, to debunk it, or as an indication of the supposed gullibility of other people. Even if people ‘believe’ in what they post online, it is not clear that this always translates into action.
On the other hand, if a small minority of the population is willing to act on their beliefs, the effect can be disproportionately damaging. For instance, more than seventy arson attacks on mobile phone masts have taken place across the UK, motivated by the idea that 5G causes Covid symptoms, while anti-vaxxers continue to take various forms of direct action, including picketing schools, threatening healthcare workers and even opportunistically menacing the leader of the Labour Party. The pandemic has also led to widespread scapegoating, with anti-Asian hate crimes rising significantly in the early months of 2020. Most importantly, marginal changes in the vaccination rate can mean the difference between life and death for many thousands of people. Recent reporting indicates that around three quarters of patients in intensive care units with Covid infections are unvaccinated. Given that the herd immunity threshold for Covid appears to be around 80% of the population at present (although this figure rises with each new variant), the final 20% of the population who remain unvaccinated play a disproportionately important role in allowing the virus to persist. While there are those who cannot be vaccinated for legitimate health reasons, misinformation clearly contributes to vaccine hesitancy. It is, therefore, imperative to find a legislative response to the circulation of rumour and conspiracy in this country, and historical context is needed to ascertain the precise nature of the problem and to inform which policy approach to take.
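To illustrate the arithmetic behind the herd immunity figure cited above (a minimal sketch using the standard epidemiological formula, with reproduction numbers assumed for illustration rather than taken from this report's sources): for a pathogen with basic reproduction number $R_0$, the classical herd immunity threshold is
\[
\mathrm{HIT} = 1 - \frac{1}{R_0}, \qquad R_0 = 5 \;\Rightarrow\; \mathrm{HIT} = 1 - \tfrac{1}{5} = 80\%, \qquad R_0 = 7 \;\Rightarrow\; \mathrm{HIT} \approx 86\%.
\]
A more transmissible variant raises $R_0$ and, with it, the share of the population that must be immune, which is why the threshold rises with each new variant and why the unvaccinated minority weighs so heavily on whether the virus continues to circulate.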
A Perennial Problem?
Misinformation is far from a novel threat. George Orwell was grappling with what we would now call ‘post-truth politics’ more than eighty years ago, and our research project has found historical precedents for virtually every pandemic-related rumour and conspiracy theory currently in circulation. Striking similarities can be seen between past and present theories about the origins, transmission, and treatments for pandemic diseases. For much of history, plagues were interpreted as acts of divine punishment for collective sins, and, though few people now believe in divine punishment, the Covid pandemic has been moralised in a similar way, with some blaming abusive animal agriculture or excessive international travel. Even the seemingly modern notion that the virus is a bioweapon finds its counterpart in the medieval belief that the plague was caused by Jews poisoning communal wells. The quarantine legislation of 1721 aroused accusations of despotic governance familiar to us today, and anti-vaccination marches and publications resisted the attempts of late Victorian governments to combat smallpox.
Vaccination, which is, after all, an invasive medical procedure, has attracted particularly strong opposition ever since Edward Jenner’s discovery of smallpox vaccination was first publicised at the turn of the nineteenth century, with critics describing it as a ‘pernicious and baneful experiment’. The fear that vaccines would be used to sterilise people is nearly two centuries old, as are the aspersions cast against their ingredients and efficacy. Government strategies to overcome this resistance have an equally long history and encompass a range of coercive and persuasive approaches, from mandating smallpox vaccination in the latter half of the nineteenth century to publicising the vaccination of Prince William and the Health Minister’s daughter in the early 1980s in order to encourage uptake of the Diphtheria, Tetanus and Pertussis (DTP) vaccine.
The remarkable continuity that can be seen in the scapegoating, anti-elitism, and conspiratorial thinking surrounding pandemic diseases through the ages points to human psychology as the common denominator. It is often difficult to accept that frightening and potentially lethal threats can appear randomly. Collective frustration about the relative powerlessness of ordinary people facing a virus can give rise to various fantasies about human agency in causing or preventing illness. Tamotsu Shibutani describes this psychological tendency as a ‘sense-making activity’. Historical evidence reveals that we have always been susceptible to cognitive biases that make conspiracy theories seem more plausible: for instance, the ‘hyper-sensitive agency detector’, the tendency to attribute events to deliberate intention rather than random chance, and ‘confirmation bias’, the tendency to seek out evidence that supports our pre-existing beliefs to the exclusion of evidence to the contrary. Significantly, conspiracy theories offer people an explanatory framework in which their sometimes illegal non-compliance can seem justified, and even heroic, thus reflecting positively on their character.
The Mitigative Policy Paradigm
If rumours and conspiracies are in large part a perennial response to pandemics, arising from psychological needs and desires, policymakers should aim to mitigate, rather than attempt to eliminate, what is an inevitable occurrence. This requires a shift from a preventive approach, in which rumours are suppressed (which invariably makes them seem more legitimate to conspiracists), to a mitigative approach, in which the authorities recognise these cognitive tendencies and make pacifying alterations to the tone and manner in which official information is communicated.
Preventive approaches, such as blanket censorship and debunking measures, have seldom if ever been successful. In the past, censorship was often more performative than effective, a means to signal official disapproval, and it could even backfire by drawing further attention to the offending words or by driving the discussion underground. This can be seen in the current pandemic, with the growth of private messaging groups and encrypted communication. Censorship also risks smothering the truth, for example when Li Wenliang, a young doctor in Wuhan, tried to warn people about a new disease that was spreading in his hospital in December 2019. The Chinese authorities deleted his warnings and accused him of spreading false rumours, forcing him to sign a statement to that effect. He died of Covid shortly afterwards.
Debunking misinformation also has its challenges. On the one hand, ignoring a rumour can give the false impression that the authorities are unable to refute it. On the other hand, officially debunking rumours can give the most extreme ideas an air of legitimacy and more attention than they merit. The key to this seemingly intractable situation lies in realising that effective persuasion has more to do with identity than information. Information alone is insufficient, as Dan Kahan and colleagues have pointed out in relation to climate change: believers in anthropogenic global warming turn out to be no better informed about the basic science than deniers are. They argue that people adopt certain beliefs because those beliefs convey values and confer status within their peer groups. As such, they note that ‘merely amplifying or improving the clarity of information… won’t generate public consensus’ if communicators are deaf to the cultural cues that might gain acceptance for their message.
The UK government’s effort to increase vaccine uptake in 2021 offers a case study of effective persuasion, in which the identity of the persuaders and those to be persuaded was of paramount importance. As the vaccine rollout progressed, the statistics revealed that vaccine uptake was lowest among ethnic minority groups, with 75% of black African and 66% of black Caribbean individuals having received one dose of the vaccine, as compared with 94% of white British individuals. In response to these findings, the Prime Minister authorised a highly targeted communications campaign from government and medical professionals, working in tandem with broadcasters and celebrities from ethnic minority backgrounds. An example of this can be seen in the ‘roadblock’ tactic deployed in February 2021, in which a synchronised release by all the major television broadcasters attempted to maximise the viewing figures for a video addressing cultural concerns about vaccination, hosted by celebrities like Moeen Ali, Romesh Ranganathan, Meera Syal, and David Olusoga. This campaign contributed to a rise in vaccination rates among ethnic minority communities, demonstrating the importance of identity in persuasive communication. Crucially, the campaign did not simply dismiss underlying, widely shared fears about technology and surveillance, but instead detached this issue from the question of vaccination.
This campaign, like many policies during the pandemic, was informed by behavioural insights of the kind already widely used in commercial advertising. One of the core premises of behavioural science is that there are strong continuities in the ways in which people react to a given message, which allows the results of large trials to be extrapolated into public policy. This more or less defines the mitigative approach to policymaking, namely an approach that seeks to accommodate rather than eliminate certain continuities in social psychology. The mitigative approach is exemplified by the highly successful work of the Behavioural Insights Team (BIT), also known as ‘the Nudge Unit’. The BIT was founded within the UK Cabinet Office in 2010, largely in response to the work of behavioural economist Richard Thaler, whose ‘nudge’ theory has given rise to government measures that seek to secure behavioural change through subtle alterations to the environment in which a choice is made. The BIT’s subsequent expansion into a social purpose company working with governments around the world suggests a growing awareness among policymakers of the efficacy of policies informed by evidence-based behavioural science research. For example, one of the BIT’s recent trials in the US found that four carefully crafted messages could increase Covid vaccine uptake by tapping into deeply emotive narratives such as ‘helping loved ones’ and ‘getting lives back’. However, there is some debate over whether government tactics that alter behaviour through changes to an individual’s choice environment, rather than through open argument and persuasion, can ever truly conform to the ideals of transparency and accountability. Using such approaches also risks fuelling the very conspiracies about elite rule and supposed ‘brainwashing’ that counter-disinformation policies seek to mitigate.
This points to one of the blind spots of government efforts to counter mis- and disinformation, namely that these efforts themselves can contribute to the problems they seek to address. This highlights the need for mitigative approaches to be more self-aware and tonally sensitive. An example of this issue can be seen in the ‘rumour as disease’ metaphor, frequently used in official interventions. The ‘disease’ metaphor of misinformation, which runs from the sixteenth-century writings of Theodore Beza, through the rumour ‘clinics’ of the Second World War, to the World Health Organisation’s use of the term ‘infodemic’, evokes an apparently vulnerable and feeble-minded public in need of protection by those who know better. This framework aggravates the underlying causes of suspicion and distrust in the authorities, especially the belief that educated elites patronise ‘ordinary’ people and disrespect their values. Gareth Millward’s recent article on vaccine hesitancy among ethnic minorities raises similar concerns, arguing that identifying communities as ‘problems’ requiring ‘intervention’ only serves to increase their distrust of the government. In summary, when it comes to information policy, mitigative solutions enjoy considerably more success than preventive solutions, as they operate on a detailed understanding of human psychology. However, proponents of the mitigative approach must take care to consider how their interventions are perceived by those they seek to help.
What is Unique to Our Time?
While there are many striking similarities between historical and contemporary conspiracy theories and misinformation, there are also some noteworthy differences that follow from the particular nature of this virus, the unique elements of the political and media landscape, and the scope for variation in people’s responses to the threat. Unlike in many past plagues and pandemics, science and medicine today enjoy significant and well-deserved prestige in our society. The NHS is one of the most trusted institutions in the United Kingdom, and there is overwhelming consensus in the medical profession as to the causes and nature of the disease. Michael Bresalier’s comparative analysis of Covid-19 and the 1918 ‘Spanish flu’ highlights the importance of contemporary information exchange and international scientific collaboration, although admittedly globalisation has led conspiracies to assume global proportions more readily than in the past.
Other differences between the present and the past are much more concerning. For instance, the internet has undermined the business model and status of traditional media, eroding public trust in the few information sources that still claim to uphold journalistic standards. The resultant fragmentation of public discourse is exacerbated by social media algorithms which create filter bubbles that can reinforce people’s views, in what is often referred to as ‘the echo-chamber effect’. Moreover, algorithms tuned to maximise user engagement have learnt the power of controversy and become adept at pitting groups with very divergent views against one another, as seen in the last US elections. Social media also offers direct access to foreign disinformation campaigners wishing to sow discord in a targeted society. For example, a recent study has found that foreign disinformation on Twitter leads, for the median country, to a 15% increase in negative tweets about vaccines. Social media has also been used as a crucial tool in organising sometimes violent displays of dissent. There are, it seems, features of society today that are unique, both positive and negative, and this suggests there is a place for regulatory interventions aimed at preventing some of these modern problems from recurring.
The Preventive Policy Paradigm
Social media companies have often attempted to shift attention from their role in spreading mis- and disinformation by overplaying the case for historical continuity in the divided public sphere. Mark Zuckerberg, for instance, points out that political polarisation has been a feature of the US since before he was born. While this may be true, it does not mean that aspects of modern politics and media, especially social media, do not exacerbate existing problems. This offers a tempting target for policymakers looking to prevent a recurrence of online mis- and disinformation spreading at the scale seen during the last few years. But regulating this kind of technology is seldom simple.
Polling suggests that regulating social media is a popular option in the UK, with around 80% of the public agreeing that false stories about coronavirus should not be posted or shared on social media. Nevertheless, initial attempts at censoring misinformation on these platforms have started an arms race between the censors and those who spread misinformation. For instance, when forced to abandon particular hashtags, people simply adopt new ones. Consequently, the debate has, to some extent, been driven into harder-to-censor formats such as memes and images. Regulators also struggle to distinguish between misinformation and satire or fair comment. While critics of social media companies have rightly pointed out that they often hide behind the claim that they are merely a platform for other people’s views, the volume of information posted and the fact that these online spaces have become in effect the public square make it difficult simply to apply traditional regulatory models.
This is not to say that existing efforts by content moderators and independent ethics committees, such as Meta’s Oversight Board, at blocking, fact-checking and contextualising misinformation on social media have not had a positive impact. It is just not clear how far these approaches can be taken and whether more government pressure on such companies, as seen in the ‘duties’ demanded of providers in the Online Safety Bill, will achieve the desired results. In short, the preventive approach to information policy has few if any advantages over the mitigative approach, despite the unique features of the present day that offer potential targets to this legislative strategy.
Conclusion
The word ‘unprecedented’ has been widely used by commentators during the Covid-19 outbreak, but the historical record is testament to the strong continuities in both the causes and content of misinformation about pandemics. The remarkable extent of these continuities points to the perennial, underlying influence of human psychology in shaping rumours and conspiracy narratives.
Since a problem arising from the nature of human cognition is ultimately ineradicable, policymakers should continue to pursue primarily mitigative rather than preventive solutions. These should include policies informed by behavioural insights, such as targeted media campaigns and identity-based communications (both of which have long been used in commercial advertising), in order to correct the existing overreliance on debunking and censorship. However, policymakers should be sensitive to the tone and reception of their interventions, so as not to deepen the distrust in the authorities that gives rise to conspiracies in the first place. They could, for example, refrain from using the disease metaphor of misinformation.
While there are features unique to our time, such as social media, that exacerbate the underlying problems, policies aimed at the outright prevention of online misinformation enjoy few advantages. The efforts of bots and human content moderators are a step in the right direction, but social media companies cannot be relied upon to adequately regulate their own products, nor is further legislative pressure on such companies likely to prove a panacea. Thus, policymakers should ultimately be focused on harnessing the resources of government to mitigate the widespread presence of mis- and disinformation in our fragile public sphere. With the recently approved malaria vaccine set to become one of the most important medical breakthroughs in decades, we have more than the defeat of Covid-19 in the balance.