An “Infodemic” of Disinformation Spreading “Faster and More Easily Than This Virus”
COVID-19 and the spread of disinformation
As the COVID-19 pandemic spread, Tedros Adhanom Ghebreyesus, Director-General of the World Health Organization (WHO), warned of an “infodemic” spreading faster than the virus itself. This infodemic has reached Canada: a recent survey found that almost half of Canadians believe at least one common falsehood about COVID-19, including the claims that the United States military was responsible for the emergence of COVID-19 in Wuhan, China, and that 5G technology is being used to spread the virus. Both of these conspiracy theories have been spread by accounts linked to foreign state actors.
The infodemic includes both unintentional misinformation and intentional disinformation. “Disinformation” refers to the deliberate spread of falsehoods in an effort to divide and disrupt; “misinformation”, by contrast, is false but lacks malicious intent. The aim of disinformation is to sow doubt and confusion. By reading, for example, that the virus is both a hoax and so deadly that Lithuania has run out of doctors, individuals begin to feel that the truth is impossible to know. The ultimate goal is to erode trust in democratic institutions, including national healthcare systems.
Philip Howard, head of the Oxford Internet Institute, notes that the pandemic has led to an increase in both misinformation and disinformation generated by foreign state actors, including Russia and China. Much of this disinformation is spread online by “bots”: automated computer programs that pose as regular users on social media. In a Carnegie Mellon University study of 200 million COVID-19-related tweets, 45 per cent were deemed to have been sent by bots. This immense volume of content is estimated to have reached nearly a billion users on social media sites such as Instagram, Twitter, Facebook, and Reddit.
The dangers of disinformation cannot be ignored. There are fears that it will fuel xenophobia and even become a major health risk, since accurate information is critical in a pandemic. Individuals who believe falsehoods such as the assertion that the virus can be “cured with saline”, or that garlic protects against it, risk not only their own lives but the lives of others. Critics argue that disinformation has turned public health guidance, including the recommendation to wear masks, into a culture-war issue. Falsehoods surrounding COVID-19 have been linked to assaults, deaths, and arson.
Social media companies have struggled with disinformation for years, most infamously during the 2016 US presidential election. While companies have pledged to invest more in content moderation, whether algorithmic or human, the sheer volume of content and the time required to review it limit its effectiveness. Algorithms vary in accuracy, and the daily onslaught of often hateful and violent content can be traumatic for human moderators. In many cases, companies continue to rely on users themselves to flag content for review.
While current disinformation campaigns have built on previous political ones, some argue that the lack of partisanship or ideology surrounding COVID-19 has made platforms more responsive. The WHO has worked with social media platforms to either remove falsehoods or make them more difficult to find. This coordination actually began before the pandemic, with the partnership working to tackle misinformation on health topics such as vaccines. Google created an “SOS Alert” in multiple languages so that the first information a user receives comes from the WHO. YouTube removed videos linking COVID-19 to 5G technology and placed a banner redirecting users to WHO videos about COVID-19. Snapchat partnered with the WHO to create filters displaying facts on prevention. Twitter removed tweets criticizing quarantine measures, including by a head of state, and had previously banned political advertising. Facebook directed users searching for COVID-19-related topics to the WHO website and provided the WHO with unlimited free advertising space. The platform also launched a campaign to help its users identify fake news. WhatsApp, a messaging platform owned by Facebook, collaborated with the WHO to create a system for answering questions from the public about the virus.
The infodemic has deepened the debate over social media platforms’ responsibility for disinformation. Twitter has experimented with other methods to fight it, particularly around manipulated videos. Facebook, in contrast, has argued that its role is to champion free speech and act as a neutral forum. While Facebook has taken some steps to counter disinformation, critics have directed most of their disapproval at the company, arguing that it has taken only small, reactive steps and needs to play a more proactive role in developing policies to prevent disinformation on its platform. The recent #StopHateForProfit campaign has brought together corporations, universities, and charities to halt their advertising on Facebook for the month of July in protest of the hatred and violence present on the platform.
At the user level, individuals are encouraged to remain critical, verify information, ask other users to remove misinformation, and report posts to platforms. Users can also seek out education and training resources that will better equip them to identify disinformation, such as the free CrashCourse Media Literacy series on YouTube.
The infodemic will not end with the development of a COVID-19 vaccine. As social media companies continue to debate the responsibilities they hold towards their users, we all bear some responsibility to inoculate ourselves against the spread of disinformation.
Featured image by Jason Howie, licensed under CC BY 2.0.
Edited by Selene Coiffard-D’Amico