The surge of misinformation (frequently referred to simply as ‘the infodemic’) during the Covid-19 pandemic is a source of concern to many. As opposed to its ‘twins’, mal- and disinformation, misinformation is frequently characterized as a factually false, yet non-malicious (that is, non-intentional and non-manipulative) type of information distortion. The lack of malicious intent does not, in principle, limit its negative and dangerous impact on the general public, which in times of need looks for news and sources of information and can be tragically misled.
Effectively fighting misinformation requires several preconditions to be met: first, recognizing misinformation as a phenomenon in its own right (not all ‘guardians’ of the information society distinguish it); second, acknowledging its range and multi-layered occurrence. With these in mind, it is easier to see the real nature and scale of the threat, which leads to a better understanding of misinformation and indicates that counter-misinformation actions ought to be taken on multiple levels and by all actors.
During the webinar we asked our panellists about their perceptions of the infodemic, their unique experiences and, finally, their ideas on how to integrate different stakeholders in the fight against misinformation. Because the experts come from and represent different sectors and countries (the UK and Poland), we could shed some light on cross-sectoral and cross-border collaboration opportunities. As one of the main goals of the meeting was to clarify the challenges and key points in the discussion of non-malicious information distortion, we encountered a surprisingly concerted approach, which clearly shows that there is a common perception of the problem and a will to cooperate.
Misinformation through the lens of academia
Based on the findings of a special report by S. Brennen, titled “Types, Sources, and Claims of COVID-19 Misinformation”, the misinformation environment presents a complex picture summed up by a few observations: a) a lack of a systematic approach to quantifiable research on misinformation, which forces academia to rely on third-party fact-checkers’ reports; b) a ‘Pareto’ problem of misinformation sources (only around 20% of misinformation is top-down, but this segment gets most of the traction); c) specific content based on misinterpretations and misleading data, sometimes using a plethora of medical or technical terms; and d) claims that still need to be verified. Together these elements frame Covid-19-specific misinformation in cyberspace.
It is noteworthy that attempts to curb misinformation can backfire. It has been observed that exposing people who are sceptical of legitimate news outlets and social media platforms to fake news makes them spread it even more. This is a practical effect of psychological confirmation bias, which argues for employing the psychology of misinformation in the process of countering it. The quality of the work of various fact-checking organisations is usually seen as high by academia, which benefits from their studies. Academic research focuses mainly on the big social media platforms, which is justified but also calls for expansion to other, less popular means of electronic communication that are gaining more and more attention from misinformation actors. The recent changes to social media policies and the progressive shift toward better content policing are needed and well received.
Misinformation from the perspective of fact-checkers
The surge of misinformation is a particular challenge to information societies, as it undermines trust in legitimate and democratic institutions. Its scale forced many fact-checkers to pivot fully from other research topics to Covid-19, which highlights capacity issues and calls for building up the capacity and resilience of myth-debunking organisations. The number of urgent issues and threats is overwhelming, and fact-checkers are forced to prioritise which topics they will cover and which will have to be dropped. This organisational obstacle negatively affects other actors but cannot be avoided without further development of the sector. It is especially visible given that the infodemic brought up a different, more organic type of conversation, which contrasts with the usual disinformative content. It is nonetheless still able to cause genuine harm, which calls for a serious and holistic approach to tackling the threat effectively. A holistic approach requires concerted, efficient and trust-based collaboration between fact-checkers and specialised institutions, especially when the misinformation in question is highly specialised. Given that Covid-19-related fake news frequently stems from misunderstood or misinterpreted elements of actual research, debunking it may require subject-specific knowledge that cannot come from the fact-checking organisation itself.
The synergy and capacity-building effects have to be achieved with full awareness that regional specificity applies even here. False narratives such as ‘strong leaders handle crises better’ or ‘immigrants are responsible for spreading Covid-19’ are spotted locally and can be particularly harmful when they target a vulnerable group. The protection of people spills over into other fields, as health-related misinformation tends to attach itself to other topics, with 5G and vaccines as notable examples. In these cases, much as when highly politicized events such as elections take place, fake news gains traction where new technology or complex mechanisms play a role, allowing conspiracy theories to spread and fuel misinformation. Echoing the academic perspective, practitioners stress the enormous traction of fake news spread by celebrities, who receive disproportionately more attention than other misinformation actors.
Social media platforms play numerous roles in the information environment, among them providing space for participation in the news workflow but also, increasingly, content policing. Practitioners highlight the complex clash of values between freedom of speech and media quality. It is of the highest importance for social media giants to find the right balance between the two, especially as there is an observable change in what people publish across social media. The fight over media quality sometimes symbolically takes the form of an information war which must not be lost.
Misinformation and public policy
Academia, social media platforms and fact-checking organisations are witnessing growing engagement from the public sector. The acknowledgement and integration of anti-disinformation and anti-misinformation measures into public policies is a positive sign. Capacity development and readiness materialise in collaboration with domestic and international partners, the formation of specialised disinformation units within public administration, and other activities designed to promote resilience and raise awareness. Among notable actions, the UK government has launched online campaigns such as “Stop the spread” and “Don’t feed the beast”, while the Polish government launched its “Fake hunter” campaign, engaging with volunteers online.
Other good practices and tailored activities include governments sharing messages, conducting training and engaging with social media platforms. International cooperation takes both bilateral and multilateral forms, where multilateral means first and foremost collaboration within the EU and NATO. From the perspective of public policy, the role played by independent fact-checkers is crucial, as it contributes to overall media literacy and helps in understanding what is happening in cyberspace. It must be stressed that general resilience is closely connected to trust in democratic institutions.
The partner of Road to CYBERSEC is the British Embassy Warsaw.