The Fight Against Misinformation on Wikipedia

The amount of fake news is growing, and disinformation is a threat to democracy. The Wikimedia Foundation, too, is fighting false content on Wikipedia.

Ensuring the quality and neutrality of online information is of central importance for societal matters such as elections and the formation of political opinion. We spoke to Costanza Sciubba Caniglia, Head of Anti-Disinformation Strategy at the Wikimedia Foundation, the non-profit organization that runs Wikipedia.

What makes Wikipedia different from other online information platforms?

Costanza Sciubba Caniglia: Wikipedia is particularly important when it comes to accurate and neutral information: even though Wikipedia does not disseminate breaking news, the encyclopaedia is visited more than 15 billion times every month to check facts and search for accurate information. It also serves as a source of much of the information used by other platforms and digital products such as Google, Alexa and Siri. However, Wikipedia takes a different approach to ensuring neutral, fact-based information than many might expect and than is common on other digital platforms. Wikipedia relies on the human factor.

What exactly do you mean by the human factor?

Costanza Sciubba Caniglia: Success is an obligation. The Wikimedia Foundation and its 265,000 volunteers worldwide are aware of this and rise to that challenge every day. For example, since June 1, 2024, the main article on the European elections in the German Wikipedia has been accessed more than 256,000 times and updated by over 62 volunteer editors. In addition, 57 volunteer “observers” have decided to monitor the changes and check them for compliance with the project’s guidelines.

But how can people filter out misinformation on the internet?

Costanza Sciubba Caniglia: Over the last two decades, volunteers have developed a number of procedures to ensure that the information on the website is reliable. Administrators, for example, are Wikipedia volunteers with extended rights. They regularly investigate negative behavior – such as vandalism or undisclosed paid editing – on the site and take action against it.

In other words, a kind of “crowd consensus” for high-quality, neutral knowledge?

Costanza Sciubba Caniglia: The volunteers compile information on important topics in accordance with the editorial principles and guidelines of the encyclopaedia. Every piece of information on the platform must cite reliable sources, such as newspaper articles and experts, that the community has classified as reliable through discussion. This means that hundreds of thousands of people from all walks of life effectively have to reach a consensus based on facts. Volunteers discuss, debate and often disagree until a consensus can be reached on what content should be added to Wikipedia. Each edit can be viewed in the article’s history, and each discussion point can be read on the article’s discussion page.

And what role does the Wikimedia Foundation play in the search for misinformation?

Costanza Sciubba Caniglia: While the vast majority of content and behavior issues on Wikimedia projects are resolved by volunteers, the Wikimedia Foundation has a Trust and Safety team that is alerted and provides support in some extreme cases. At the request of volunteers, the team investigates major problems with disinformation and other systematic forms of disruptive behavior that may occur on the site. In a few cases it may also take action itself, for example when the behavior of contributors to Wikimedia projects endangers the safety of other editors, violates community guidelines, spreads false information on the platform, or prevents the volunteer community from functioning properly.

There is also a task force. When does it intervene?

Costanza Sciubba Caniglia: The Foundation’s Disinformation Response Taskforce works with established Wikimedia volunteers and Wikimedia affiliates to help identify potential information attacks on Wikipedia during times when the risk of disinformation is particularly high, such as elections. Taskforce members coordinate via a dedicated communication channel and use reporting mechanisms to quickly identify and address any election-related disinformation attempts on Wikipedia.

Costanza Sciubba Caniglia leads the anti-disinformation strategy at the Wikimedia Foundation.