Wikipedia:Wikipedia Signpost/2022-01-30/Op-Ed


 * The contributor of this op-ed is a member of WikiProject Climate change and has been a Wikipedian for over eight years.

On November 19 last year, BBC News published an investigation into climate denial on non-English Wikipedias. It showed that several non-English Wikipedias are rife with climate myths. BBC journalist and climate disinformation specialist Marco Silva described the denial as 'alive'. Classical climate denial—denying warming occurs, or denying humans are the primary cause—has been on the wane for a while. Within the English Wikipedia, climate denial has become exceedingly rare. The last time the climate change discretionary sanctions were used was in 2019, and that was against somebody exaggerating the dangers of climate change, as if the reality isn't scary enough!

Is it true that it's still alive and kicking within Wikipedia? My hypothesis is that most of these non-English articles are the ruins of a period in which climate denial flourished. Climate denial is dead, but still rotting.

A cross-wiki project
I started a cross-wiki review on Meta two weeks after the BBC article was published. The goal was to assess the breadth of the denial, and to see it removed from all language editions. We advertised it on seven climate change WikiProjects to ensure it wasn't just an enwiki project. While the most active editors were still from enwiki, Swedish and Czech Wikipedians also contributed. Depending on the size of the community, we either deleted the misinformation ourselves, or tried to find local editors willing to take on the task. On many articles, it was relatively easy to identify denial, as Google Translate has come a long way: for most languages, the translated text was understandable.

The climate myths we found varied significantly. Many articles were an outdated translation of the English Wikipedia. Up to 2008, the English version contained a set of primary sources that supported alternative explanations of climate change. While back then this may have been a significant minority opinion, these views now fall squarely in the fringe or pseudoscience baskets. Displaying this outdated research plays directly into the hands of climate deniers. Their main goal is not making people deny that humans cause climate change. Rather, it is to plant seeds of doubt, as doubt paralyses action.

A smaller subset of languages had more blatant issues. For instance, I twice encountered the myth that global warming has stopped. Some articles contained attacks on prominent scientists, for instance calling Naomi Oreskes, who was the first to research the magnitude of the scientific consensus, incompetent. Accusations of fraud were also common. The Chinese version gave a platform to somebody espousing an antisemitic attack on Al Gore, albeit under a heading which translates as 'conspiracy theories'.

So far, the response to our efforts has been mostly positive. In some communities, talk page comments triggered a full update and rewrite (Catalan and Slovak). In others, the comments were addressed one by one (Hindi), or promises were made to update the article (Korean). For less active languages, we removed the denial ourselves, and were occasionally reverted (Belarusian). We found climate myths in over a third of the languages. The 52 articles in these languages are read about 7,000 times a day.

It's frustrating that many of these language communities do have the capacity to maintain these pages, as shown by how well they responded to our requests; the misinformation simply lingered until someone flagged it. Should these projects become routine? Or are more systematic approaches to misinformation needed?

A blueprint for other projects?
Climate change is not the only topic that has been plagued by misinformation. The current pandemic is a prime example. As with climate denial, misinformation on this topic costs lives. Misinformation about abortion puts women around the world in danger. Conspiracy theories surrounding elections are a grave danger for democracy. Can we start similar disinformation monitoring in these areas as well?

There is potential, but these projects may prove more challenging. The misinformation may be spread across more articles, and may have been added more recently, so that its removal will meet with resistance. Nonetheless, the following steps provide a blueprint:
 * 1) Fact-check the English version. Good Article and Featured Article reviews are great to sniff out remaining inaccuracies.
 * 2) Write understandably. We tend to overestimate the educational level of our readers. Fact-checked information is useless if only half of our readers can understand it, and text that is hard to read is also hard to translate well.
 * 3) Set up a meta page. Make a selection of languages, based on viewership. Find editors with technical abilities to help out.
 * 4) Find editors of major language groups to coordinate with, for instance via User Groups.
 * 5) Approach interested local editors, translators, and local admins to help out. While misinformation can be deleted with machine translations, keeping it out will require local help.

Or is structural disinformation monitoring needed?
Repeating this for more categories of dangerous misinformation would be beneficial. But it is a time-consuming effort, and in the meantime, our readers will keep consuming heaps of misinformation. It is likely that not all languages will have the capacity to monitor these articles. Should we not be more aggressive? Here are two ideas for structural safeguards against misinformation.

Give articles a "best before" date
Many topics are 'under development' in the real world. There are new scientific discoveries and review papers on a daily basis. Sensitive political situations change, as investigative journalism unearths scandals. For many articles, these timescales are easy to estimate. For climate change, there is typically little value in research older than 15 years, and even six-year-old research can be outdated.

What if we give each article a best-before date? If I write an article about sea level rise with a median source year of 2018, it's best read before 2024. The maximum acceptable median source age could be stored on Wikidata. If, for a given language version, the median source date is too old, a warning template could be displayed, perhaps on the article talk page. We could restrict this initially to medical articles, and a few other topics where misinformation is truly dangerous. If an article gets even further out of date, language communities might decide to automatically archive or delete it.
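To make the proposal concrete, the rule above could be sketched as a few lines of code. This is a hypothetical illustration only: no such bot or Wikidata property exists, and the shelf-life values are assumed for the example (six years for climate change, matching the 2018 → 2024 case).

```python
from statistics import median

# Assumed shelf lives per topic, in years (illustrative, not real policy).
SHELF_LIFE_YEARS = {"climate change": 6, "medicine": 5}
DEFAULT_SHELF_LIFE = 10

def best_before(source_years, topic="climate change"):
    """Median source year plus the assumed shelf life for the topic."""
    shelf_life = SHELF_LIFE_YEARS.get(topic, DEFAULT_SHELF_LIFE)
    return int(median(source_years)) + shelf_life

def needs_warning(source_years, current_year, topic="climate change"):
    """True if the article is past its best-before date."""
    return current_year > best_before(source_years, topic)

# Example from the text: median source year 2018 is best read before 2024.
years = [2015, 2018, 2020]
print(best_before(years))          # 2024
print(needs_warning(years, 2025))  # True: a warning template would show
```

A bot could run this check periodically and place the warning template on the talk page of any article whose sources have aged past the threshold.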

Archiving articles in addition to deleting them
Paper encyclopedias have to make a decision for each edition: does this article get binned, reprinted, or updated? As such, they have a natural decision point at which to consider dropping outdated material. With over 6.4 million articles on the English Wikipedia alone, it's completely infeasible to check every article and make that decision. What if we create a middle ground between deleting articles and showing them to our readers? In a (semi-)automatic manner, we could archive many articles that are likely outdated, for instance those three years past their 'best before' date. The archive would be accessible to editors, and even readers, with an extra click. No admin intervention needed. A clean slate to write on can be an exhilarating experience. Working in the Dutch Wikipedia, I know how great it feels to start an article on an important topic. Making space in this way might not only help us avoid spreading misinformation, but might also reinvigorate Wikipedia.