User:NotQuiteTheSame/Ethical issues apparent on Wikipedia

In the year 2001, Apple introduced the first iPod, I was born, and a small website named Wikipedia was created. This website – as we all know – would soon grow to become the largest encyclopaedia the world has ever seen. And taking on that role would come with many challenges and ethical issues.

Introduction
We all use Wikipedia. From students and professors to judges and physicians. Even large organisations such as Google and Twitter use it for basic fact-checking. Because let’s face it: no other platform aggregates such large amounts of useful information as well as Wikipedia. Nonetheless, this online encyclopaedia raises ethical considerations for all of its stakeholders, from its owners to its readers and contributors. One issue that is particularly relevant to Wikipedia is vandalism and self-serving misinformation on the platform.

The open nature of editing Wikipedia articles
Wikipedia is built on the idea that everyone can contribute to the sharing of human knowledge. But such an open platform naturally comes with the risk of abuse. Its contributors often point to outright vandalism, but many so-called edit wars are more subtle than that. A well-documented example concerns the status of Taiwan: since services such as Google and Siri use Wikipedia to answer questions, the answer to "What is Taiwan?" has gone back and forth between "a state in East Asia" and "a province in the People's Republic of China".

Who is affected and how are they affected by Wikipedia edits?
All this vandalism and self-serving misinformation has ethical implications not only for the Wikimedia Foundation, but also for its readers and contributors. It is worth noting that Wikipedia sits at the intersection of owned media and earned media: it is run by the Wikimedia Foundation, but its content is almost entirely created by independent users.

Let's start with the Wikimedia Foundation, the non-profit that runs Wikipedia. Its goal is to "make it easier for everyone to share what they know". Abuse of the platform is a consequence of how easy it is for users to edit Wikipedia articles, but restricting that ease of editing goes directly against the Foundation's stated mission. Here, Wikipedia adopts a middle-ground stance: protection policies are in place for controversial articles, so that new or anonymous users cannot change their contents without approval from a moderator.
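To make the idea concrete, here is a minimal sketch of how such a gate might work. This is an illustration only, not MediaWiki's actual code or API; the edit-count threshold and function names are assumptions made for the example.

```python
# Illustrative model of a protection policy: on a protected article,
# edits from new or anonymous users are held for review instead of
# being applied immediately.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Editor:
    username: Optional[str]  # None represents an anonymous editor
    edit_count: int


def handle_edit(editor: Editor, article_protected: bool) -> str:
    """Decide whether an edit is applied immediately or queued for review.

    The threshold of 10 prior edits is an arbitrary assumption
    for illustration, not Wikipedia's real rule.
    """
    is_new_or_anonymous = editor.username is None or editor.edit_count < 10
    if article_protected and is_new_or_anonymous:
        return "queued for review"  # a moderator must approve it first
    return "applied immediately"


# An anonymous editor on a protected article is held for review:
print(handle_edit(Editor(username=None, edit_count=0), article_protected=True))
# → queued for review
```

The design choice mirrors the essay's point: the gate applies only to protected articles and inexperienced accounts, so ordinary editing stays as open as before.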

Next up, readers of Wikipedia are affected by misinformation and vandalism because they go on to make real-world decisions based on what they read. This means that readers have to be vigilant about the information they use, and also that many more people’s lives can be indirectly influenced by a Wikipedia edit that someone made simply as a joke.

Finally, Wikipedia contributors are affected because they volunteer hours of their time to improve the platform, only for their work to be undone either by vandalism or even by restrictive policies from the Wikimedia Foundation itself.

Specific recommendation
So, what can be done? We can approach the ethical issues of free and open Wikipedia editing from a variety of perspectives. Two frameworks that may be considered are teleology and deontology.

The teleological approach
Recommendations stemming from the teleological approach may start with the problem itself: malicious editors can edit articles just as easily as those who mean well. To solve this, the Wikimedia Foundation could make all article edits subject to approval by its own moderators, except for basic changes such as fixing punctuation or spelling and adding links between articles. This way, everyone would still have the opportunity to contribute to the encyclopaedia, spreading misinformation would become much more difficult, and vandalism would be virtually eliminated. From a deontological point of view, however, this solution is less than desirable: it may be argued that moderating everything goes against the ideal of an open platform!
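The proposed rule above can be sketched in a few lines. Again, this is a hypothetical illustration; the edit-type labels are assumptions, not a real Wikipedia feature.

```python
# Sketch of the proposed moderation rule: only "basic" edit types skip
# the approval queue; everything else waits for a moderator.

BASIC_EDIT_TYPES = {"punctuation", "spelling", "add-link"}


def requires_moderator_approval(edit_type: str) -> bool:
    """Return True when an edit must wait for a moderator's approval."""
    return edit_type not in BASIC_EDIT_TYPES


print(requires_moderator_approval("spelling"))        # → False
print(requires_moderator_approval("content-change"))  # → True
```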

The deontological approach
Though the end result may be less certain, another solution is to educate people and hope their contributions will make the platform more reliable and less prone to abuse in the long run. Education would have a double effect: editors would become better at providing reliable sources and explanations, while readers would become better at recognizing dubious claims. Schools and universities should teach students how to use Wikipedia actively. There could be a mandatory course in which students contribute to articles on the platform, so that through participation they learn how easy, and at the same time how difficult, it can be to manipulate information there.

Conclusion
We all use Wikipedia because it’s convenient and good enough for the vast majority of day-to-day fact-finding. Instead of dismissing the platform outright as untrustworthy, steps can be taken to make it better: Wikipedia could implement stricter moderation, while educational institutions could teach students how to improve Wikipedia.