User:Ksuong2001/Report

As we explore a variety of online communities, working closely with the platform Wikipedia has not only expanded my understanding of digital collaboration but has also proven to be a valuable learning experience in knowledge sharing, sourcing, and community-driven content creation. Online communities offer their members opportunities for information sharing and learning, for companionship and social support, and for entertainment. Over the last six weeks, our class has collectively shaped our understanding of what constitutes a successful online community through discussions, readings, lectures, and hands-on engagement with Wikipedia. Comparing Wikipedia to other online communities such as Reddit, Yelp, and Twitch has provided a clearer picture of the strengths and weaknesses that distinguish a well-established platform. Although the Wikimedia Foundation and the Wikipedia community have created a successful culture for online communities, there is still much room for improvement. In this paper, I will analyze the Wikipedia community and the Wikimedia Foundation and provide practical, actionable advice on how to create a better online experience. By addressing key areas such as implementing automated filters for vandalism and spam detection, enhancing the content review process, and leveraging artificial intelligence technologies, I hope to further enhance the platform's impact and effectiveness in shaping online knowledge sharing and collaboration.

A thriving community such as Wikipedia has established a range of behaviors and rules that most members follow and accept in support of its mission. This collective ethos has created the normative expectation that Wikipedia's writers adopt a neutral point of view when writing articles. When users respect this mission and these norms, they are more motivated to contribute and interact with other people in the community, thereby reinforcing Wikipedia's collaborative spirit and commitment to the impartial sharing of knowledge. But because Wikipedia embraces a familiar product with low barriers to contribution and offers low attribution and low social ownership of content, it is susceptible to external users engaging in attacks, insults, vandalism, and trolling within the community. This may result in inaccurate information or misinformation, spamming, personal attacks, and inappropriate posts, which can threaten the integrity and reliability of the platform. To limit these threats, it is important for Wikipedia to reconsider the innovativeness of its goal and product. I believe that Wikipedia should implement automated filters that can detect and prevent common forms of vandalism, spam, and trolling. These filters could analyze edits and flag suspicious activity for review by experienced editors. I noticed that editing an article does not require any experience or even verification, which allows any individual to contribute whatever they please without oversight or accountability. And although other users may detect this and counteract it, relying on community policing alone is not sufficient. This highlights the need for a thorough review process to ensure that content maintains Wikipedia's standards of accuracy, reliability, and neutrality. Access to editing should be accompanied by a content review queue, where edits flagged as potentially problematic are examined by experienced editors before being published. This would ensure that contentious edits are reviewed carefully before they go live, reducing the likelihood of vandalism or spam slipping through unnoticed. Moreover, implementing community moderation tools, such as one-click reverts or user-warning templates for reviewing recent changes, can streamline the moderation process and encourage active participation from experienced users. I also believe that leveraging artificial intelligence can be very beneficial in analyzing and identifying patterns of potential vandalism or spam. Despite its relatively straightforward editing process, the platform can be seen as outdated in comparison to modern web standards. By automating the detection process, AI can swiftly flag suspicious edits for review, allowing human moderators to focus their efforts on more nuanced or complex issues within the community. If the Wikipedia community and the Wikimedia Foundation address the platform's outdated aspects, they can enhance user engagement, attract new contributors, and ensure that Wikipedia remains a relevant and valuable resource.
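To make this proposal more concrete, the following is a minimal, hypothetical sketch in Python of the kind of rule-based filter described above. It is not Wikipedia's actual filtering system; the heuristics, thresholds, and the Edit structure are illustrative assumptions only. The sketch scores an incoming edit against a few simple signals and routes anything suspicious into a review queue for experienced editors.

```python
# Hypothetical sketch of a rule-based edit filter (not Wikipedia's real system):
# score an incoming edit against simple heuristics and flag it for human review
# when the score crosses a threshold.
import re
from dataclasses import dataclass


@dataclass
class Edit:
    user_is_registered: bool
    account_age_days: int
    old_text: str
    new_text: str


SUSPICIOUS_PATTERNS = [
    r"(?i)\b(buy now|click here|free money)\b",  # spam phrasing
    r"(?i)\b(idiot|stupid)\b",                   # personal attacks
    r"(.)\1{9,}",                                # keyboard mashing, e.g. "aaaaaaaaaa"
]


def score_edit(edit: Edit) -> int:
    """Return a suspicion score; higher means more likely vandalism or spam."""
    score = 0
    added = edit.new_text.replace(edit.old_text, "")  # rough approximation of added text
    for pattern in SUSPICIOUS_PATTERNS:
        if re.search(pattern, added):
            score += 2
    # Removing most of the existing content is a common vandalism signature.
    if len(edit.new_text) < 0.2 * len(edit.old_text):
        score += 3
    # Brand-new or unregistered accounts receive slightly more scrutiny.
    if not edit.user_is_registered or edit.account_age_days < 1:
        score += 1
    return score


def review_decision(edit: Edit, threshold: int = 3) -> str:
    """Route the edit: publish directly, or hold it in a review queue."""
    return "flag_for_review" if score_edit(edit) >= threshold else "publish"


if __name__ == "__main__":
    edit = Edit(
        user_is_registered=False,
        account_age_days=0,
        old_text="Paris is the capital of France.",
        new_text="Paris is the capital of France. CLICK HERE for free money!!!",
    )
    print(review_decision(edit))  # -> flag_for_review
```

In practice, simple heuristics like these would serve only as a first pass; machine-learning models and human reviewers would still handle the more nuanced cases.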

Through my engagement with Wikipedia, I had the opportunity to fully immerse myself in the process of editing an article, which has changed my attitude and perspective toward contributing to online communities. I found that WikiEdu was extremely helpful in guiding me through the steps of creating a well-written article. Without taking these precautionary measures and methods, I believe that my article would have lacked clarity and structure. There would have been significant errors in tone and balance as well, which could have compromised the integrity and impartiality of the information presented. Encountering these potential challenges reinforced my belief in thorough reviews supported by automated filters, content moderation tools, and the use of AI. Most Wikipedia users are not required to have any formal training before making edits. Because of its inclusive environment, Wikipedia sets the norm that anyone with the desire to create is embraced, regardless of whether their motivations are intrinsic or extrinsic, and thus it even includes those who may not have the best intentions. For these reasons, I believe that when bad-faith activity is detected, Wikipedia should penalize users by flagging and suspending accounts that have a history of acting in bad faith. This way, the Wikipedia community can distinguish legitimate contributors from those who seek to misuse the platform. Considering the advice presented here can provide valuable insight into how Wikipedia can maintain its credibility, integrity, and continuous engagement as an online community.
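Along the same lines, the short sketch below illustrates how accounts with a history of bad-faith edits might be flagged and eventually suspended. The class name, thresholds, and actions are assumptions made for illustration and do not describe an existing Wikipedia mechanism.

```python
# Hypothetical sketch of the account-flagging idea above: track how often a
# user's edits are reverted as bad faith, and escalate from a warning to a
# temporary suspension once simple thresholds are crossed.
from collections import defaultdict


class BadFaithTracker:
    def __init__(self, warn_after: int = 2, suspend_after: int = 5):
        self.warn_after = warn_after
        self.suspend_after = suspend_after
        self.reverts = defaultdict(int)  # username -> count of reverted bad-faith edits

    def record_revert(self, username: str) -> str:
        """Record one reverted bad-faith edit and return the action to take."""
        self.reverts[username] += 1
        if self.reverts[username] >= self.suspend_after:
            return "suspend_account"
        if self.reverts[username] >= self.warn_after:
            return "post_warning"
        return "no_action"


tracker = BadFaithTracker()
for _ in range(5):
    action = tracker.record_revert("example_user")
print(action)  # -> suspend_account
```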

As I reflect on my experience with Wikipedia, I wish I had had the opportunity to work with more experienced users to better understand how to communicate with one another. Although I did get the chance to interact with one other user who helped me make valuable changes, I never had a clear understanding of which advice to take and which to ignore. Flagging accounts with a history of bad faith could limit the spread of inaccurate information, spamming, and trolling. Nonetheless, I found this experience to be very effective in learning how to work within online communities. Overall, I believe that what sets Wikipedia apart is its commitment to the democratization of knowledge and the collective effort of its volunteer community. Wikipedia's unique characteristics necessitate a nuanced approach that considers socio-cultural dynamics in the digital world, resulting in a thriving online community.