User:SPoore (WMF)/sandbox

Community Health Metrics Kit consultation
The Community Health Metrics Kit is a new project to measure more aspects of our communities. If you are interested in metrics, statistics, and measurement of editing and contributing, please join us on Meta to discuss how and what the new project should measure! Please share this with anyone else you think may be interested in this work. This message is also available in other languages. For the Community health initiative, Cheers, SPoore (WMF), Trust & Safety, Community health initiative (talk) 18:53, 17 September 2018 (UTC)

What this discussion is about
This is the start of a discussion on ways to improve tools and workflows for users to report harassment.

Wikimedia communities have acknowledged community health as a top concern, and disputes between contributors that escalate into harassment are a particular worry. According to a survey conducted in early 2017, 73% of Wikipedia volunteers surveyed reported that they had been harassed or bullied on Wikipedia in the previous 12 months. In surveys and community discussions, users have expressed that responses to behavioral issues are frequently inadequate. While our projects have developed good processes for dealing with issues like vandalism and edit-warring, the existing systems for reporting, managing, and evaluating user incidents do not appear to be effective at addressing harassment and other serious user conduct issues. Issues that reach administrators are often resolved adequately, but many incidents that could benefit from intervention never get administrator attention. Specifically, of 300 users surveyed in 2017, 84% requested better reporting tools, 77% requested better noticeboards, and 75% requested better policies.

Process and timeline
The Wikimedia Foundation Support and Safety and Anti-Harassment Tools teams are committed to providing resources for community-supported change, be that technical development, further research, or coordination support. The first stage of the discussion is to identify shortcomings in current systems for reporting harassment, in order to determine potential improvements that the Anti-Harassment Tools team might build in the later part of 2018.

Identifying problems
In preparing for this discussion, we've been gathering research—data and analysis—to aid in decision making. Previous community discussions, community surveys, and research and reports have been collected in the sidebar for your review. '''Please let us know about other discussions or research so we can add it to the collections.'''

From community discussions and community surveys, the Anti-Harassment Tools team and the Support and Safety team have identified shortcomings in the current methods for reporting harassment. From these we have created a preliminary list of problems that could potentially be addressed with changes to tools and workflows. We invite you to add other problems you have noticed for community discussion.

The discussion about this list of problems will help us prioritize which problems our software developers could work on later in 2018.

Problem list
A working draft of the problem list, to be expanded during this discussion.

Noticeboards do not facilitate effective case management

 * Basic case management tools are lacking. For example, there is no acknowledgement that an incident is under review, and updates to a case must be made manually.
 * There is no method to triage or prioritize cases.
 * Lack of structure often makes reports difficult to understand or analyze because of incomplete information and evidence.
 * The open nature of noticeboards encourages uninvolved parties to contribute tendentious or contentious comments.

Users underreport behavioral issues

 * Fifty-three percent of English Wikipedia AN/I survey respondents avoided making a report on Administrators’ Noticeboard/Incidents because they were afraid it would not be handled appropriately.

Policy violations are under-enforced

 * Lack of detail in policy elevates the importance of administrator discretion, and discourages administrators from taking action against abusive behavior for fear of losing standing in the community.
 * Many wikis have a high bar for enforcement of behavioral policy. Administrators are frequently hesitant to take action except in straightforward or blatant cases.