Wikipedia:Recent changes patrol/2009 RC patrol initiative

Several editors have expressed concerns about the negative impact the RC patrol process can have on new and casual editors. The present state of affairs raises several related concerns:


 * 1) New editors do not feel welcomed.  There are anecdotal reports that insightful, capable people who are genuinely trying to contribute and understand Wikipedia's ways of doing things are walking away frustrated.
 * 2) Casual editors making good-faith but ill-advised edits reportedly feel mistreated.  This particularly includes genuine attempts at articles on topics that are at best marginally notable.
 * 3) Editors making good-faith efforts to remove unsourced derogatory information from their biographies do not get the on-wiki support they need, leading to ill will and increasing the OTRS caseload.
 * 4) The use of blocks to combat routine vandalism imposes an unnecessarily high administrative burden on the project.

History
The development of two classes of automated tools revolutionized the way that Wikipedia managed the monitoring of recent changes:


 * 1) Tools that follow the RC feed on IRC and analyze and categorize edits in real time.  VandalFighter was among the first of these.
 * 2) Tools that automate template placement on user pages, such as Twinkle.

These tools have had enormous positive impact in two key areas:


 * 1) It is now rare for blatant vandalism to be completely missed.
 * 2) It is now nearly impossible for a single person to engage in sufficient vandalism to overcome the Recent Changes patrol without the use of a bot.

Policy has grown organically and has come to be codified at RC_patrol.

Proposed Changes

 * 1) Sharply reduce the use of user warning templates
 * 2) Update the language of remaining templates to be more friendly
 * 3) Provide more specific guidance for blocking users based on activity detected while RC patrolling
 * 4) Encourage more use of open, free-form, situation specific communication when dealing with good faith but misinformed edits.

Reduce the use of user warning templates
There are, at present, around 120 user warning templates.

Templates for routine vandalism
Some users believe that there is no reason to respond at all to routine vandalism (e.g. blanking a major page, adding expletives to articles, or adding a few words about a friend or pet to a major article). The case for just reverting the changes is based on the fact that the perpetrator knows exactly what they are doing and is merely trying to see what response they get. The fullest form of "deny recognition" is to revert the vandalism without taking any additional action.

Historically, the reason for warning vandals was to build the case for a block. Since blocks are rarely necessary with current RC patrolling and rollback tools (see below), there is rarely any reason to build such a case.

Templates for content problems
Where a user is introducing problematic content that violates NPOV or fails to meet inclusion guidelines, the goal should be to educate the user about the specifics of the situation. A template may be appropriate for initial communication (when the talk page is blank or contains only a welcome message). If the behavior continues, escalating warnings only escalate the situation. Instead, an individualized note left by a thoughtful Wikipedian explaining the specifics of the problem is more likely to be read and understood.

Updates to the remaining templates
Templates remain appropriate for other conduct violations, e.g. 3RR, egregious personal attacks, and sock puppetry, among others. However, some updates may be appropriate.

User warning templates (UWTs) on the whole portray an attitude of speaking rather than listening. They are written in terms of policy and consequences.

UWTs should be written to point out policy and then foster communications, and should in most cases direct conversation to the talk page of the Wikipedian who placed them (both to ensure a quick reply and to deal with the common case where a new user alternates between using an IP and one or more logged-in usernames).

Blocking policy
There are three possible outcomes from a block. In decreasing order of occurrence, they are:
 * 1) The affected user may attempt to circumvent the block by waiting it out or changing user names or IPs.
 * 2) The affected user may contest the block by contacting another administrator or mailing list.
 * 3) The affected user may accept the block, either by discontinuing the unwanted behavior or leaving Wikipedia entirely.

For routine vandalism that can be easily identified and reverted, the effort involved in managing a block -- reblocking the user as needed and dealing with unblock requests (regardless of their merit) -- is generally greater than that of simply reverting edits until the vandal loses interest.

This is especially true when one considers the inadvertent collateral damage of blocking people who are trying to remove copyright violations or attack pages that affect them personally.

Proposed policy adjustments

 * 1) In general, accounts should not be blocked for simple vandalism or disruption unless the vandalism occurs at a rate that is difficult to revert (e.g. more than 10 edits a day).
 * 2) Blocks for other content violations (e.g. COI, linkspam, copyvios) should be preceded by a genuine, personalized attempt to engage the user in discussion that addresses the specifics of the problem.

Encouraging use of free-form responses
Policy pages should encourage individually written responses as the preferred method of dealing with other users for matters other than courtesy notifications (e.g. notices that an article or image is being considered for deletion or that an article was tagged as a speedy). Once a matter has been addressed using a templated notice, Wikipedians should switch to more substantial dialogue.

If possible, it would be helpful culturally if edit-counting tools did not include templated warnings in their counts, or counted them separately.

Tool effects
Creative tool developers should be encouraged to adapt tools to this more personalized approach. For example, tool users could be given the opportunity to edit template text for each use rather than just subst:ing it. The history of users whose edits are being reviewed could be more prominently displayed. Tools could ask for confirmation before templating a talk page that already contains several templates.