User:Shereth/Automation



Clever Wikipedia editors have written a number of valuable, semi-automated counter-vandalism tools to aid in the never-ending fight against the vandalism prevalent on the encyclopedia. Some of these tools, such as Twinkle and the unfortunately named Huggle, allow editors to make extremely rapid reverts and even rollbacks to quickly deal with vandalism.

These tools, however, do have one serious drawback - they significantly reduce the amount of human interaction in the editorial and anti-vandalism process. In many cases, a single click performs all of the pertinent actions, such as reverting the identified edits and placing an appropriate user warning template on the offending editor's talk page. This is not, in and of itself, problematic; however, problems do arise when this lack of manual oversight causes good-faith edits - and sometimes even constructive edits - to be reverted as vandalism and the editors warned.

How can the use of automated tools cause an issue?
The primary intended use of these tools is to fight against vandalism, spam and unconstructive edits. Indeed, there are practically legions of editors who devote much, if not all, of their time to fighting this scourge on Wikipedia, and these tools allow them to do so with increased efficiency and ease. But even the experienced vandal-fighter and contributor can fall into a simple trap of laziness. There are some edits that, at a casual glance, appear to be unhelpful - deletions of large blocks of material from an article, substantive edits from an IP address, and the like. These things are not, however, always vandalism, and when an automated tool is used it is all too easy to overlook the fact that these are indeed valid contributions.

How much of a problem is this? Enough that Twinkle's main page carries the following stern warning for its users:

Huggle carries a similar template on its main page, further elaborating on the issue:

But what is the big deal?
Surely, the accidental reversion of a good edit is not such a bad thing - it is not difficult for editors to simply undo the automated edit, and life is good. But what if the edit is never caught? Editors will often overlook entries in a page history that were marked as "using Huggle" or the like, on the assumption that this was a valid counter-vandalism edit. In that case, a good edit is lost and perhaps never recovered.

Worse still is the damage it can do to individual editors. When an established editor gets an automated message telling them that their contributions are unhelpful, or worse, vandalism, it can result in frustration and anger. An oft-cited essay advises us not to template the regulars to avoid this kind of scenario. Automated tools, on the other hand, do not distinguish between a brand-new (and clearly inexperienced) editor and one who is well established. Generally a regular will be able to brush off the offense and no real harm is done, but the potential for negative feelings is there.

The damage done to new (but well-meaning) editors in these situations can be even worse. Imagine a brand-new editor, one who has just registered (or is perhaps editing from an IP), making a thoughtful and useful contribution to an article, only to see it vanish in the blink of an eye. To make matters worse, when they click on the shiny "new messages" link that appears at the top of the page and find a stern admonition against vandalizing the project, how are they to feel? How many editors might be driven away from the project - permanently - because in their nascent wiki-editing, a misjudged click on an automated tool made them feel unwanted?

Click with care
This is not to say that automated tools like Huggle and Twinkle are bad. On the contrary, the help they provide to the project is, in all likelihood, far greater than the harm caused by the careless use of such tools. That said, if all of the avid Hugglers and Twinklers out there would simply take a moment to ponder what they are doing - slow down, just a little bit - a lot of unnecessary confusion and grief could probably be avoided. Sure, there are no speed limits on the information superhighway, and Wikipedia is no different. But taking a little care with how quickly you click that button can make a big difference.