User:Risker/Content moderation

Methods of content moderation
There are multiple types of content moderation carried out on Wikipedia, described below:

Editing

 * The most common type of content moderation, available to any editor (whether or not confirmed or registered) to modify the content of a page.
 * Preferred by some editors when the added content is at least partially appropriate.
 * If added content is entirely inappropriate (e.g., vandalism, test edits, poorly sourced factual information such as sports scores), the edit is normally reverted instead. See below.

Revert (undo/rollback) of edit

 * See Help:Reverting
 * The second most common type of content moderation, used by any editor (whether or not confirmed or registered) to return a page to a previous version.
 * Used for vandalism and certain types of content removal.
 * "Undo" is considered the gentlest form of reversion, and allows the reverting editor to include an edit summary explaining the reason for reversion. This is the most common method of reverting good-faith edits that are not appropriate to the page on which they have been made.
 * Various tools such as Twinkle add tabs or options to rollback an edit, usually with an automatic edit summary or no edit summary. This is primarily used for vandalism.
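Both forms of reversion described above are also exposed through the MediaWiki Action API. As a hedged sketch (the token value and page/revision identifiers below are placeholders, and a real request additionally requires an authenticated session), the two operations map onto the documented "edit" (with `undo`) and "rollback" modules:

```python
# Sketch: building request parameters for the two revert styles via the
# MediaWiki Action API. No network call is made here; these dicts would be
# POSTed to /w/api.php by an authenticated client. Token values are placeholders.

def build_undo_request(title, revid, summary, token):
    """Parameters for an "undo"-style revert of a single revision,
    with an explanatory edit summary (encouraged for good-faith edits)."""
    return {
        "action": "edit",
        "title": title,
        "undo": revid,        # the revision to undo
        "summary": summary,   # reason for the reversion
        "token": token,
        "format": "json",
    }

def build_rollback_request(title, user, token):
    """Parameters for rollback, which reverts all consecutive edits by the
    page's most recent editor; primarily used for vandalism."""
    return {
        "action": "rollback",
        "title": title,
        "user": user,         # must be the last editor of the page
        "token": token,       # rollback uses its own token type
        "format": "json",
    }
```

The difference in the sketch mirrors the difference in practice: `undo` targets one revision and carries a custom summary, while `rollback` sweeps away a run of edits by a single user with an automatic summary.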

Revision deletion

 * See Revision deletion
 * Deletion of one or a series of edits (or log entries) to remove specific content/material from public view. Sometimes called "hiding" or "changing visibility".
 * Can be carried out only by administrators or persons with Oversight permission
 * Once content has been revision deleted, the content of the selected edits cannot be seen by anyone without administrator/oversight permissions
 * This can create challenges when reviewing the history of an article edit-by-edit, so its use is restricted. See criteria.
 * Revision deletion is reversible by administrators/oversighters
 * Revision deletion can also be used to redact username or action summary information in logs. Its use for this purpose is less common and is more likely to be subject to review or questions.
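The software side of this process is the Action API's "revisiondelete" module, which requires administrator permissions. A minimal sketch (revision IDs, reason, and token are placeholders; `hide` selects which fields are removed from public view):

```python
# Sketch: request parameters for the MediaWiki "revisiondelete" module.
# No network call is made; an authenticated admin client would POST this.

def build_revdel_request(revision_ids, hide_fields, reason, token):
    """hide_fields may include "content", "comment" (edit summary),
    and "user" (username), matching the fields described above."""
    return {
        "action": "revisiondelete",
        "type": "revision",  # log entries can be targeted with type=logging
        "ids": "|".join(str(i) for i in revision_ids),
        "hide": "|".join(hide_fields),
        "reason": reason,
        "token": token,      # placeholder CSRF token
        "format": "json",
    }
```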

Suppression (Oversight)

 * See Oversight
 * Deletion of one or a series of edits or log entries to remove specific content/material from the view of both the public and administrators. Called "suppression" and "oversight" interchangeably.
 * Can be carried out only by users with oversight permissions.
 * Oversighters are appointed by the Arbitration Committee (Arbcom) and are (a) current Arbcom members, (b) former Arbcom members, or (c) selected community members.
 * Once content is oversighted, the content of the selected edits cannot be seen by anyone (including administrators) except for other oversighters.
 * Use is regulated and restricted. Oversighters are expected to review each other's activities, and act as a self-regulating group.
 * Used mainly for material that is privacy-violating and/or potentially harmful (to a person or their reputation, to the project, to a person or place off-wiki, etc.), or that could make an individual excessively vulnerable (e.g., self-identification by minors, certain mental health issues).
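In software terms, suppression reuses the "revisiondelete" module described above, with a `suppress` flag that only users holding oversight permissions may set. A hedged sketch (placeholder IDs and token, as before):

```python
# Sketch: a suppression request via the MediaWiki "revisiondelete" module.
# Setting suppress="yes" makes the hidden fields invisible to administrators
# as well as the public; only oversighters can set or see it.

def build_suppression_request(revision_ids, hide_fields, reason, token):
    return {
        "action": "revisiondelete",
        "type": "revision",
        "ids": "|".join(str(i) for i in revision_ids),
        "hide": "|".join(hide_fields),   # e.g. ["content", "user"]
        "suppress": "yes",               # restricts visibility to oversighters
        "reason": reason,
        "token": token,                  # placeholder CSRF token
        "format": "json",
    }
```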

Deletion

 * See Deletion
 * Removal (deletion) of the entire article or page.
 * Can only be carried out by administrators. Once deleted, the contents of the page can still be viewed by administrators/oversighters.
 * Deletion is carried out using established criteria, see proposed deletion, speedy deletion and the various "XFD" pages (articles, files, miscellany, redirects, etc.) for applicable criteria.
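For completeness, whole-page deletion corresponds to the Action API's "delete" module (administrators only). A minimal sketch with placeholder values; the reason string would in practice cite one of the criteria mentioned above:

```python
# Sketch: request parameters for the MediaWiki "delete" module.
# An authenticated administrator client would POST this to /w/api.php.

def build_delete_request(title, reason, token):
    return {
        "action": "delete",
        "title": title,
        "reason": reason,  # typically cites a speedy/PROD/XFD criterion
        "token": token,    # placeholder CSRF token
        "format": "json",
    }
```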

Suppression Deletion

 * No good documentation exists for this process, although it is briefly discussed at Oversight
 * Can only be carried out by oversighters. Once suppress-deleted, the contents of the page can only be viewed by oversighters.
 * Oversighters delete the page while also checking the "suppress data" box.
 * Oversighters may occasionally undelete a previously deleted page and then immediately redelete it using the suppression function, as an alternative to suppressing each deleted edit individually. This occurs most frequently when the page has a significant (25+ revisions) history.
 * Used when every revision (or many discrete revisions) of the page qualifies for suppression.
 * Least common method of content moderation.

Page protection

 * See Protection policy
 * Protection is applied by administrators, but editors may request various levels of protection at requests for page protection
 * For descriptions and the comparison chart of protection levels, see the overview.

Moderating potentially harmful content
The primary methods of identifying problematic content are:
 * Recent changes (RC) patrol - carried out by all types of editors based on an ongoing log of changes to all pages.
   * Focus is primarily on "article space" - in particular, articles - which is where potentially harmful content is most likely to be widely disseminated; these are the pages that show up in Google searches.
   * RC patrollers do not normally review edits to noticeboards or user talk pages, but may review edits by "other editors" to user pages, or to templates or pages in the Wikipedia space that are rarely edited (such as policy pages), particularly if the edit is made by a new/unregistered user.
 * New Page Patrol (NPP) - carried out by administrators and editors with a special "new page patroller" permission.
   * The main focus is on newly created articles, which are reviewed for quality; this review will also flag potentially harmful content.
   * A few editors (mostly administrators) do periodic sweeps of other new pages, particularly those in "user space" or sometimes "draft space", specifically to identify problematic/inappropriate/potentially harmful content.
   * The most common example is minors writing detailed pages about themselves, their families, their friends, etc. These are normally flagged for oversighter review/attention.
 * User contribution logs - reviewed by all types of editors on a case-by-case basis when an account is identified as having made a problematic edit. The focus is particularly on new/unregistered accounts, although "experienced" accounts that have been dormant for a long time and suddenly start making problem edits will also be examined. Such accounts are often brought to the attention of an administrator, who may well block the account.
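The RC patrol feed described above can also be read programmatically through the Action API's "query" module with `list=recentchanges`; no authentication is needed for reading. A sketch, with filters matching the patrol focus described above (namespace 0 is article space):

```python
# Sketch: request parameters for pulling a recent-changes feed from the
# MediaWiki Action API. This is a read-only query, so no token is needed.

def build_rc_query(limit=50):
    return {
        "action": "query",
        "list": "recentchanges",
        "rcnamespace": 0,                 # article space, the patrol focus
        "rctype": "edit|new",             # edits and page creations
        "rcprop": "title|ids|user|comment|timestamp",
        "rclimit": limit,
        "format": "json",
    }
```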

In addition:
 * If a non-admin editor identifies material that should be revision-deleted or suppressed, they can (a) revert and privately notify an admin/oversighter or (b) email Oversight (an OTRS queue where oversighters receive requests for suppression and/or revision deletion).
 * If an admin identifies an edit that should be suppressed, they will normally revision-delete it and then (a) privately notify an oversighter or (b) email Oversight

The chart below discusses various potentially harmful content types and the manner in which each is addressed on English Wikipedia. The preferred method is bolded. Examples are provided where appropriate (in quotation marks), and are all fictional.