User:Blevintron/Bot

This page is about my bot to combat link rot.

Purpose
To improve Wikipedia by identifying and marking broken links, and to combat link rot by alerting the right user when a link has gone bad.

Haven't we solved this problem already?
No, there are tons of broken links on Wikipedia. Even if Wikipedia has other means of identifying and repairing broken links, they are not effective enough. Preliminary trials of this bot have discovered 10,789 broken links across 38,431 articles. That's about 1 broken link per 4 articles. Those numbers exclude links which are already marked as broken.

Why is this different?
The bot tries to alert the right user when a link goes bad. The right user is the user who originally introduced that broken link to that article. This will be more effective than other approaches because the right user has shown interest in the article (by contributing in the past) and has shown some expertise (by adding a link).

The case studies section shows some examples of those messages.

Will that work? Won't that just be annoying?
There are two concerns relating to this approach:
 * 1) Will the actions of this bot be effective?  Specifically, will the bot's actions encourage human editors to repair broken links?
 * 2) Are the solicitation messages necessary, or will they only annoy the human editors?

First, I should mention that the bot supports an opt-out feature (via templates on articles or user pages), and that it has a strict limit on the number of messages it sends to one user in one day. The opt-out feature is advertised at the end of every solicitation message.

I hypothesize that the bot will be effective and will not be annoying. To answer the question concretely, we need an experiment. During normal operation, the bot collects information about the articles it has modified. Those results are tabulated and discussed at the experiment page. After a sufficient amount of experimental data has been collected, we can evaluate the effectiveness/annoyance of the bot and, if necessary, pursue a different approach.

Why not post broken links to a link rot group instead?
The problem with a volunteer link rot group is that it has (relatively) few volunteers to handle a huge number of broken links. It's a bottleneck: for a link rot group to work, every volunteer must tackle a large number of links. Further, these volunteers don't necessarily know much about the articles they try to fix. It can be very difficult for a volunteer to replace a link if that volunteer never knew what that link pointed to.

My plan spreads this work over a very large number of contributors, so each contributor has less work to do. Since these contributors added the link, we can assume that (1) they know about the subject matter, (2) they know what the link used to point to, and (3) they care enough about the article to contribute.

Why not contact the most recent, most active contributors to that article instead?
Not every article has recent or active contributors. Articles that have recent or active contributors probably have fewer broken links. The power of this bot is that it can work to fix broken links in the long tail of Wikipedia articles.

Community Discussions
There is a discussion of this bot on the village pump idea lab.

Technical Details

 * The bot is written in Ruby, using the standard libraries. The Wikipedia interface is hand-rolled. It runs on Linux; it could probably be ported to Mac or Win32 with a little effort.
 * The source code is released as open source.
 * I will host this bot myself.

Tasks
See also the sections about throttling.

TASK 0: Checking Links
This is a read-only task.

The bot maintains a pool of questionable links: those which are neither clearly working nor clearly broken. Links within that pool are checked at regular intervals until we are confident that the link is truly broken.
 * The bot carefully avoids links which have already been marked as broken, e.g.
 * The bot carefully avoids links which already have an archive url specified, e.g.

I have a working prototype of this task.
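The status classification in this task can be sketched in Ruby, the bot's implementation language. The method names, symbols, and timeout values below are illustrative rather than the bot's actual code; network-level failures are mapped onto the same error classes listed under the high-confidence criteria later on this page.

```ruby
require 'net/http'
require 'uri'

# Classify one HTTP status code as :working, :broken, or :questionable.
def classify_status(code)
  case code
  when 200..399      then :working
  when 404, 500..599 then :broken
  else                    :questionable # e.g. 403: inconclusive, retry later
  end
end

# One probe of a link (sketch). Network failures count as broken;
# anything unexpected keeps the link in the questionable pool.
def probe(url)
  uri  = URI.parse(url)
  path = uri.request_uri
  path = '/' if path.nil? || path.empty?
  res = Net::HTTP.start(uri.host, uri.port,
                        use_ssl: uri.scheme == 'https',
                        open_timeout: 10, read_timeout: 20) do |http|
    http.head(path)
  end
  classify_status(res.code.to_i)
rescue SocketError, Errno::ECONNREFUSED, Net::OpenTimeout
  :broken        # DNS lookup error, connection refused, connect timeout
rescue StandardError
  :questionable  # any other failure: keep the link in the pool
end
```

A :questionable result keeps the link in the pool for another trial; only repeated :broken results lead to marking.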

TASK 1: Marking Links
The bot edits a Wikipedia article, adding the template to broken links, citations, etc.


 * The bot carefully avoids links which have already been marked as broken, e.g.
 * The bot carefully avoids links which already have an archive url specified, e.g.

I have a working prototype of this task.

TASK 2: Soliciting Help for Link Repair from Human Contributors
The bot scans the revision history of the article to find which user first added that link to the article. It sends a polite message to that user's User_talk: page, alerting them that the link has gone bad, suggesting possible archive matches, and encouraging them to fix the broken link. Examples of these messages can be found in the case studies section.

I have a working prototype of this task.
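A minimal sketch of the attribution step, in Ruby: walk the article's revisions oldest-to-newest and stop at the first one whose wikitext contains the URL. The `Revision` struct and `first_adder` name are illustrative; the real bot pages revision content through the MediaWiki API.

```ruby
Revision = Struct.new(:user, :timestamp, :text)

# Return the user who first introduced the given URL, or nil if no
# revision contains it. Assumes revisions are ordered oldest first.
def first_adder(revisions, url)
  rev = revisions.find { |r| r.text.include?(url) }
  rev && rev.user
end
```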

TASK 3: Collect Statistics to Evaluate Effectiveness, Participation and Annoyance
This task will only modify the bot's user space and the operator's user space.

The bot will scan its own edits to determine if any of them were reverted. If so, it notifies the bot operator. The bot operator can use this to improve the bot. Similarly, it will analyze revision history of the article and of the users it has contacted. It will use this information to measure how effective it is at eliminating broken links; see the effectiveness experiment.

I have a working prototype of this task.
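The revert check can be sketched as an exact-match test, assuming the bot keeps the article text from before and after its edit; real revert detection is usually fuzzier than this, and the method name is illustrative.

```ruby
# A bot edit counts as reverted when some later revision restores the
# article text to exactly what it was before the bot's edit.
def reverted?(text_before_bot, bot_text, later_texts)
  later_texts.any? { |t| t == text_before_bot && t != bot_text }
end
```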

TASK 4: Uploading its source code, status, etc. to Wikipedia
The bot will upload its source code to its user page.

The bot will upload its status (running, software version, throttling parameters, etc) to a user page.

I have a working prototype of this task.

Good Communication
Here are some examples of its communications in response to a few articles with broken links. I don't mean to pick on these articles or these authors; these were randomly selected by the bot.

The important aspects of these communications are:
 * They are polite;
 * They don't accuse people of posting broken links, but instead emphasize that links go bad over time;
 * They clearly advertise the opt-out feature;
 * When available, they suggest archived copies; and
 * They link to the relevant Wikipedia policies on link rot and bots.

Case Study: Johnny Unitas Stadium
Here's an example of what it would write in response to broken links in the page Johnny Unitas Stadium.

Edit Summary
Marked broken link http://www.towsontigers.com/johnnyu/index.asp; Marked broken link http://www.towsontigers.com/facilities/footballhouse.asp. Report problems to User_talk:Blevintron.

Message to User_talk:Thx2005, new section: Broken links in article 'Johnny Unitas Stadium'
Thank you for your contributions to the article 'Johnny Unitas Stadium'. Sadly, some of the links that you added have died. The article needs your help to repair link rot.

This link has died after you added it in February 2007:
 * http://www.towsontigers.com/facilities/footballhouse.asp – I tried to load this link on 11 March, 13 March and 15 March but it never worked. I'm not sure, but this copy from the archives might be a good replacement.

This link has died after you added it in July 2007:
 * http://www.towsontigers.com/johnnyu/index.asp – I tried to load this link on 11 March, 13 March and 15 March but it never worked. I'm not sure, but this copy from the archives might be a good replacement.

I'm just a bot, so I don't really know how to fix the problem. Could you please take a look? Thanks!

PS- if you don't want BrokenLinkBot to contact you, simply add to  your user page or  your user talk page.

~

Diffs
These are the changes to the article. Line wraps are introduced to make the diff readable, but are not inserted into the article.

 in 2002. In fact, Unitas threw his last public pass at the re-opening of the
 facility (as Towson Stadium) just a few days before his
-death. His widow,
-Sandy, felt it appropriate to honor him by having the stadium named for him
-instead, with fund-raising in his name taking the place of the money that a
-corporate naming would have supplied.
+death. His widow, Sandy,
+felt it appropriate to honor him by having the stadium named for him instead,
+with fund-raising in his name taking the place of the money that a corporate
+naming would have supplied.

 ==External links==
 *[https://admin.xosn.com/ViewArticle.dbml?DB_OEM_ID=21300&ATCLID=1511688 Towson Athletics - Johnny Unitas Stadium]
 *[http://www.towsontigers.com/facilities/footballhouse.asp Towson Athletics -
-Field House]
+Field House]

Case Study: Mohammed Ali Hammadi
Here's an example of what it would write in response to broken links in the page Mohammed Ali Hammadi.

Edit Summary
Marked broken link http://service.spiegel.de/cache/international/0,1518,391177,00.html; Marked broken link http://www.fbi.gov/pressrel/pressrel06/mostwantedterrorists022406.htm; Marked broken link http://www.fbi.gov/page2/feb07/rewards021207.htm. Report problems to User_talk:Blevintron.

Message to User_talk:DBaba, new section: Broken link in article 'Mohammed Ali Hammadi'
Thank you for your contributions to the article 'Mohammed Ali Hammadi'. Sadly, a link that you added has died. The article needs your help to repair link rot.

This link has died after you added it in September 2006:
 * http://service.spiegel.de/cache/international/0,1518,391177,00.html – I tried to load this link on 17 March, 19 March and 21 March but it never worked.

I'm just a bot, so I don't really know how to fix the problem. Could you please take a look? Thanks!

PS- if you don't want BrokenLinkBot to contact you, simply add to  your user page or  your user talk page.

~

Message to User_talk:LDH, new section: Broken link in article 'Mohammed Ali Hammadi'
Thank you for your contributions to the article 'Mohammed Ali Hammadi'. Sadly, a link that you added has died. The article needs your help to repair link rot.

This link has died after you added it in February 2007:
 * http://www.fbi.gov/page2/feb07/rewards021207.htm – I tried to load this link on 17 March, 19 March and 21 March but it never worked. I'm not sure, but this copy from the archives might be a good replacement.

I'm just a bot, so I don't really know how to fix the problem. Could you please take a look? Thanks!

PS- if you don't want BrokenLinkBot to contact you, simply add to  your user page or  your user talk page.

~

Message to User_talk:Steven Russell, new section: Broken link in article 'Mohammed Ali Hammadi'
Thank you for your contributions to the article 'Mohammed Ali Hammadi'. Sadly, a link that you added has died. The article needs your help to repair link rot.

This link has died after you added it in June 2006:
 * http://www.fbi.gov/pressrel/pressrel06/mostwantedterrorists022406.htm – I tried to load this link on 17 March, 19 March and 21 March but it never worked. I'm not sure, but this copy from the archives might be a good replacement.

I'm just a bot, so I don't really know how to fix the problem. Could you please take a look? Thanks!

PS- if you don't want BrokenLinkBot to contact you, simply add to  your user page or  your user talk page.

~

Diffs
These are the changes to the article. Line wraps are introduced to make the diff readable, but are not inserted into the article.

 There has been speculation that his parole was granted as part of a covert
 prisoner swap, in exchange for the release of Susanne Osthoff. Taken hostage
 in Iraq a month prior, Osthoff was released the week of Hammadi's parole.
+Freed Osthoff Not Heading Home Yet]

 On February 24, 2006, he joined his accomplices on the FBI's Most Wanted
 Terrorists list, under the name Mohammed
 Ali Hamadei.[http://www.fbi.gov/pressrel/pressrel06/mostwantedterrorists022406.htm
 FBI Updates Most Wanted Terrorists and Seeking Information – War on Terrorism
-Lists], FBI national Press Release, February 24, 2006
+Lists] ,
+FBI national Press Release, February 24, 2006

 On February 12, 2007, the FBI
 announced[http://www.fbi.gov/page2/feb07/rewards021207.htm FBI 2007
-announcement of reward offer] a new $5 million reward for information
-leading to the recapture of Hammadi.
+announcement of reward offer] a new $5 million reward for
+information leading to the recapture of Hammadi.

Case Study: Sean Kennard
Here's an example of what it would write in response to broken links in the page Sean Kennard.

Edit Summary
Marked broken link http://www.simc.jp/2004/index_p_e.html; Marked broken link http://www.chopin.org/ip.asp?op=2005; Marked broken link http://www.tomaszmagierski.com/gallery_SK.html. Report problems to User_talk:Blevintron.

Message to User_talk:Brandamber, new section: Broken links in article 'Sean Kennard'
Thank you for your contributions to the article 'Sean Kennard'. Sadly, some of the links that you added have died. The article needs your help to repair link rot.

These links have died after you added them in March 2009:
 * http://www.simc.jp/2004/index_p_e.html – I tried to load this link on 10 March, 12 March and 14 March but it never worked.
 * http://www.chopin.org/ip.asp?op=2005 – I tried to load this link on 10 March, 12 March and 14 March but it never worked.
 * http://www.tomaszmagierski.com/gallery_SK.html – I tried to load this link on 10 March, 12 March and 14 March but it never worked.

I'm just a bot, so I don't really know how to fix the problem. Could you please take a look? Thanks!

PS- if you don't want BrokenLinkBot to contact you, simply add to  your user page or  your user talk page.

~

Diffs
These are the changes to the article. Line wraps are introduced to make the diff readable, but are not inserted into the article.

outside of Asia and Europe to win a prize in the competition. He went on from Curtis to study with Enrique Graf at the

After beginning studies with Graf, Kennard won various other prizes in piano competitions, including the National Chopin Competition, Iowa Piano

the documentary "Pianists," relating the story of the American pianists as they travelled to Poland to participate in the 2005 International Frederick Chopin Piano Competition.

Exclusions Compliant
This bot respects the standard bot-exclusion templates and all their variants before editing articles or contacting users via User_talk: pages.

When checking links, this bot respects robots.txt on third party sites.
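A deliberately minimal robots.txt check, for illustration only: it honours just `User-agent: *` groups and plain `Disallow:` prefixes. The real robots.txt grammar also has agent-specific groups, `Allow:` rules, and wildcards, so this is a sketch, not the bot's actual parser.

```ruby
# Return true if the path is permitted for generic crawlers, given the
# raw text of a site's robots.txt. Only "User-agent: *" groups and
# prefix-style "Disallow:" rules are handled here.
def allowed_by_robots?(robots_txt, path)
  active = false # inside a "User-agent: *" group?
  robots_txt.each_line do |line|
    line = line.sub(/#.*/, '').strip # drop comments
    case line
    when /\AUser-agent:\s*(.+)\z/i
      active = ($1.strip == '*')
    when /\ADisallow:\s*(.*)\z/i
      rule = $1.strip
      return false if active && !rule.empty? && path.start_with?(rule)
    end
  end
  true
end
```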

Declaring a Link 'Broken' with High Confidence
Here, 'broken' means that all trials of the link consistently yielded: a DNS lookup error, an HTTP Connect timeout, a connection refused error, an HTTP 404 error, or an HTTP 5xx error.
 * The bot carefully avoids links which have already been marked as broken, e.g.
 * The bot carefully avoids links which already have an archive URL specified, e.g.

A link must be checked several times (MIN_LINK_FAILURES) with sufficient time between checks (LINK_TRIAL_PERIOD) to ensure that a link is broken, not simply a temporary network or server failure. For instance, it will only declare a link broken if it failed to load 3 times over a period of 5 days.
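The MIN_LINK_FAILURES / LINK_TRIAL_PERIOD rule could be sketched as below. The constant values shown are illustrative defaults inferred from the text, and times are plain seconds.

```ruby
MIN_LINK_FAILURES = 3               # minimum number of failed trials
LINK_TRIAL_PERIOD = 2 * 24 * 3600   # minimum spacing between trials (seconds)

# A link is declared broken only after enough failures, spaced far
# enough apart to rule out a transient network or server outage.
def confirmed_broken?(failure_times)
  sorted = failure_times.sort
  return false if sorted.size < MIN_LINK_FAILURES
  sorted.each_cons(2).all? { |a, b| b - a >= LINK_TRIAL_PERIOD }
end
```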

These checks are performed from a good vantage point on the Internet: a prominent university in the USA. National Internet censorship should not have a significant impact on reputable sources.

Emergency Shutdown
The bot supports an emergency shutdown page on Wikipedia. To shut it down, edit that page and add the word 'shutdown'.
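The check itself is trivial; here is a sketch, with the page fetch omitted and the method name illustrative:

```ruby
# Halt if anyone has added the word 'shutdown' to the emergency page.
# The caller fetches the page's wikitext before each batch of edits.
def shutdown_requested?(page_wikitext)
  !!(page_wikitext =~ /\bshutdown\b/i)
end
```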

Throttling & Maxlag
The bot is carefully throttled for two reasons: first, to minimize load on Wikipedia servers, and second, to make it easier for human reviewers to keep up.

Some of the more important limits are listed here:
 * the rate at which the bot scrapes Wikipedia during high-traffic (HIGH_TRAFFIC_SCRAPE_PERIOD) and low-traffic (LOW_TRAFFIC_SCRAPE_PERIOD) times.
 * the rate at which the bot edits articles during high-traffic (HIGH_TRAFFIC_EDIT_PERIOD) and low-traffic (LOW_TRAFFIC_EDIT_PERIOD) times.
 * the maximum number of links that will be corrected in a single edit to a single article (MAX_LINKS_PER_EDIT). This simplifies a reviewer's job, since diffs are smaller.
 * the minimum time between two edits to the same article (MIN_EDIT_PERIOD_PER_ARTICLE). This prevents high-frequency edit wars against humans or other bots.
 * the maximum number of edits to perform in a single calendar day (MAX_EDITS_PER_DAY).
 * the maximum number of User_talk: messages to send to a single user on a single calendar day (MAX_SOLICITATIONS_PER_USER_PER_DAY).
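As an illustration, the per-user solicitation cap could be tracked with a small counter keyed by user and calendar day. The class and its methods are a sketch, not the bot's actual code.

```ruby
# Enforce MAX_SOLICITATIONS_PER_USER_PER_DAY: count the messages sent
# to each user on each calendar day, and refuse once the limit is hit.
class SolicitationThrottle
  def initialize(limit)
    @limit = limit
    @sent  = Hash.new(0) # [user, date] => messages sent
  end

  # May we message this user today?
  def allow?(user, date)
    @sent[[user, date]] < @limit
  end

  # Record that one message was sent.
  def record(user, date)
    @sent[[user, date]] += 1
  end
end
```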

The bot uses the maxlag parameter.