Wikipedia:Bots/Requests for approval/CarnivorousBot


 * The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Withdrawn by operator.

CarnivorousBot
Operator:

Time filed: 05:32, Sunday December 29, 2013 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python using mwclient

Source code available: Source code not available.

Function overview: Creates redirects requested by autoconfirmed and confirmed users.

Links to relevant discussions (where appropriate): Request located here.

Edit period(s): Daily

Estimated number of pages affected: Sorry, I don't know how many redirects are created per day. I would estimate around 150-250 daily.

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): No

Function details: This bot creates redirects that would be tedious to make manually. Any autoconfirmed (or confirmed) user can request a redirect by editing a page in the bot's userspace (specifically, User:CarnivorousBot/Requests). The syntax required for requests is located here. After a user enters a request in the appropriate syntax, the bot creates the redirect. There is very little room for error, because the bot only performs edits that humans have requested. I don't see why this bot needs to be exclusion compliant; if for some reason it does, I'll make it compliant. Edit: I have prepared a list of tedious redirects that actually need to be made here. When this gets approved for a trial, I'll do a couple from there.
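The request-handling step described above can be sketched roughly as follows. This is a minimal sketch, not the bot's actual code (which was never published): the one-line "redirect, target" format and the "#"-prefixed comment lines are assumptions based on the delimiter discussion in this BRFA, and only the parsing step is shown.

```python
# Minimal sketch of the request-parsing step. The actual source code was
# not published; the "redirect, target" line format is an assumption
# based on the BRFA's comma-delimiter discussion.

def parse_request(line):
    """Split one request line into (redirect_title, target_title).

    Returns None for blank lines, comment lines, or malformed requests.
    Note: splitting on the first comma breaks for redirect titles that
    themselves contain commas, a limitation raised in the discussion.
    """
    line = line.strip()
    if not line or line.startswith("#"):
        return None
    parts = [p.strip() for p in line.split(",", 1)]
    if len(parts) != 2 or not all(parts):
        return None
    return parts[0], parts[1]

def redirect_wikitext(target_title):
    """Wikitext body that turns a page into a redirect."""
    return "#REDIRECT [[%s]]" % target_title
```

With mwclient, the bot would then fetch the text of User:CarnivorousBot/Requests, parse each line, and save the generated redirect wikitext to the requested title.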

Discussion
What about Template messages/Redirect pages? Your delimiting is sub-optimal; titles may have commas in them. I suggest you make links to the redirect and target pages. Josh Parris 00:54, 30 December 2013 (UTC)


 * Thanks for your response. I see the problem with using a comma to separate the redirect and target pages. However, I do not believe that using links as a delimiter is the best solution. Since this bot is going to be used to create many tedious redirects, I think it would be less tedious for users to type just one character than to make the two pages into links. I've found a list of characters that aren't allowed in page titles on MediaWiki-based websites. Any of these characters could be used to delimit the titles without interference. Also, upon reading that list, I have learned that "!" is allowed in MediaWiki titles. I leave it to everyone to decide which character(s) they believe are most convenient to replace the current comma and exclamation point. — Carnivorous Bunny (talk) 03:54, 30 December 2013 (UTC)
 * I could also change the comment line to a place where one inserts a template message. — Carnivorous Bunny (talk) 15:46, 30 December 2013 (UTC)
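The delimiter scheme suggested above could be sketched as follows: pick a character that can never appear in a MediaWiki page title (the pipe "|" is used here purely for illustration; no particular character was settled on in this discussion) and reject titles containing any forbidden character.

```python
# Sketch of delimiting requests with "|", a character MediaWiki never
# allows inside page titles, so it cannot collide with a title the way
# a comma can. The choice of "|" is illustrative only.

ILLEGAL_TITLE_CHARS = set('#<>[]{}|')  # forbidden in MediaWiki page titles

def parse_request(line):
    """Split 'redirect | target' into a (redirect, target) tuple, or None."""
    parts = [p.strip() for p in line.split("|")]
    if len(parts) != 2 or not all(parts):
        return None
    # Reject any title containing a character MediaWiki disallows.
    for title in parts:
        if any(c in ILLEGAL_TITLE_CHARS for c in title):
            return None
    return parts[0], parts[1]
```

Because the delimiter itself is illegal in titles, a line splitting into anything other than exactly two fields is malformed by construction.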
 * This bot appears to have edited since this BRFA was filed. Bots may not edit outside their own or their operator's userspace unless approved or approved for trial. AnomieBOT ⚡ 22:48, 31 December 2013 (UTC)
 * Sorry, but someone placed a test (diff) on the requests page, so I assumed that I was allowed to run. I have reviewed what the bot did. I've disabled the bot until someone responds. Also, I've tagged the redirect for deletion, even though it seems like a good redirect. — Carnivorous Bunny (talk) 05:12, 1 January 2014 (UTC)
 * While I think the functionality is of marginal value, I don't see it as harmful per se. Please consider how your bot will respond to a request for an existing page to redirect elsewhere. Josh Parris 07:04, 2 January 2014 (UTC)
 * I've run the bot on three edits from this list. Each individual edit seems fine to me. — Carnivorous Bunny (talk) 01:08, 3 January 2014 (UTC)


 * Just for the sake of not making excess edits, it would be better if it only updated the requests page after each "run" rather than after every single edit. Mr.Z-man 02:55, 3 January 2014 (UTC)


 * I've updated the bot to address the concerns and suggestions raised above. — Carnivorous Bunny (talk) 17:19, 3 January 2014 (UTC)
 * I note that you haven't demonstrated the redirect templating capability you mentioned. What template do you think would be appropriate for List of newspapers published in Alaska? Josh Parris 21:31, 7 January 2014 (UTC)

Getting auto-confirmed status isn't really that hard. One could easily wait a while and then make the bot spam thousands of pages without extra checks. — HELL KNOWZ  ▎TALK 17:25, 3 January 2014 (UTC)
 * You don't even need to be autoconfirmed to create pages yourself; a registered account is enough for createpage. However, some limits on how many pages get created without oversight from the operator would be a good idea. Josh Parris 21:31, 7 January 2014 (UTC)
 * Yeah, but creating them manually is hard work. Spamming a list of 5k common English words is "amazing" vandalism. — HELL KNOWZ  ▎TALK 21:35, 7 January 2014 (UTC)
 * I'll file this again in a few days after I do a code rewrite.  CarnivorousBunny talk • contribs  12:07, 8 January 2014 (UTC)
 * The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.