Wikipedia:Bots/Requests for approval/DASHBot 4


 * The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.

DASHBot 4
Operator: Tim1357

Automatic or Manually assisted: Automatic

Programming language(s): Python, Automator, and AppleScript (I'm pretty lazy :))

Source code available: Nope.

Function overview: Mark pages that are tagged for deletion as patrolled.

Links to relevant discussions (where appropriate): Discussion at New Pages Patrol

Edit period(s): Continuous

Estimated number of pages affected: 1 in 25 new pages.

Exclusion compliant (Y/N): N

Already has a bot flag (Y/N): N (but it needs one to function better)

Function details:
 * 1) Get a large batch of new pages (using the query API).
 * 2) Get the list of articles that are marked with an AFD, CSD, or PROD tag (using a category scan, because I was getting mad at the API).
 * 3) Get the rcid for each page in the intersection of the new pages and the tagged articles, using the query API.
 * 4) Patrol those pages (using index.php instead of api.php).
 * 5) Wait 5 minutes.
 * 6) Go again.

Note: I was going to do this all in Python, but I could not figure out how to configure cookies, so I interface the Python script to the API with Automator (it's messy, but it gets it done). Then, to loop it, I have to use AppleScript (Automator does not have an infinite-loop option).
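For illustration only, the six steps above can be sketched in pure standard-library Python. The bot's actual source was never published, so every endpoint parameter and helper name here is an assumption; notably, the cookie handling the note struggled with can be done with `http.cookiejar`, avoiding the Automator workaround entirely.

```python
import json
import time
import urllib.parse
import urllib.request
from http.cookiejar import CookieJar

API = "https://en.wikipedia.org/w/api.php"  # assumed endpoint

# A cookie-aware opener: http.cookiejar keeps session cookies across
# requests, which is the part the original note found hard in Python.
opener = urllib.request.build_opener(
    urllib.request.HTTPCookieProcessor(CookieJar()))

def api_get(params):
    """Helper: GET api.php and decode the JSON response."""
    url = API + "?" + urllib.parse.urlencode({**params, "format": "json"})
    with opener.open(url) as resp:
        return json.load(resp)

def fetch_new_pages():
    """Step 1: a batch of recent page creations with their rcids."""
    data = api_get({"action": "query", "list": "recentchanges",
                    "rctype": "new", "rcnamespace": "0",
                    "rcprop": "title|ids", "rclimit": "500"})
    return {c["title"]: c["rcid"] for c in data["query"]["recentchanges"]}

def pages_to_patrol(new_pages, tagged_titles):
    """Steps 2-3: intersect the new pages with the deletion-tagged
    titles, keeping the rcids that still need patrolling."""
    return {t: rcid for t, rcid in new_pages.items() if t in tagged_titles}

def main():
    """Steps 5-6: poll forever, five minutes apart (call to run)."""
    while True:
        new = pages_to_patrol(fetch_new_pages(), set())
        # the tagged titles would come from a category scan (step 2);
        # each surviving rcid would then be patrolled (step 4).
        time.sleep(300)
```

The intersection in `pages_to_patrol` is the core of the design: it keeps the bot from patrolling new pages that nobody has tagged for deletion.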

Discussion
I need the ability to patrol pages and a bot flag. (This is because currently the API limits me to 500 of the most recent pages, and I want to do more.) Tim1357 (talk) 17:29, 22 December 2009 (UTC)
 * Comment I agree with the proposed function of this bot, but I am not competent to comment on the method. JohnCD (talk) 10:07, 23 December 2009 (UTC)
 * I like the idea because it proposes a solution to a vexing problem - if you mark a new page with one of those you can't mark it patrolled.--otherlleft 12:31, 23 December 2009 (UTC)

 * Is there a reason you are using index.php instead of api.php? I am not highly competent in that area, but I was under the impression the API was preferred.  MBisanz  talk 06:36, 27 December 2009 (UTC)
 * Yes. The simple reason is that index.php requires only a title and an rcid (with cookies automatically taken care of), while api.php requires a title, an rcid, and a token. Plus, there is no prebuilt function in any of the Python packages (pywikipedia or wikitools) to do this. If a lot of pages were going to be affected by this, I would build my own API function. However, because of the limited number, I figured it would be OK to use index.php. Tim1357 (talk) 06:55, 27 December 2009 (UTC)
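For reference, a minimal sketch of the api.php route being discussed, i.e. the extra token step that index.php avoids. This uses the modern `meta=tokens` workflow, which postdates this 2009 discussion; the function names are illustrative, not Tim's code.

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"  # assumed endpoint

def get_patrol_token(opener):
    """Fetch a patrol token via the modern meta=tokens query."""
    url = API + "?" + urllib.parse.urlencode(
        {"action": "query", "meta": "tokens", "type": "patrol",
         "format": "json"})
    with opener.open(url) as resp:
        return json.load(resp)["query"]["tokens"]["patroltoken"]

def build_patrol_post(rcid, token):
    """The extra piece api.php needs over index.php: the token
    travels with the rcid in the POST body of action=patrol."""
    return urllib.parse.urlencode(
        {"action": "patrol", "rcid": str(rcid),
         "token": token, "format": "json"}).encode()
```

With the token in hand, the patrol itself is a single POST of the encoded body, so the objection above is really about the token bookkeeping rather than the call itself.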
 * Hmm, okay, seems reasonable.  MBisanz  talk 08:27, 27 December 2009 (UTC)
 * Thanks, one question: Do I stop after whatever comes first? Tim1357 (talk) 16:23, 27 December 2009 (UTC)
 * You can stop whenever you think you have a good sample of what the bot will do when approved.  MBisanz  talk 19:44, 27 December 2009 (UTC)

--Tim1357 (talk) 02:10, 28 December 2009 (UTC)
 * I've done a few so far. Stop me any time. Otherwise I will stop in 3 more days. Tim1357 (talk) 06:10, 29 December 2009 (UTC)
 * The weather has been a bit troublesome here, and my internet has been down a lot. This is the reason for the on-and-off performance of the bot. Anyway, take a look at the log here. Tim1357 (talk) 03:03, 31 December 2009 (UTC)

 Approved.  MBisanz  talk 04:36, 1 January 2010 (UTC)
 * The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.