Wikipedia:Bots/Requests for approval/BetacommandBot 3


 * The following discussion is an archived debate. Please do not modify it. Subsequent comments should be made in a new section. The result of the discussion was Approval withdrawn.

BetacommandBot
Operator: User:Betacommand

Automatic or Manually Assisted: Auto

Programming Language(s): Python

Function Summary: Anti-spam work

Edit period(s) (e.g. Continuous, daily, one time run):

Edit rate requested: Unknown; edits as needed

Already has a bot flag: Yes

Function Details: Currently BCBot does linksearches and posts the results to WPSPAM subpages under my account. These anti-spam statistical tasks may slowly branch out to cover more, but they will stay in my userspace WPSPAM subpages.
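For readers following along, here is a minimal illustrative sketch of the core task (run a linksearch, render the hits as a wikitext report suitable for a WPSPAM subpage). It uses the modern MediaWiki API's list=exturlusage, the API counterpart of Special:Linksearch; the endpoint, domain, and function names are assumptions for illustration, not BetacommandBot's actual code.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"  # assumption: English Wikipedia endpoint

def linksearch(domain):
    """List pages that link to *domain* via list=exturlusage,
    the API counterpart of Special:Linksearch."""
    params = {
        "action": "query",
        "list": "exturlusage",
        "euquery": domain,
        "euprop": "title|url",
        "eulimit": "500",
        "format": "json",
    }
    headers = {"User-Agent": "linksearch-sketch/0.1"}  # courtesy UA for WMF servers
    resp = requests.get(API, params=params, headers=headers)
    return resp.json()["query"]["exturlusage"]

def report_wikitext(domain):
    """Render the hits as a simple wikitext report, one bullet per page."""
    hits = linksearch(domain)
    lines = [f"=== {domain} ===", f"{len(hits)} link(s) found."]
    lines += [f"* [[{h['title']}]] - {h['url']}" for h in hits]
    return "\n".join(lines)

print(report_wikitext("example.com"))  # placeholder domain
```

Posting the report to a subpage would additionally require an edit token, omitted here for brevity.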

Discussion
Sorry, I can't really understand what you're saying. Can you clarify? —Mets501 (talk) 18:17, 10 March 2007 (UTC)
 * See WikiProject Spam/LinkSearch, WikiProject Spam/Report, and WikiProject Spam/LinkSearch/List. Betacommand (talk • contribs • Bot) 07:35, 12 March 2007 (UTC)
 * Personally, I'm as wise as before. What's it currently doing?  What extension are you proposing?  How's it going to do it?  At first sight this looks like you're requesting approval for a spider to exhaustively search for spam links, but I'm hoping that's not the case.  Alai 07:34, 13 March 2007 (UTC)
 * It's not a spider. Users who are part of WPSPAM identify spammed domains; BCBot tracks those domains, assists in identifying common targets for a given domain, and provides a linksearch history for a given spammer. Betacommand (talk • contribs • Bot) 18:00, 13 March 2007 (UTC)
 * I'm still struggling to follow your explanation (if explanation that was). Looking at the history of WikiProject Spam/LinkSearch/zazzle.com, I see that BCB a) created the initial list, b) periodically deletes items, and c) periodically adds new items.  Are all three of these activities covered by this BRFA?  If so, how are you doing each?  The b) part seems clearest, as I assume it's done by periodically checking to see if each page on the list has been deleted or despammed, but if a) and c) were to be automated, I'd be guessing as to how.  (Come to that, I'm rather guessing here in general, and would prefer "Function Details" that wasn't so unclear, if not open-ended, in the first place.)  Alai 03:19, 14 March 2007 (UTC)
 * Step 1: anyone in #wikipedia-spam who finds a spammer requests that BCBot do a linksearch. If BCBot gets a hit, it A) adds the domain to WikiProject Spam/LinkSearch/List, B) creates a subpage with the linksearch results, and C) updates WikiProject Spam/Report.
 * Step 2: as a daily task, it reads /List and updates all reports. If a report shows zero links, the domain moves to /Holding 1; before doing so, BCBot applies the same check to /Holding 1 and /Holding 2, moving their zero-link entries on to /Holding 2 and /Old respectively (see the sketch below). /Old contains all websites that haven't been spammed within the last 72 hours.
 * BCBot is used to track spammed domains and assist in identifying them and removing their spam. Betacommand (talk • contribs • Bot) 03:29, 14 March 2007 (UTC)
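To pin down the staged /List, /Holding 1, /Holding 2, /Old flow described in Step 2, here is a minimal sketch of a daily aging pass under that interpretation; count_links() is a stub and every name is illustrative, since the bot's actual source is not part of this request.

```python
def count_links(domain):
    """Stub: return the current number of external links to *domain*
    (e.g. from Special:Linksearch or list=exturlusage)."""
    raise NotImplementedError

def daily_pass(active, holding1, holding2, old):
    """Move each zero-link domain forward one stage per day.
    Stages are processed oldest-first so a domain advances at most
    one stage per pass; reaching *old* means roughly 72 hours link-free."""
    for stage, nxt in ((holding2, old), (holding1, holding2), (active, holding1)):
        for domain in list(stage):  # iterate over a copy; we mutate the list
            if count_links(domain) == 0:
                stage.remove(domain)
                nxt.append(domain)
```

Whether a domain that starts receiving links again should move back to the active list is not specified above, so the sketch only moves domains forward.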
 * I'm awarding myself some points, since my "b)" seems to correspond fairly closely to your "Step 2". That part seems fair enough.  But I'm still struggling with the "do a linksearch" bit.  How's this to be generated?  From the contribs of a given spammer, extracting all additions of a particular domain?  Or what?  Alai 03:37, 14 March 2007 (UTC)
 * It uses Special:Linksearch to generate that data, as it is the least taxing method for our servers. Betacommand (talk • contribs • Bot) 19:48, 14 March 2007 (UTC)
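For reference, Special:Linksearch can also be queried directly by URL; a tiny illustrative snippet follows (the domain is a placeholder, and the bot's exact request format is not documented in this discussion):

```python
# Special:LinkSearch takes its target as a URL parameter; a leading
# "*." matches all subdomains of the given domain.
domain = "example.com"  # placeholder
url = f"https://en.wikipedia.org/wiki/Special:LinkSearch?target=*.{domain}&limit=500"
```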
 * OK, that was a key piece of data I hadn't secured on my own recognizance; thank you.  Aside from the meta-issue of the description still being less than crystal clear, and getting additional information being rather like pulling teeth, I'm not seeing any problems here.  Alai 03:13, 17 March 2007 (UTC)
 * I've noticed these pages already; presumably this application covers that activity which is already happening? If not, then I must have missed the point too.
 * Once a URL is down to 0 external links, is it necessary to still have a page for it (and thus another external link), e.g. WikiProject_Spam/LinkSearch/panthercarclub.com?
 * What's the data going to be used for, and who decides what is spam? I notice that, for example, google is listed, and yet google actually has an interwiki link (google:test). --kingboyk 19:17, 17 March 2007 (UTC)
 * True, google is listed; that is because there was a request for a report of that data. Spam is identified in several ways: mass additions, identification of domains/sites that have been flagged as possible spam and whose links should be monitored, and other methods by which humans identify spam. The bot does not identify spam; all it does is track the usage of the domain over time. In regard to deleting the subpage, that issue has not arisen; having the site in the title is not a link. Betacommand (talk • contribs • Bot) 02:07, 18 March 2007 (UTC)

All seems solid, flag away (looks for b'crat) -- Tawker 03:40, 21 March 2007 (UTC)

He's already got a flag; presumably you meant this? :) --kingboyk 11:56, 21 March 2007 (UTC)


 * For now, I have withdrawn the approval of this bot. When everything is in order again, if you still want to do a similar task, please make a new bot request. —Mets501 (talk) 00:26, 22 March 2007 (UTC)


 * NOTE:  Permission for this bot was removed "Due to the huge controversy surrounding Betacommand's removal of external links". (See this page (at the bottom) and this page for the controversy itself). -- RM 13:00, 22 March 2007 (UTC)


 * UPDATE: This specific request has been rejected permanently, but may be superseded by a future request of the same nature. -- RM 14:20, 28 March 2007 (UTC)


 * The above discussion is preserved as an archive of the debate. Please do not modify it. Subsequent comments should be made in a new section.