Wikipedia talk:Bots/Requests for approval/Archive 12

HostBot 4, increase from 100 invites
Per the recent HostBot 4 approval, we have been sending 100 Wikipedia Adventure invites per day since last Wednesday. So far, we have had no technical problems, and positive feedback about the game is coming in. (One editor intends to use TWA in a second education program course. Another education program member is adapting the game for a sandbox tutorial.) Several editors have played through the game entirely, yielding a very positive Learning pathway and final product. Technically and functionally it's all looking good.

Our playthrough rate is about 1 for every 150 invites. This is entirely normal for an online invitation. It tracks the same rate as the Teahouse invites did. ''Now that we know what the rate is, in order for us to demonstrate statistically significant impact we will need to increase the number of invites during the trial period. I'd like to request we raise the daily invite level from 100 per day to 300 per day. (500 would be even better.)''

These editors are pre-selected for having good-faith editor patterns by the WP:Snuggle machine-learning algorithm. We exclude editors who have never made an edit, and also exclude those who are blocked. We have been monitoring activity from these new editors and continue to check on whether or not they're causing any maintenance hassles for experienced editors.

We'd like to run the increased invites per day for the full 4 weeks of the trial period. This will permit us to run a full analysis of the data in the timeframe of the Individual Engagement Grant, the final report for which is due January 1st. Thanks for considering this. Cheers, Ocaasit | c 17:57, 25 November 2013 (UTC)
 * Hey, just wanted to loop you in to this discussion.  Cheers, Ocaasit | c 22:28, 25 November 2013 (UTC)
 * Given the tumbleweed response from your last poke of the community, I think you'd be welcome to spam every new editor without limit, if that took your fancy. Josh Parris 22:33, 25 November 2013 (UTC)
 * Because of the timeframe, and our desire to gather meaningfully more data during the collection period, I would like to increase invites this week. Of course, we'll be mindful of any new technical issues or objections and not 'overdo it'.  In this trial phase we only want enough data to draw meaningful conclusions; this is not the time to promote the game to the broadest possible audience (hopefully good results will permit us to do that in the next phase). I will notify WP:VPR about the plan to increase the invite rates.  Thanks for your input again! Ocaasit | c 22:44, 25 November 2013 (UTC)
 * The invite-expansion-Pump-post-update is here, for future reference. - J-Mo  Talk to Me   Email Me  19:10, 10 December 2013 (UTC)

Propose revocation of approval of BattyBot 24
The discussion at Help talk:Citation Style 1/Archive 86#BattyBot24 shows no consensus for the bot's edits, which are breaking working citations. I propose the approval of BattyBot 24 be revoked. Jc3s5h (talk) 14:24, 1 January 2014 (UTC)
 * Thought it was 26? --Rschen7754 18:46, 1 January 2014 (UTC)


 * In the first paragraph of Help talk:Citation Style 1/Archive 86 GoingBatty links the words "this previously approved request" to the approval for BattyBot 24. The issue in the discussion is whether it is appropriate to delete cite template parameters such as "author = Staff" by bot. One issue is whether there is consensus for such deletions. Another issue is that such deletions will break links between Harvard inline citations and the full bibliography entry, and also between short footnotes and the full bibliography entry. An issue not mentioned in the discussion, but which I now bring up, is that "author = Staff" is meaningful and probably intended by the editor who created the citation, while the examples given in the bot approval indicate only "chaff" would be removed, that is, obvious mistakes that no editor would have put in if the editor had noticed it. Jc3s5h (talk) 19:05, 1 January 2014 (UTC)


 * While I haven't seen any examples of where any of the bot's edits actually broke any citations with sfn or similar templates, the fact that it COULD do so is very troubling. Therefore, I will be manually reverting each of the bot's edits that commented out "Staff".  I've posted the bot's find and replace rules below in the hope we can have further discussion on how to keep the bot task running to remove the "chaff" without breaking anything.  For example, I would want to skip all pages that contain sfn.  Thanks!  GoingBatty (talk) 22:05, 1 January 2014 (UTC)
 * Correction, I would want to skip all pages that contain a template that starts with "sfn" or "harv", including those listed at Template:Sfn. GoingBatty (talk) 18:00, 2 January 2014 (UTC)


 * Would you please state which version of regular expressions the bot uses, so the rules can be interpreted with greater confidence? Also, it might be helpful to add a plain-English description of the intent of the rule for the benefit of those not familiar with regular expressions. Jc3s5h (talk) 18:15, 2 January 2014 (UTC)
 * The bot uses AWB - I'll ask which version of regular expressions it uses. I've added plain-English examples of the matches, as you requested.  GoingBatty (talk) 02:22, 3 January 2014 (UTC)

AWB uses .NET Framework 3.5 regular expressions. GoingBatty (talk) 20:03, 3 January 2014 (UTC)
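For readers unfamiliar with find-and-replace rules of this kind, here is a sketch of the two behaviors discussed above: commenting out an `author=Staff` parameter, and skipping any page that uses an sfn/harv template. It is written in Python, whose `re` syntax overlaps with .NET's for patterns this simple; the patterns are purely illustrative and are not BattyBot's actual rules.

```python
import re

# Hypothetical illustration (NOT BattyBot's actual rules): match a literal
# "author=Staff" parameter inside a citation template.
STAFF_PARAM = re.compile(r"\|\s*author\s*=\s*Staff\s*(?=[|}])")

# Skip check proposed above: detect any template whose name starts with
# "sfn" or "harv" (e.g. {{sfn}}, {{harvnb}}), so the page is left alone.
HARV_TEMPLATE = re.compile(r"\{\{\s*(?:sfn|harv)", re.IGNORECASE)

def process_page(wikitext):
    """Return edited wikitext, or the original text unchanged if the page
    uses Harvard-style templates and must be skipped entirely."""
    if HARV_TEMPLATE.search(wikitext):
        return wikitext  # skip the whole page, as GoingBatty proposed
    # Comment out the parameter, the way the bot's edits commented out "Staff"
    return STAFF_PARAM.sub(lambda m: "<!--" + m.group(0) + "-->", wikitext)
```

The skip-the-whole-page check is deliberately coarse: it avoids breaking {{sfn}}/{{harv}} cross-references at the cost of leaving some "chaff" in place on those pages.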


 * Thanks, I think I have the book for that on my shelf. Jc3s5h (talk) 00:39, 4 January 2014 (UTC)


 * I think it is premature to delete the author = staff parameter because discussion is still ongoing. Also, I don't think the original bot approval applies to text intentionally composed by an editor, rather than "chaff". Jc3s5h (talk) 18:41, 2 January 2014 (UTC)


 * Looking over the expressions, they seem quite specific, and unlikely to have any matches that GoingBatty didn't intend. I am concerned that it would be quite confusing for editors to observe that citations without ref=harv have the author parameter changed while citations with ref=harv are left alone. Jc3s5h (talk) 03:48, 7 January 2014 (UTC)
 * Not sure I understand your confused editor scenario. My proposal is that if a page has citations with ref=harv (and therefore a harv related template), the bot would ignore the entire article, so as not to cause confusion.  However, if your point is that editors might be confused to see that the bot edited one article but not another, I hope they would come visit the bot's user page (where I would post a clarification explaining why it doesn't edit pages with ref=harv), and I would answer any questions that the confused editor asked on my talk page.  GoingBatty (talk) 04:37, 7 January 2014 (UTC)
 * Yes, my concern is that editors might be confused by the bot acting differently in different articles. I also think that the situation should first be resolved by the ongoing discussion, and then the result of the discussion documented at Help:Citation Style 1 and at the pages for the templates where articles by staff are most likely to come up, probably cite news and, because cite magazine redirects to it, cite journal. By the way, I don't have any particular preference about how to deal with institutional authors, I just think any well-developed style should have some specified approach. Jc3s5h (talk) 16:30, 7 January 2014 (UTC)

BAGbot status
– Per AnomieBOT status, the Bot Approvals Group bot, BAGbot, which updates BAG/Status, shows status "error" from its last run @ 2014-01-23 18:48:19, and there is no next run scheduled. Does this require human attention? Wbm1058 (talk) 14:07, 24 January 2014 (UTC)
 * Might have been my "fault". — HELL KNOWZ  ▎TALK 14:58, 24 January 2014 (UTC)
 * Actually, it was a problem with the database on Tool Labs. It just required a restart after that was fixed. Anomie⚔ 23:40, 30 January 2014 (UTC)

Reopen an expired bot request
Can anyone please reopen the BRFA request at Bots/Requests for approval/CLT20RecordsUpdateBot, which has been marked as expired? I have declared that the trial of the bot is complete, even though it has not made the required number of edits, as it will not edit again for at least 6 months. jfd34 ( talk ) 10:09, 10 February 2014 (UTC)

Operator refuses to publish opt-out instructions
ReferenceBot went through a BRFA, identified itself as exclusion compliant, and was then flagged. A recent RfC about bot messages prompted me to review the opt-out instructions for several bots that leave messages. I noticed that all of them except ReferenceBot had instructions on how to opt out. I posted a note at User_talk:A930913 requesting that those instructions be added to the user page per WP:BOTCONFIG, but the operator is refusing to publish how to opt out, even though the BRFA was approved as exclusion compliant. Werieth (talk) 18:48, 20 March 2014 (UTC)
 * Have you tried adding to your talk page?  GoingBatty (talk) 21:39, 22 March 2014 (UTC)
 * I've never triggered the bot and really don't have a need to opt out. However, an RfC prompted me to look into it. That should work, but for someone not familiar there is zero documentation on the bot's user page. Werieth (talk) 00:14, 23 March 2014 (UTC)
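For context, the standard opt-out mechanism for exclusion-compliant bots is the {{bots}}/{{nobots}} template convention: a page can deny all bots, deny specific ones, or allow only a listed few. A minimal sketch of that check follows, assuming the documented template forms; this is an illustration of the convention, not ReferenceBot's actual code.

```python
import re

def may_edit(page_text, bot_name):
    """Sketch of the {{bots}}/{{nobots}} exclusion check an
    exclusion-compliant bot performs before posting to a page.
    Illustrative only; not ReferenceBot's actual implementation."""
    # {{nobots}} bans all bots from the page.
    if re.search(r"\{\{\s*nobots\s*\}\}", page_text, re.IGNORECASE):
        return False
    # {{bots|deny=Name1,Name2}} or {{bots|deny=all}}
    m = re.search(r"\{\{\s*bots\s*\|\s*deny\s*=\s*([^}]*)\}\}",
                  page_text, re.IGNORECASE)
    if m:
        denied = [n.strip().lower() for n in m.group(1).split(",")]
        return "all" not in denied and bot_name.lower() not in denied
    # {{bots|allow=Name1,Name2}} or {{bots|allow=all}}
    m = re.search(r"\{\{\s*bots\s*\|\s*allow\s*=\s*([^}]*)\}\}",
                  page_text, re.IGNORECASE)
    if m:
        allowed = [n.strip().lower() for n in m.group(1).split(",")]
        return "all" in allowed or bot_name.lower() in allowed
    return True  # no exclusion template present: editing is allowed
```

The point of the complaint above is not that this mechanism is missing, but that nothing on the bot's user page tells an ordinary editor that it exists or how to use it.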


 * Since the bot operator is refusing to address a basic issue of providing opt-out instructions for a bot that is exclusion compliant, I am requesting a revocation of permission until either such instructions are posted or the bot passes a BRFA which is clearly marked as non-compliant. Werieth (talk) 17:49, 31 March 2014 (UTC)

BattyBot 25
I request the approval of BattyBot 25 be reconsidered. The description of the bot only mentions CS1 (which is Help:Citation Style 1). The rules for date format are described there. But the bot is also being used to change Citation templates. That style has hardly any date format restrictions; the users of that template have never been approached about whether they favor these restrictions, until this RFC which began less than 24 hours ago. Jc3s5h (talk) 03:13, 1 May 2014 (UTC), link fixed 4:03 UT.
 * WL doesn't work, so it's not possible to locate the RfC. --  Ohc  ¡digame! 03:49, 1 May 2014 (UTC)
 * Note that the bot task stated: "Fix incorrect date formats in citation templates to remove articles from Category:CS1 errors: dates." The version of the bot's code that was posted before the bot approval clearly showed that the bot would be editing citation templates.
 * I believe your main issue is that citation templates should not be included in this CS1 category. I appreciate that you opened up the RfC (and fixed the wikilink) to discuss that issue.  If the result of the RfC is that the error checking is removed from the citation template, then I will be happy to adjust the bot's code accordingly.  Thanks!  GoingBatty (talk) 23:17, 1 May 2014 (UTC)
 * P.S. I would be happy to focus my attention on other bot tasks if they would be considered. I have been waiting for a response to Bots/Requests for approvals since April 13, and am waiting to submit other requests until after #28 is considered. GoingBatty (talk) 23:20, 1 May 2014 (UTC)

renewing a bot
Hello

I have a question regarding renewing an approval for a bot which we got in 2011. The archived discussion about the bot is posted at: Bots/Requests_for_approval/InstructorCommentBot. We would like to renew this bot since we are going to start a new phase of our work in this area. The specification of the bot will stay exactly the same as the one we have already been approved for. I would appreciate your guidance on this process.

Rostaf (talk) 13:49, 16 May 2014 (UTC)
 * The request you link to is not for an approved bot, but for a bot approved for trial. Did you have another request in mind? If not, please do file a new request and mention the old request in the discussion field. - Jarry1250 [Vacation needed] 08:19, 17 May 2014 (UTC)

Resurrecting DeltaQuadBot
I'm considering resurrecting some of DQ's bot functions. Specifically for now, I'd start with UAA Bot reported usernames. Would I need a new BRFA? I already have User:TPBot.--v/r - TP 05:22, 12 June 2014 (UTC)
 * Yes, it's certainly good practice to run a new BRFA in these cases, to allow consideration of the task's continued relevance, the new operator, and any tweaks or updates that could/should be made. It would be unlikely to result in a rejection, in any case. - Jarry1250 [Vacation needed] 14:06, 12 June 2014 (UTC)

Copy and paste detection bot
We have built a copy and paste detection bot based on WP:Turnitin. We are planning on running it on a few edits. It makes no changes to main space but just puts the results at WP:MED/Copyright. From what we understand, we do not need approval for this sort of work; please advise if we are wrong. There is support for the idea in a number of places. We hope to have a proof of concept soon. It will only run on medical articles initially. Doc James (talk · contribs · email) (if I write on your page reply on mine) 14:24, 10 August 2014 (UTC)
 * It's best to have all bots be approved! That being said, your bot may not need a bot flag, and should have no trouble getting a trial approved. —  xaosflux  Talk 15:35, 10 August 2014 (UTC)
 * Am not running the bot myself. Just involved with its development. Will write more details and apply in full when home. Doc James  (talk · contribs · email) (if I write on your page reply on mine) 07:56, 11 August 2014 (UTC)

Contradiction
This issue was originally brought up here, but I'm moving it to this page, since this is the more appropriate venue for this portion of the discussion.


 * According to this page: "If you want to run a bot on the English Wikipedia, you must first get it approved."
 * According to WP:BOTAPPROVAL: "Operators may carry out limited testing of bot processes without approval...".

I think this contradiction needs to be addressed. I believe that the bot approval wording should take precedence, as limited testing by read-only bots or bots that edit only in their own user space is generally harmless and should not need approval. (Not to mention, as pointed out to me by a WP admin long ago, unless a server admin is looking at usage logs, or a checkuser is looking at the user agent, nobody's ever going to be able to tell if a read-only bot is running anyway.)

If we go that route, a simple tweak to this page's lead to "you must first get it approved, except as noted at WP:BOTAPPROVAL" or something along those lines would do the job. – RobinHood70 talk 17:02, 16 October 2014 (UTC)
 * I don't find this contradiction very interesting: in the vast majority of cases, a bot must be approved to be run. Confusing that with "except for very limited testing in WP:SANDBOX, or bots that edit only their or their owner's userspace (unless otherwise disruptive), or read-only bots, or bots that have been approved for a trial run by a BAG member, etc." is more likely to lead to well-meaning newbies deciding that their particular trial run is ok and getting themselves blocked than it is to clarify things for people who've actually read and understood WP:BOTPOL.
 * I would support a change of the target of the "approved" link on this page from Bots/Requests for approval/Approved to Bot policy, but that's as far as I would go. Anomie⚔ 15:10, 17 October 2014 (UTC)
 * That's fine too. The very first sentence there makes it clear when approval is needed and when it's not. – RobinHood70 talk 20:10, 17 October 2014 (UTC)
 * I went ahead and did that then. Anomie⚔ 00:44, 18 October 2014 (UTC)

When is bot approval needed?
I've made a series of 45 edits to take certain articles out of Category:Pages using duplicate arguments in template calls. I did the first few in the edit window, and then switched to AWB to do the others ([//en.wikipedia.org/w/index.php?title=2005_in_Pride_FC&diff=631374578&oldid=630708815 sample edit]). I kept my editing rate low and checked each diff. I've identified another list of 185 pages that could benefit from the same fix; I would like to work through those a bit faster if possible.

Did the series of 45 edits need bot approval? Would 185 edits? The rule at WP:BOTASSIST isn't much help: "In general, processes that are operated at higher speeds, with a high volume of edits, or are more automated, may be more likely to be treated as bots for these purposes" -- John of Reading (talk) 21:49, 27 October 2014 (UTC)
 * I'm a bit out-of-date on current consensus, but I'd say a reasonable rule is: as soon as you're editing so fast you can't check all the edits by hand, you need approval; as soon as you're editing so fast that you're screwing up people's watchlists, you need a bot flag too. --ais523 22:13, 27 October 2014 (UTC)
 * Here goes then: Bots/Requests for approval/John of Reading Bot. -- John of Reading (talk) 08:56, 28 October 2014 (UTC)

Need bot approval for running pywikibot scripts on English Wikipedia
Hi, I would like to use my bot account, RahmanuddinBot, which was denied bot access earlier. I would be using category.py to retrieve a few lists of articles into my sandbox pages. Please approve it for a period of 25 days.--Rahmanuddin Shaik (talk) 22:52, 13 November 2014 (UTC)
 * Your prior request, Bots/Requests for approval/RahmanuddinBot, was denied because it was not possible: you were requesting access for tewiki, not enwiki. If your bot is only going to be reading your sandbox and does not require a flag, you can just do it; if you need flagging for high API use, please submit an enwiki request. —  xaosflux  Talk 00:37, 14 November 2014 (UTC)
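For reference, pywikibot's category.py ultimately issues a standard MediaWiki API `list=categorymembers` query against the wiki. A sketch of the request it effectively builds follows; the URL construction here is for illustration only (category.py itself handles authentication, continuation, and throttling for you).

```python
from urllib.parse import urlencode

def categorymembers_query(category, limit=50):
    """Build the MediaWiki API URL for listing a category's articles,
    the query that pywikibot's category.py issues under the hood.
    Illustrative sketch; in practice you would just run category.py."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": "Category:" + category,
        "cmlimit": str(limit),
        "cmnamespace": "0",  # restrict to the article namespace
        "format": "json",
    }
    return "https://en.wikipedia.org/w/api.php?" + urlencode(params)
```

Because this is a read-only query, no bot flag is needed for the retrieval itself at modest rates, which is the point xaosflux makes above.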

Review of expired request for APersonBot's second request
The BRFA for APersonBot's second request was declined due to inactivity. Since I have returned (mwahaha) and plan to run the bot on Labs, can the bot request be approved? APerson (talk!) 19:34, 28 November 2014 (UTC)
 * I have reopened the request, so responses should go there. APerson (talk!) 01:45, 29 November 2014 (UTC)
 * Quick note: I've relisted the BRFA on the main page - you might want to stick the template on it if you don't hear anything within a couple of days. Reticulated Spline (t • c) 10:48, 29 November 2014 (UTC)

User:EranBot
EranBot has been running successfully on medical content for 6 months. It has picked up more than 200 cases of copyright violations and, for the issues it flags, is right more than half the time.

I would like to request permission to expand the content this bot runs on beyond medical content. Not only will approval be required before this occurs, but we must also check with our partners at Turnitin, and we need to develop mechanisms to address all the flagged concerns.

Should I start a new bot request approval? Or what is the proper process for expanding the scope of a bot's work?

Doc James (talk · contribs · email) 17:48, 18 December 2014 (UTC)
 * As this is a significant change that may need community input please just make a second bot request, same as if you were adding a new task. — xaosflux  Talk 17:56, 18 December 2014 (UTC)
 * Thanks User:Xaosflux will do. Doc James  (talk · contribs · email) 19:29, 18 December 2014 (UTC)

Merge From
Any objections to deprecating Bots/Requests for approval/Adminbots? I don't see any use in it having its own subpage. — xaosflux  Talk 03:20, 5 January 2015 (UTC)
 * I object. I have that page watchlisted because I want to see what is being proposed as an automated administrator-level account, but I don't have the patience to wade through WP:B/RfA every day just to look for the few Adminbot requests a year. NW ( Talk ) 05:22, 5 January 2015 (UTC)
 * There's nothing to actually merge here: Bots/Requests for approval/Adminbots exists only as a short notice of adminbot requests while the actual transclusion and approvals process uses Bots/Requests for approval just like any other request. Anomie⚔ 12:20, 5 January 2015 (UTC)
 * OK You two have already convinced me :D Withdrawn. —  xaosflux  Talk 13:54, 5 January 2015 (UTC)

Archives sorted and cleaned up
FYI. I have put the archives back in chronological order, and broken up the massively bloated Archive 4, which had grown to over a million bytes. Recent discussions were moved from Archive 1 to Archive 12. – Wbm1058 (talk) 01:26, 11 February 2015 (UTC)

BAGbot seems to be stuck
– Per AnomieBOT status, the Bot Approvals Group bot, BAGbot, which updates BAG/Status, shows status "running" and the next run is scheduled for 2015-02-09 22:48:52. That would be yesterday. Is it stuck in a loop, or waiting for something that's down? Thanks, Wbm1058 (talk) 01:35, 11 February 2015 (UTC)
 * It's apparently stuck on something. I'll restart it in a little bit, but first I'm going to try to figure out what it's stuck on. Anomie⚔ 14:08, 11 February 2015 (UTC)

Request for re-examination Bots/Requests_for_approval/StanfordLinkBot
I see a serious lapse on my side: I failed to respond to the comments by Josh Parris and Maralia, and would like to respond to them now. I was pleasantly surprised by the level of detail in which the edits were analysed by Josh. Although I am not an experienced Wikipedia editor with thousands of edits under my belt, I would still like to improve Wikipedia using the best of my talents. The major concerns were with the exact process of link addition rather than the links themselves. I believe this is something which can be fixed.

The way we generate links is that the links in a source page are suggested on the basis of people using that page as a waypoint towards the target pages, across many instances. Thus a link at that place would have helped people reach the target page faster.

One error with this approach is that humans do not automatically consider the wiki linking guidelines while navigating. However, this issue can be mitigated by filtering links which refer to countries and originate from Category:Featured articles. We can work further on blacklisting certain types of links based on input from experienced editors.

Another error with this approach is that finding the correct anchor text for the link itself can be tricky. The approach we use here is to look at all possible anchors (words where a link can be inserted) for the target page across Wikipedia, filter them so that the anchors are not ambiguous (that is, the probability of the anchor linking to other pages is small), and then look for the first occurrence of any unambiguous anchor, since the link should be introduced as early as possible in the page. However, we exclude section headings as per the linking guidelines. The trade-off here is between the least ambiguous link and its location in the page. We are working on a better weighting formula for combining these two factors instead of simply filtering by ambiguity and taking the first occurrence from the remaining candidates. However, such a formula is harder to justify.
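The selection step described above can be sketched as follows. The data structure (a map from anchor text to an ambiguity score) and the 0.1 threshold are assumptions for illustration, not the bot's actual representation or value:

```python
def first_unambiguous_anchor(page_text, anchors, max_ambiguity=0.1):
    """Sketch of the anchor-selection step: keep only candidate anchors
    whose probability of pointing at some other page is small, then take
    the one occurring earliest in the page. 'anchors' maps anchor text ->
    ambiguity (probability the anchor links elsewhere); the threshold is
    an illustrative assumption."""
    lowered = page_text.lower()
    best_pos, best_anchor = None, None
    for anchor, ambiguity in anchors.items():
        if ambiguity > max_ambiguity:
            continue  # filtered out as too ambiguous
        pos = lowered.find(anchor.lower())
        if pos != -1 and (best_pos is None or pos < best_pos):
            best_pos, best_anchor = pos, anchor
    return best_anchor
```

This also reproduces the Cello behavior discussed below: if "European classical music" is an unambiguous anchor (it redirects to the target) and appears earlier than, or containing, "classical music", the longer phrase wins.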

This explains why "European" was included in the link text to classical music in Cello. The reason is that European classical music redirects to classical music across Wikipedia, which makes it an unambiguous anchor. And it is the first unambiguous anchor to appear in the article.

The second issue with this is that, although the bot can make multiple edits to the same page, it does not link pages which were not on the paths taken by people. So the edit to Aromatic hydrocarbon linked to carbon dioxide, but not carbon monoxide, because we had data about carbon dioxide and not carbon monoxide. The common thread arising here is similarly positioned terms in the sentence, and although we don't have supporting data, we are working on ways to identify such links and introduce them as well.

Another concern was the skipping of possible anchors. For example, the bot chose to bypass Harvard Graduate School of Education and instead pipe-link Harvard to Harvard University. However, the manual of style for linking states that links should not be placed in the boldface reiteration of the title in the opening sentence of a lead, which was specifically hard-coded into the rules inside the bot.

Yet another concern is the context in which the anchor appears. For example, we wanted to link Raphael to Renaissance, but the first occurrence of "renaissance" was preceded by the word "high". We shall take care to exclude anchors which, in conjunction with preceding or succeeding words, can link to other pages.

However, it is hard to understand the context of the sentence: for example, the soybean edit links to Dairy, which is actually about a dairy products factory, instead of Dairy product. Or in sugar, where Protein was linked instead of Protein (nutrient). This is harder than all the other issues, and we have not been able to find a fitting solution for it.

Given that I am inexperienced in editing, I would like to work together with expert editors to augment the bot with AI rules, such that most of the edits are not shoddy and the utility is maximized. Ashwinpp (talk) 17:49, 15 March 2015 (UTC)


 * I think this is an area that you need strong AI for, and we just don't have that yet. All the mechanical questions are easy enough to sort out, but as you say the problems with Dairy and Protein are difficult.  In fact, they're so difficult I'm willing to bet that normal humans wouldn't notice/appreciate the problem - you need experienced editors. I think that was demonstrated with your attempt to validate your data using the Mechanical Turk.
 * As a guide for next steps, you could go down the semi-automated editing route that User:Dispenser went down in solving disambiguation: http://dispenser.homenet.org/~dispenser/view/Dab_solver Josh Parris 21:59, 20 March 2015 (UTC)


 * So, to rephrase and ensure that I'm getting the correct idea: first we need to distinguish between easy vs. difficult links. The easy edits correspond to the mechanical questions, and the difficult links correspond to the ambiguous links. It is possible that the majority of the links have some ambiguity and fall under the difficult category. This distinction can be made on the basis of the output of dab_solver.py and a threshold on the relevancy scores. The difficult links would then be inserted with the help of experienced editors. Ashwinpp (talk) 04:13, 24 March 2015 (UTC)

Request for re-examination Bots/Requests for approval/Theo's Little Bot, Task 1
How does one judge the "low resolution" requirement of our Non-free content policy? Here are some things I consider - the size of the source image, the detail in the source image, the quality of the scan and encoding. What things does Theo's Little Bot consider? Only whether it's greater than 0.1 megapixels, an arbitrary number suggested by Non-free_content. And if it is greater, and it's in Category:Wikipedia non-free file size reduction requests, the answer is to reduce it to 0.1 megapixels regardless of its content. This degradation in quality is a one way process, as previous versions are deleted, users cannot revert, information is lost.
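For concreteness, the reduction the bot performs amounts to scaling both dimensions by the same factor, √(max_pixels / area), so the area lands at 0.1 megapixels. A sketch of that arithmetic (illustrative only, not Theo's Little Bot's actual code, and rounding details may differ):

```python
import math

def reduce_to_limit(width, height, max_pixels=100_000):
    """Scale both dimensions uniformly so the image area drops to at most
    max_pixels (the 0.1 megapixel figure the bot targets). Illustrative
    sketch of the arithmetic, not the bot's implementation."""
    area = width * height
    if area <= max_pixels:
        return width, height  # already within the limit; nothing to do
    scale = math.sqrt(max_pixels / area)
    return max(1, int(width * scale)), max(1, int(height * scale))
```

Note how a 275 × 396 image (about 0.109 MP) fails the area test by a hair and shrinks only to roughly 263 × 379 under this rounding: exactly the kind of near-pointless reduction, with the original then deleted, that is catalogued below.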

A look through the bot's uploads shows examples such as -
 * File:Drchaos boxart.jpg - An already low resolution file pointlessly resized from 275 × 396 to 263 × 378. Note the source file itself was previously manually reduced.
 * File:All Men Are Mortal, 1946 French edition.jpg - An already low resolution file pointlessly resized from 278 × 440 to 251 × 397. This file is actually free-use (Template:PD-text), so you could revert it.
 * File:Slacker-logo-black-official-2015.png - This file is actually free-use (Template:PD-text), but you can't revert it, because previous versions have been deleted. No one is reviewing bot uploads.
 * File:Endless Legend Box art.jpg - Correctly reduced once. Pointlessly bot reduced again.
 * File:Fantastic Four 48.jpg - This has been reduced, and thus re-encoded 3 times. Give it a few years and it'll just be artifacts.
 * File:Drakengard2ground gameplay.jpg, File:Drakengard2aerial gameplay.jpg - Muddy, low resolution with compression artifacts. Images reviewed as part of the good article process.  Pointlessly reduced to further muddy, lower resolution and introduce more artifacts.
 * File:Thor-272.jpg - User:Theo's Little Bot fights User:DASHBot.

What does Template:Non-free reduce mean? It used to mean that the current file is too large. Now it means that the current file needs to be 0.1 megapixels. For many editors, it still means the former. Editors will not consider whether the file should be 0.1 megapixels, only that it is too large - this is the intuitive expectation, regardless of bot implementation. Right now, there is no way to tag images for reduction and have the resulting image be larger than 0.1 megapixels. Even when an image is unambiguously high definition (I'm told File:Dear Esther Screenshot Large.jpg was at 1080p), tagging it would be a mistake - because you're left with a resulting file so small as to fail its rationale.

I propose we stop TLB's task 1 from running, and only restart it when we have a better process and implementation in place to stop its pointless overuse and the resulting errors shown above. - hahnchen 21:48, 15 February 2015 (UTC)
 * The bot is working fine, but editors need to recognize the reduce tag should only be for images far outside the range. That's a behavior problem, not a bot problem. --Masem (t) 22:45, 15 February 2015 (UTC)
 * Read the second-to-last paragraph. Images can be correctly tagged, and still have negative results. You could place an onus on the revision-deleting administrator to review the image, but that is not done - of the files I checked, not once was a downscale reverted. It's clear that either the process is ignored, or has never been put in place. - hahnchen 23:02, 15 February 2015 (UTC)
 * If the rationale has no explanation of why a higher-than-normal resolution image is provided (and in the range of ~0.2 MP or higher), it is completely appropriate to tag it with a non-free reduce template and let the bot handle it. It is up to the image uploader or those using the image to make sure that if a larger image is needed, it is justified in the rationale. If there is no rationale for the larger image, then it needs to be resized. --Masem (t) 23:36, 15 February 2015 (UTC)
 * The tag is correct; it was a file that needed reduction. Needing reduction and "needing reduction to 0.1 megapixels" are different things. - hahnchen 23:41, 15 February 2015 (UTC)
 * The rationale does not give any reason why the image needs to be larger than 0.1 MP, so someone reviewing images and seeing a full 1080p image has no reason to question the need to reduce. --Masem (t) 00:24, 16 February 2015 (UTC)
 * Needing reduction and "needing reduction to 0.1 megapixels" are different things. Admins are not reviewing the resized images; hence the example above of the reduction of File:Slacker-logo-black-official-2015.png, which in actuality is a free-use file.  The resized File:Dear Esther Screenshot Large.jpg is unsuited to show the game's graphics as specified by the rationale and article image caption.  Those are absolutely the reasons why the file has to be larger than 0.1 MP; rationales have never explicitly stated why images are larger than the 0.1 MP suggestion.  File:Warlugulong - Zoom.png is used in featured article Warlugulong; it is nearer 0.2 MP, and the rationale is suitable and justifies the actual size of the upload, not an imaginary number.  I considered all the questions I posed right at the top of this section when I uploaded the image, but I have no confidence that the tagger, bot, or admin would do the same. - hahnchen 11:49, 16 February 2015 (UTC)
 * The Esther image has a poor rationale to justify "the graphics to the game are important", particularly when the image started at 1080p resolution. ("To show off the beautiful graphics" is not a proper rationale - in contrast to the Warlugulong image, where the rationale explains that the detail is discussed by sources in the article.) You're blaming the bot for something it is only told to do by editors well aware of NFC policy's minimal use requirement. --Masem (t) 15:55, 16 February 2015 (UTC)
 * Are you confident that if File:Warlugulong - Zoom.png is tagged, it will get reverted? Do you believe that, had I not stepped in, File:Starcommand2013battle.jpg would still be a useful image? That too was reduced to 0.1 MP.  If not, we do not have a suitable process in place for the operation of this bot, and we should stop this task until we do. - hahnchen 16:43, 16 February 2015 (UTC)

-- Magioladitis (talk) 22:50, 15 February 2015 (UTC)

It might be worth giving the template a size parameter or two.

Also maybe it would make sense for the bot to skip items which would be reduced by less than 50%. All the best: Rich Farmbrough, 18:26, 17 February 2015 (UTC).


 * Note that File:All Men Are Mortal, 1946 French edition.jpg, which was erroneously tagged as non-free, has been reduced and the previous versions have been deleted. Admins do not review the downsized files; and how could they, given the amount of pointless reduces that are forced through the system? - hahnch e n 20:31, 20 February 2015 (UTC)
 * Again, the bot cannot understand images outside of their digital nature. It can't figure out free vs non-free, and this looks like a case where the uploader opted to select non-free, unsure of the best solution. Thus, the person who put the tag on without questioning whether it should be free is the behavior to discourage, not the bot. --M ASEM (t) 20:36, 20 February 2015 (UTC)
 * Masem, everyone reading this page already knows how the bot works. We can use the guillotine to trim hair too, but you can't defend accidental decapitations with "all a guillotine does is cut stuff".  In this case, the bot and the process surrounding it compounded an error.  There is no oversight at any point, the tagger, bot and admin do not review the image or its use.  Until we have that process in place, we should not be operating the bot. - hahnch e n 11:04, 21 February 2015 (UTC)

According to WP:Bot policy, this is where bots are re-assessed. Yet neither the bot author nor any of the Bot Approvals Group have made any comment. Aside from the discussion above, there's also this discussion at the Village Pump. The general consensus is that the current implementation is flawed, and there are multiple easy-to-implement fixes. I don't want to waste my time with shit such as this that the bot enables. - hahnch e n 01:45, 25 May 2015 (UTC)
 * Proposal - The bot should resize images to 0.15 megapixels and ignore smaller images.
 * Rich suggests ignoring images larger than 0.2MP. Masem suggests that the tag should only be used for images "far outside the range".  Instead of reducing images to 0.1MP and ignoring close cases, it should just reduce to 0.15MP.  Bots cannot make any of the value judgements that editors can.  The bot should be conservative: if the image needs resizing below 0.15MP, that should be a manual process, where the right questions can be asked.  The bot should place images below 0.15MP into Category:Wikipedia non-free file size reduction requests for manual processing and notify the tagger. I'd be fine with 0.2MP too, or reducing to 0.15MP only those files which are larger than 0.2MP. - hahnch e n 11:04, 21 February 2015 (UTC)
 * Proposal - Uploaders must be notified when Template:Non-free reduce is tagged to their image.
 * While savvy users may watchlist their images like User:Darkwarriorblake at File:Dredd2012Poster.jpg, many reductions take place with editors oblivious until it's too late. This could be automated, or you could place the onus onto the taggers, I would prefer the former. - hahnch e n 11:04, 21 February 2015 (UTC)
 * Proposal - Once tagged, the bot does not reduce images for a week.
 * This gives time for manual reduction or tag removal by the uploader. - hahnch e n 11:04, 21 February 2015 (UTC)
 * Proposal - Bots (and taggers) should ignore previously reduced files.
 * File:Thor-272.jpg was a completely ridiculous reduction where Theo bot fights Dash bot. The Dashbot-reduced version was 325×502, which is 0.16MP.  (I think this is because Dashbot looked at image width as well, conservatively allowing for larger thumbnail preferences.)  The original was clearly low resolution; any further reduction is needless.  Can the bot parse file history comments?  If so, and the previous version was already marked as reduced, it should place the file into Category:Wikipedia non-free file size reduction requests for manual processing and notify the tagger. - hahnch e n 11:04, 21 February 2015 (UTC)
 * Proposal - Admins to actually review the resized files.
 * This should be happening already. The proposals above, if implemented, should result in a reduction of the pointless resizes and make reviewing files more manageable. - hahnch e n 11:04, 21 February 2015 (UTC)
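For concreteness, the resize arithmetic behind these proposals can be sketched as follows. This is a hypothetical illustration, not the bot's actual code; the 0.15 MP target and 0.2 MP skip threshold are the figures proposed above, and the function name is made up:

```python
def plan_reduction(width, height, cap_mp=0.15, skip_above_mp=0.2):
    """Return a new (width, height) at or under the megapixel cap,
    or None if the image is small enough that any reduction should
    be left for manual review instead of bot action."""
    pixels = width * height
    if pixels <= skip_above_mp * 1_000_000:
        return None  # borderline or already small: no bot action
    # Scale both dimensions uniformly so the area lands at the cap.
    scale = (cap_mp * 1_000_000 / pixels) ** 0.5
    return int(width * scale), int(height * scale)
```

Under these numbers, a full 1080p screenshot would still be reduced, while the already-reduced 325×502 Thor cover (0.16 MP) discussed above would be left for human judgement.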

SamoaBot 2
While I agree a BRFA might be overkill for ~30 edits, and I'm glad the typos were fixed, I'm a bit disappointed by the outcome of the request. Magioladitis mentioned WP:SPELLBOT, apparently ignoring that I explicitly stated the 'bot' would have been manually operated. Denying a BRFA within a few minutes of the first reply, when the task has actually been carried out by someone else, is not a great display of sensitivity. -- Ricordi  samoa  19:27, 7 May 2015 (UTC)
 * I read the BRFA, and the responses looked reasonable to me. There was no indication in the request that the bot would be able to avoid false fixes of text like "the composer's name is sometimes misspelled as 'Giuseppi Verdi'." There was also no indication of which namespaces the bot would operate in; archived pages may have been undesirable to edit with this bot. By the time all of that would have been worked out, the fixes could already have been done manually. And they were.


 * At this point, it's water under the bridge. If you had proposed 3,000 edits instead of 30, I would have more sympathy toward your disappointment. – Jonesey95 (talk) 19:56, 7 May 2015 (UTC)
 * "Manual" means I would have checked every typo, including the title of the page it occurred in and the surrounding context. And of course I would not have edited discussion pages. Please note that I'm not challenging the closure in itself. It's a matter of tact; it's about building a friendly space for potential bot operators: I'm afraid the English Wikipedia fails in regard to this. -- Ricordi  samoa  10:53, 8 May 2015 (UTC)

You can always request an alternative account per WP:VALIDALT. We do not give out bot flags for single-use tasks that involve such a small number of edits and should only be done manually. There are many editors performing spell-checking on Wikipedia from their normal accounts. You are welcome to help with that. -- Magioladitis (talk) 11:20, 8 May 2015 (UTC)
 * Had you written this explanation straight in the BRFA, it would have been perfect ;-) -- Ricordi  samoa  13:28, 8 May 2015 (UTC)

ArmbrustBot 4

 * The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.

The operator of ArmbrustBot 4 seems to be unwilling to comply with WP:Bot policy, see User talk:Armbrust. Please revoke the bot's permission. --Francis Schonken (talk) 20:13, 9 August 2015 (UTC)
 * The intention of the policy is not to forbid bots from recategorising biographical articles into broader categories. "1 BC births" is a subset of "0s BC births", and the latter can thus be safely substituted by a bot; if 1 BC is an erroneous categorisation, so is 0s BC, and they will both be as a result of human error. Alakzi (talk) 20:37, 9 August 2015 (UTC)
 * Well, the bot operator interpreted it as meaning that besides recategorizing the "birth by year" cat to the parent "birth by decade" category (which is unproblematic), they should add a category like "1 BC" to the same biographical article, which had never been in a "year" category (and which would not conform to WP:COP) – which imho is neither the outcome of the CfD discussion, nor allowed by bot policy. "Dual" upmerges in this sense are not allowed on biographical articles by bot policy. --Francis Schonken (talk) 20:48, 9 August 2015 (UTC)
 * So there is no "human error" involved apart from the bot operator's (adding unallowed categories that are not the result of a recategorization of a pre-existing category), and an error they won't recognize nor act upon. For this unwillingness, please remove the bot's permission. --Francis Schonken (talk) 20:55, 9 August 2015 (UTC)
 * Please provide links to diffs of actual problem edits. Anomie⚔ 20:58, 9 August 2015 (UTC)
 * Diff – problematically adds Category:15 BC (besides the unproblematic Category:15 BC births → Category:10s BC births recategorization).
 * Afaics all today's edits at have the same problem. --Francis Schonken (talk) 21:06, 9 August 2015 (UTC)
 * I see nothing wrong with that edit. Both categories are clearly implied by the now-deleted combined category that was being replaced. Anomie⚔ 21:33, 9 August 2015 (UTC)
 * That's not a violation of policy, either; it is contrary to the WP:COP guideline, but it's not what the bot policy's intended to protect against. If a recategorisation into years wasn't intended, why was a double upmerge proposed to begin with, and why did none of the CfD participants nor the closer comment on it? It seems unfair to lay the blame squarely on the bot operator. Alakzi (talk) 21:12, 9 August 2015 (UTC)
 * I don't say the blame is "squarely on the bot operator" (I have asked the closer of the CfD discussion to review their closure )
 * What I say is that the bot operator is unwilling to see a problem, and even more unwilling to work towards a solution, which I think is reason enough not to let this operator operate this bot. Look, I'm not a bot operator, and when a bot operator lets his bot perform tasks that should not have been done in the first place (whoever gave part of the permission to do it), then the bot wins if the bot operator puts his umbrella out and claims high and low they did nothing wrong... unless you people here say no to such irresponsible behaviour of the bot operator.
 * And indeed, adding Category:15 BC by bot is against bot policy (WP:Bot policy: "Assignment of person categories should not be made using a bot"), which you should also know, rather than reacting with indignation when I point it out. --Francis Schonken (talk) 21:21, 9 August 2015 (UTC)
 * It's not even clear to me that the policy you link applies: depending on what is meant by "person category", Category:15 BC may not even be one. Anomie⚔ 21:33, 9 August 2015 (UTC)
 * How is it different from adding births in decade categories - which you appear to have no issue with - from a policy perspective? You've not explained WP:COP to the bot operator, which is the actual point of contention. If you'd done that before coming here, rather than accusing them of flouting policy, this matter might've been resolved hours ago. Alakzi (talk) 21:37, 9 August 2015 (UTC)


 * The discussion above is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.


 * Can I reopen? A comment was added after closure of the discussion, and I'd like to reply to it without changing venue. --Francis Schonken (talk) 21:40, 9 August 2015 (UTC)
 * No. Take discussion elsewhere if you want. Anomie⚔ 21:41, 9 August 2015 (UTC)

Adding a form for BRFA
Sick of filling out forms in wikitext, I made a DRN-style form that you can test out by installing User:APerson/easy-brfa.js and then going to WP:Bots/Requests for approval/request. Feedback is welcome; eventually, it would be nice if we had a button on WP:BRFA that went straight to the request subpage. APerson (talk!) 23:33, 10 September 2015 (UTC)
 * Personally, I don't see the need. BRFAs are fairly easy to fill out. Even on my first one, the commented notes guided me through the entire thing step-by-step, and I had no problems. Has there been concern anywhere that BRFAs are difficult to fill out? ~ RobTalk 06:50, 15 September 2015 (UTC)
 * It's less about the difficulty of filling out the wikitext form than it is convenience. The wikitext editor wasn't intended for filling out forms, and as I filled out my most recent one, I had to constantly check whether I was typing in the form text or a comment instead of where my response was supposed to go. I didn't have any features like autocompletion or tabbing to the next form field, either. I don't find BRFAs very complex to start; it's just that an HTML form instead of a wikitext one is a nicer experience, in my opinion. APerson (talk!) 21:31, 15 September 2015 (UTC)
 * With WP:SILENT in mind, I've added a button to the WP:BRFA header that uses the script I wrote. Feel free to revert it if you want. APerson (talk!) 02:15, 24 September 2015 (UTC)
 * I can't say I'm very fond of having a button that just leads to a page saying some random userscript needs to be installed. Why not have your user script insert the button when it's enabled? Anomie⚔ 01:03, 25 September 2015 (UTC)
 * And the button should come after the "Before applying for approval" section. Legoktm (talk) 01:15, 25 September 2015 (UTC)
 * It also shouldn't be any part of (or remotely confused to be part of) any official process (in direct contrast to APerson's edit that could easily be confused as part of the official process). WP:AN3's header, for example, makes it very clear that one can use a tool/script to make it easier, but it in no way is confusable as being inherently part of the reporting process.  Furthermore, I agree with Anomie's solution:  the user script should self-generate any/all buttons related to it. -- slakr  \ talk / 05:38, 25 September 2015 (UTC)

Bot operator approval list?
How does one find out if an editor is running an approved bot or has permission to run a bot? I ask because the edits of seem to be suspiciously bot-like, but I can't tell if this is authorized or not. --Calton | Talk 09:04, 2 October 2015 (UTC)

P.S.: I'm also going to ask at the Village Pump. --Calton | Talk 09:05, 2 October 2015 (UTC)


 * For that specific editor, it appears more like semi-automated behavior than automated behavior in my opinion. Irregular times between edits, 3-4 minute gaps between different types of edits (setup time), and a time card that indicates sleep all point in that direction. ~ RobTalk 09:15, 2 October 2015 (UTC)


 * As a general rule, if a bot has approval to run, the specifically allowed tasks will be under Special:PrefixIndex/Wikipedia:Bots/Requests_for_approval/Username. If a bot has approval to run at all, it'll (probably) have the bot user right, which you can check at Special:ListUsers/bot. —  Earwig   talk 15:19, 2 October 2015 (UTC)
 * Also, going to agree with Rob here that the edits don't strike me as very bot-like. Edits like this one are reasonable if Time, Inc. was the preferred name (it's not), because they correct an inaccuracy. Most of his edits seem fine; it looks like he tried to correct that mistake after you brought it up. —  The Earwig   talk  15:40, 2 October 2015 (UTC)

BRFA page
In the last 3 sections, named ″Approved requests″, ″Denied requests″ and ″Expired/withdrawn requests″, it would be better to have links to the request pages, not (only) bot account userlinks (botlinks). The first column of this table looks more useful to me than all the simple userlinks in these sections. --XXN, 17:14, 26 October 2015 (UTC)
 * The first link after the bot name, "tasks", goes to the request page. —  Earwig   talk 17:36, 26 October 2015 (UTC)
 * True. Sorry, my bad. I misread and ignored that label (in Userlinks the first position is ″talk″; here it is "tasks") and didn't check the link :( --XXN, 18:02, 26 October 2015 (UTC)
 * Understandable. "Tasks" is also a bit unclear since it implies a general list rather than the specific approval. Maybe we can tweak that. —  Earwig   talk 18:51, 26 October 2015 (UTC)

Looking for advice/Is it a bot
I am writing a program to auto-update https://en.wikipedia.org/wiki/List_of_countries_by_percentage_of_population_living_in_poverty using the World Bank API. https://github.com/utilitarianexe/wiki_poverty_list_bot If I use this to generate the markup but do not auto-post, does this still need approval? Also, how do I cite the program or let other editors of the page know that this is how the data is being generated? Would it be better to make it truly auto-update periodically as a full-on bot? Thanks for any help. Lonjers (talk) 00:22, 3 December 2015 (UTC)
 * No approval is needed if you are manually posting it (this would be considered more like a script than an automated bot, I suppose). Feel free to explain the nature of the program briefly in the edit summary; you might consider creating a page in your userspace with some info about it, including the GitHub URL, that you can link to in the edit summary. I don't see a pressing reason to make this a bot given how often I imagine the data is updated, but if you think it should be automatic after doing it for a while, we can go down that road then. —  Earwig   talk 00:30, 3 December 2015 (UTC)
 * To clarify one point: When manually posting it, you are expected to be reviewing the edit to make sure it really is sensible just as you would with any other edit, not just blindly hitting "save page" on the script's output. Anomie⚔ 13:19, 3 December 2015 (UTC)

Thanks for the help. I finished the first table using output from the program, if you want to check it out. I will also be working on updating the rest of the page, and maybe hunting for other things that need updating like this. If you have any suggestions for similar projects, let me know. Maybe post on my wiki user page. Lonjers (talk) 02:44, 5 December 2015 (UTC)
 * If you want to let it run automatically and periodically, then yes, it would be a bot. As far as how to cite: programs don't make edits - editors do (or editor-controlled bots do).  You could write up at least a description of your program and include it in the edit summaries.  As far as actual "citations" go, your edits should incorporate a citation to the actual source of the data. —  xaosflux  Talk 03:45, 5 December 2015 (UTC)
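As a sketch of the workflow Lonjers describes (fetch indicator data from the World Bank v2 API and emit wikitable rows), something like the following could work. The table layout and function names are illustrative, not from the linked repository; the SI.POV.DDAY indicator code and the mrnev/per_page query parameters come from the World Bank API, whose v2 endpoints return each JSON page as a [metadata, records] pair:

```python
import json
from urllib.request import urlopen

# Poverty headcount ratio at the international poverty line,
# most recent non-empty value per country (mrnev=1).
WB_URL = ("https://api.worldbank.org/v2/country/all/indicator/"
          "SI.POV.DDAY?format=json&mrnev=1&per_page=400")

def rows_from_payload(payload):
    """Turn one World Bank v2 JSON page ([metadata, records]) into
    wikitable rows, skipping countries with no reported value."""
    _, records = payload
    rows = []
    for rec in records or []:
        if rec["value"] is None:
            continue
        rows.append("|-\n| {} || {} || {}".format(
            rec["country"]["value"], rec["value"], rec["date"]))
    return rows

def fetch_rows():
    with urlopen(WB_URL) as resp:
        return rows_from_payload(json.load(resp))
```

Per Anomie's point above, rows generated this way would still be pasted in manually and reviewed before saving, not posted blindly.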

Dead links bot
Regarding the dead links bot (i.e. User:Cyberbot_II): some concerns have been raised by multiple users about how it handles bare URLs, and the response from the author is that this is how it was designed and approved. How do we open new discussion/concerns about an approved active bot? -- Green  C  16:19, 10 February 2016 (UTC)
 * If you want to bring up concerns about bots, here or usually WP:BON is the venue. There's no dedicated page or process for this. Most interested parties would be watching either. Usually these get resolved on talk pages or here or, if all fails, ANI. — HELL KNOWZ  ▎TALK 16:31, 10 February 2016 (UTC)

From the archive (formerly Update these instructions w.r.t. AnomieBOT)
I was recently redirecting subpages of various talk pages, and this query from four years ago hasn't been answered yet: Part III seems somewhat dated. In particular, after you add BotTrialComplete:
 * 1) The instructions tell the bot operator to move the record to the 'trial complete' section, but
 * 2) AnomieBOT does this automatically.

I guess that AnomieBOT probably does similar maintenance in other situations. If this is a mature capability of AnomieBOT, perhaps the instructions should be revised. Blevintron (talk) 15:42, 19 April 2012 (UTC) Any comments? APerson (talk!) 04:08, 5 April 2016 (UTC)

Edit verification help
Hello, I'm seeking a volunteer to verify if 22 edits made by my bot look OK to you. The list can be found here. Thanks for any feedback. -- Green  C  18:13, 14 April 2016 (UTC)

Page is mad
Can you take a look at this page and the included templates - something is breaking the page - see the errors in the approved requests (sometimes missing) and denied requests sections. — xaosflux  Talk 21:40, 20 April 2016 (UTC)
 * (Not sure why I was fingered as the one who broke everything.) The problem is that we exceeded the post-expand include size limit, so some templates were left unexpanded. We currently have 16 BRFAs running, which is more than I've seen for a while, so that's probably the culprit. The only fix I can think of right now is to remove some transcluded BRFAs, which of course isn't ideal at all; on the other hand, the number of trial-completed BRFAs also seems quite high, so maybe a BAG member or two could look into closing them. APerson (talk!) 01:28, 21 April 2016 (UTC)
 * *cough* I think I may know what the cause is. *cough* —cyberpower Chat :Limited Access 02:04, 21 April 2016 (UTC)
 * You can thank Cyberbot II 5a for transcluding a massive page related to the trial. It's grown to a massive size.  I've now changed it to a link, which has fixed the problem. :p—cyberpower Chat :Limited Access 02:11, 21 April 2016 (UTC)
 * Thank you both - maybe we need a banner - Hey BAG guys, if you don't work this page the whole thing will ASPLODE! — xaosflux  Talk 02:17, 21 April 2016 (UTC)
 * YES PLEASE!!—cyberpower Chat :Limited Access 02:19, 21 April 2016 (UTC)