Wikipedia:Bot requests


This is an old revision of this page, as edited by Cyberpower678 (talk | contribs) at 06:11, 24 February 2017 (→‎Website suddenly took down a lot of its material, need archiving bot!: Reply). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

This is a page for requesting tasks to be done by bots per the bot policy. This is an appropriate place to put ideas for uncontroversial bot tasks, to get early feedback on ideas for bot tasks (controversial or not), and to seek bot operators for bot tasks. Consensus-building discussions requiring large community input (such as request for comments) should normally be held at WP:VPPROP or other relevant pages (such as a WikiProject's talk page).

You can check the "Commonly Requested Bots" box above to see if a suitable bot already exists for the task you have in mind. If you have a question about a particular bot, contact the bot operator directly via their talk page or the bot's talk page. If a bot is acting improperly, follow the guidance outlined in WP:BOTISSUE. For broader issues and general discussion about bots, see the bot noticeboard.

Before making a request, please see the list of frequently denied bots, i.e. tasks that are either too complicated to program or that lack consensus from the Wikipedia community. If you are requesting that a template (such as a WikiProject banner) be added to all pages in a particular category, please be careful to check the category tree for any unwanted subcategories. It is best to give a complete list of categories that should be worked through individually, rather than one category to be analyzed recursively (see example difference).


Note to bot operators: The {{BOTREQ}} template can be used to give common responses, and make it easier to keep track of the task's current status. If you complete a request, note that you did so with {{BOTREQ|done}}, and archive the request after a few days (WP:1CA is useful here).


Please add your bot requests to the bottom of this page.

Add protection templates to recently protected articles

We have bots that remove protection templates from pages (DumbBOT and MusikBot), but we don't have a bot right now that adds protection templates to recently protected articles. Lowercase sigmabot used to do this until it stopped working about two years ago. I generally think it's a good idea to add protection templates to protected articles so people know (especially logged-in autoconfirmed users, who otherwise would have no idea a page is semi-protected). —MRD2014 (talkcontribs) 13:06, 18 October 2016 (UTC)[reply]

We need those bots because the expiration of protection is usually an automatic process. However, placing the protection has to be done by an admin, and as part of that process the instructions have them add a template that places the little padlock. Thus any protected page should have the little padlock; I don't think many admins forget to do this. For it to be worth a bot to do this, there would have to be a substantial problem - can you show us any? If you can, then I will code and take this on. TheMagikCow (talk) 18:01, 15 December 2016 (UTC)[reply]
@TheMagikCow: Sorry for the late reply, but it's not really a problem, it's just that some administrators don't add protection templates when protecting the page (example), so a logged-in autoconfirmed user would have no idea it's semi-protected or extended-protected unless they clicked "edit" and saw the notice about the page being semi-protected or extended-protected. I ended up adding {{pp-30-500}} to seven articles ([1]). This has nothing to do with removing protection templates (something DumbBOT and MusikBot already do). The adding of {{pp}} templates was previously performed by lowercase sigmabot. —MRD2014 (talkcontribs) 00:34, 28 December 2016 (UTC)[reply]
@MRD2014: Ok, those examples make me feel that a bot is needed for this - and it would relieve the admins of the task of manually adding them. I think I will get Coding... and try to take this one on! TheMagikCow (talk) 10:39, 28 December 2016 (UTC)[reply]
@TheMagikCow: Thanks! —MRD2014 (talkcontribs) 14:41, 28 December 2016 (UTC)[reply]
Or possibly a bot that sends the admin a notice: "It looks like during your protection action on X you may have forgotten to add the lock icon. Please check and add the appropriate lock icon. Thank you" Hasteur (talk) 02:07, 2 January 2017 (UTC)[reply]
Hasteur's suggestion should probably be incorporated into the bot, since reminding admins has the clear benefit of diminishing future instances of mismatched protection levels and protection templates. Enterprisey (talk!) 03:05, 2 January 2017 (UTC)[reply]
OK - Will try to add that - would it be easier if that was a template? TheMagikCow (talk) 11:53, 2 January 2017 (UTC)[reply]
Some admins have {{nobots}} on their talk pages (Materialscientist for example) so the bot couldn't message those users. Also, lowercase sigmabot (the last bot to add protection templates) would correct protection templates too. —MRD2014 (talkcontribs) 17:20, 2 January 2017 (UTC)[reply]
In some cases, there is no need to add a prot padlock, such as when the page already bears either {{collapsible option}} or {{documentation}}; mostly these are pages in Template: space. Also, redirects should never be given a prot padlock - if done like this, for example, it breaks the redirection. Instead, redirects have a special set of templates which categorise the redir - they may be tagged with {{r fully protected}} or equivalent ({{r semi-protected}}, etc.), but it is often easier to ensure that either {{redirect category shell}} or the older {{this is a redirect}} is present, both of which determine the protection automatically, in a similar fashion to {{documentation}}. --Redrose64 🌹 (talk) 12:11, 3 January 2017 (UTC)[reply]
About the notifying admins thing, MediaWiki:Protect-text says "Please update the protection templates on the page after changing the protection level." in the instructions section. Also, the bot should not tag redirects with pp templates per Redrose64. If it tags articles that aren't redirects, it shouldn't have any major issues. —MRD2014 (talkcontribs) 19:26, 3 January 2017 (UTC)[reply]
This would be better as a mediawiki feature - see Wikipedia:Village_pump_(technical)#Use_CSS_for_lock_icons_on_protected_pages.3F, meta:2016_Community_Wishlist_Survey/Categories/Admins_and_stewards#Make_the_display_of_protection_templates_automatic, phab:T12347. Two main benefits: not depending on bots to run, and not spamming the edit history (protections are already displayed, no need to double up). As RedRose has pointed out, we already have working Lua code. Samsara 03:48, 4 January 2017 (UTC)[reply]
TheMagikCow has filed a BRFA for this request (see Wikipedia:Bots/Requests for approval/TheMagikBOT 2). —MRD2014 (talkcontribs) 18:29, 5 January 2017 (UTC)[reply]
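For reference, a minimal pywikibot sketch of the core loop discussed above: scan the protection log, skip redirects per Redrose64's point, and tag articles that carry no {{pp-...}} template. The template chosen, the edit summary and the exact pywikibot method names are assumptions to be checked against the BRFA, not a description of TheMagikBOT's actual code.

import pywikibot

site = pywikibot.Site('en', 'wikipedia')

# Look at recent entries in the protection log (articles only).
for entry in site.logevents(logtype='protect', namespace=0, total=50):
    page = entry.page()
    if not page.exists() or page.isRedirectPage():
        continue  # redirects get {{r ...}} templates instead, not a padlock
    if not page.protection():
        continue  # log entry was an unprotection, or protection already expired
    if '{{pp' in page.text.lower():
        continue  # some protection template is already present
    # {{pp-protected}} is assumed here to detect the level itself; the real bot
    # would pick the specific {{pp-semi}}/{{pp-30-500}}/... template instead.
    page.text = '{{pp-protected}}\n' + page.text
    page.save(summary='Bot: adding protection template to recently protected article')
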

BSicons

Could we have a bot that

  1. creates a daily-updated log of uploads, re-uploads, page moves and edits in BSicons (Commons files with prefix File:BSicon_);
  2. makes a list of Commons redirects with prefix File:BSicon_;
  3. uses that list (as well as a list of exceptions, probably this Commons category and its children) to edit RDT code (both {{Routemap}} and {{BSrow}}/{{BS-map}}/{{BS-table}}) which uses those redirects, replacing the redirect name with the newer name (for instance, replacing (HUB83) with (HUBe) and (STRl) with (STRfq));
  4. goes through Category:Pages using BSsplit instead of BSsrws and replaces \{\{BSsplit\|([^\|]+)\|([^\|]+)\|$1 $2 ([^\|\{\}])+\}\} with {{BSsrws|$1|$2|$3}}; and
  5. creates a list of BSicons with file size over 1 KB.
[Image: the example diagram.]

This request is primarily for #2 and #3, since there've been a lot of page moves from confusing icon names recently and CommonsDelinker doesn't work for BSicons because they don't use file syntax. The others would be nice extras, but they're not absolutely necessary if no one wants to work on them. For clarity, an example of #3 would be changing

{{Routemap
|map=
CONTg\CONTg
BHF!~HUB84\BHF!~HUB82
CONTf\CONTf
}}

to

{{Routemap
|map=
CONTg\CONTg
BHF!~HUBaq\BHF!~HUBeq
CONTf\CONTf
}}

(Pinging Useddenim, Lost on Belmont, Sameboat, AlgaeGraphix, Newfraferz87, Redrose64 and YLSS.) Jc86035 (talk) Use {{re|Jc86035}} to reply to me 08:59, 25 October 2016 (UTC); updated 06:59, 27 October 2016 (UTC)[reply]
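(Not part of the request, just an illustration.) Since #4 already gives the regex, this is roughly what that substitution looks like in Python: the backreferences become \1/\2, and the + in the third group is assumed to belong inside the parentheses so the whole trailing text is captured. Untested against live pages.

import re

# Point #4: {{BSsplit|A|B|A B rest}} -> {{BSsrws|A|B|rest}}
BSSPLIT = re.compile(r'\{\{BSsplit\|([^|]+)\|([^|]+)\|\1 \2 ([^|{}]+)\}\}')

def convert_bssplit(wikitext):
    """Replace {{BSsplit|A|B|A B rest}} with {{BSsrws|A|B|rest}}."""
    return BSSPLIT.sub(r'{{BSsrws|\1|\2|\3}}', wikitext)

# convert_bssplit('{{BSsplit|Foo|Bar|Foo Bar railway station}}')
# -> '{{BSsrws|Foo|Bar|railway station}}'
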

Point 1. should be all BSicon files, regardless of filetype, so that those (occasionally uploaded) .png files also get listed. Useddenim (talk) 10:48, 25 October 2016 (UTC)[reply]
Updated request. Thanks. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 11:42, 25 October 2016 (UTC)[reply]

To further clarify, the regex for #3 is \n\{\{BS[^\}]+[\|\=]\s*$icon\s*\| for BS-map. I have no idea what it'd be for Routemap, but to the left of the icon ID could be one of \n (newline), ! !, !~ and \\ (escaped backslash); and to the right could be one of \n, !~, ~~, !@, __, !_ and \\. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 06:21, 26 October 2016 (UTC)[reply]
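To make that concrete, here is a rough Python sketch of how a botop might apply those separator sets; the separator escaping and the BS-row pattern are only one reading of the description above, so treat it as a starting point rather than anything tested.

import re

# Separators that can appear immediately left/right of an icon ID in
# {{Routemap}} |map= code, per the description above ("\\" is the literal
# backslash used between cells).
LEFT = r'(^|\n|! !|!~|\\)'
RIGHT = r'(\n|!~|~~|!@|__|!_|\\|$)'

def replace_in_routemap(map_code, old, new):
    """Replace icon `old` with `new` wherever it is bounded by Routemap separators."""
    pattern = re.compile(LEFT + re.escape(old) + RIGHT)
    return pattern.sub(lambda m: m.group(1) + new + m.group(2), map_code)

def replace_in_bs_row(wikitext, old, new):
    """Replace icon `old` in {{BS...}} row templates, using the regex quoted above."""
    pattern = re.compile(r'(\n\{\{BS[^}]+?[|=]\s*)' + re.escape(old) + r'(\s*\|)')
    return pattern.sub(lambda m: m.group(1) + new + m.group(2), wikitext)

# replace_in_routemap('CONTg\\CONTg\nBHF!~HUB84\\BHF!~HUB82\nCONTf\\CONTf',
#                     'HUB84', 'HUBaq')
# -> 'CONTg\\CONTg\nBHF!~HUBaq\\BHF!~HUB82\nCONTf\\CONTf'
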

I started to do some coding for this request, but I will not have time to continue working on it until January. (I have no objections to another botop handling this request before then.) I'm not familiar with the route diagram templates, so I will likely have questions. — JJMC89(T·C) 18:16, 9 December 2016 (UTC)[reply]

Update: I've done most of the coding for this, but I've run into a Pywikibot bug. The bug affects getting the list of redirects to exclude for #3. (If the bug is not resolved soon, I will try to work around it.) See below for example output for #1. (It will normally be replaced daily and only contain the previous day's changes.) #2 and #5 will be simple bulleted or numbered lists. Which would you prefer? Some clarification for #3: for {{routemap}} I replace within |map= based on the separators above, and for any template with a name starting with BS (or [Bb][Ss]?) I replace entire parameter values that match - correct? Example for -BS to v-BSq on Minami-Urawa Station:
@@ -61 +61 @@
- {{BS6|dSTRq- orange|O1=dv-NULgq|STRq- orange|O2=-BS|STRq- orange|O3=-BS|STRq- orange|O4=-BS|STRq- orange|O5=-BS|dSTRq- orange|O6=dv-NULgq|5|← {{ja-stalink|Fuchūhommachi}}}}
+ {{BS6|dSTRq- orange|O1=dv-NULgq|STRq- orange|O2=v-BSq|STRq- orange|O3=v-BSq|STRq- orange|O4=v-BSq|STRq- orange|O5=v-BSq|dSTRq- orange|O6=dv-NULgq|5|← {{ja-stalink|Fuchūhommachi}}}}
Example for #1
{| class="wikitable sortable"
|+ Updated: ~~~~~
! File !! Oldid !! Date/time !! User !! Edit summary
|-
| [[commons:File:BSicon -3BRIDGE.svg]] || 219463995 || 2016-11-25T18:24:57Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -3BRIDGEq.svg]] || 220150188 || 2016-11-26T09:13:44Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -3KRZ.svg]] || 220150226 || 2016-11-26T09:13:47Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -3KRZo.svg]] || 220150264 || 2016-11-26T09:13:50Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -3KRZu.svg]] || 220150305 || 2016-11-26T09:13:52Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -3STRq.svg]] || 220150349 || 2016-11-26T09:13:55Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -BRIDGE.svg]] || 220150391 || 2016-11-26T09:13:58Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -BRIDGE.svg]] || 203687181 || 2016-08-11T08:35:01Z || Jc86035 || Jc86035 uploaded a new version of [[File:BSicon -BRIDGE.svg]]
|-
| [[commons:File:BSicon -BRIDGE.svg]] || 203686022 || 2016-08-11T08:21:46Z || Jc86035 || Jc86035 uploaded a new version of [[File:BSicon -BRIDGE.svg]]
|-
| [[commons:File:BSicon -BRIDGEl.svg]] || 219463892 || 2016-11-25T18:24:47Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -BRIDGEq.svg]] || 219463860 || 2016-11-25T18:24:43Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -BRIDGEq.svg]] || 204432312 || 2016-08-21T08:45:59Z || Jc86035 || Jc86035 uploaded a new version of [[File:BSicon -BRIDGEq.svg]]
|-
| [[commons:File:BSicon -BRIDGEr.svg]] || 219463904 || 2016-11-25T18:24:48Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -BRIDGEvq.svg]] || 219463868 || 2016-11-25T18:24:44Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -BS.svg]] || 203149593 || 2016-08-05T01:20:39Z || Tuvalkin || Tuvalkin moved page [[File:BSicon -BS.svg]] to [[File:BSicon v-BSq.svg]] over redirect: Because that’s how it should be named.
|-
| [[commons:File:BSicon -DSTq.svg]] || 219463968 || 2016-11-25T18:24:54Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -GIPl.svg]] || 190769935 || 2016-03-20T19:57:19Z || Plutowiki || Lizenz
|-
| [[commons:File:BSicon -GIPl.svg]] || 190769739 || 2016-03-20T19:53:46Z || Plutowiki || User created page with UploadWizard
|-
| [[commons:File:BSicon -GRZq.svg]] || 219464007 || 2016-11-25T18:24:59Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -KBSTl.svg]] || 220150428 || 2016-11-26T09:14:01Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -KBSTr.svg]] || 220150461 || 2016-11-26T09:14:03Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -L3STRq.svg]] || 219464058 || 2016-11-25T18:25:05Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -LSTR+l.svg]] || 228341148 || 2017-01-02T00:38:51Z || Zyxw59 || 
|-
| [[commons:File:BSicon -LSTR+l.svg]] || 228341113 || 2017-01-02T00:37:48Z || Zyxw59 || User created page with UploadWizard
|}
I haven't coded for #4 yet. Does it need to be done on a regular basis or only once? — JJMC89(T·C) 02:14, 19 January 2017 (UTC)[reply]
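For what it's worth, the #1 log can probably be pulled straight from the Commons recent-changes feed; a rough pywikibot sketch follows. The dict keys and the client-side prefix filter are assumptions from memory of the recentchanges output and need checking.

import pywikibot

commons = pywikibot.Site('commons', 'commons')

rows = []
# Recent changes in the File: namespace; a start/end window (one UTC day)
# would be added for the real daily run.
for rc in commons.recentchanges(namespaces=[6], total=5000):
    title = rc.get('title', '')
    if not title.startswith('File:BSicon '):
        continue  # page titles use spaces, so this also covers "File:BSicon_..." links
    rows.append('|-\n| [[commons:{}]] || {} || {} || {} || {}'.format(
        title, rc.get('revid', ''), rc.get('timestamp', ''),
        rc.get('user', ''), rc.get('comment', '')))

table = ('{| class="wikitable sortable"\n|+ Updated: ~~~~~\n'
         '! File !! Oldid !! Date/time !! User !! Edit summary\n'
         + '\n'.join(rows) + '\n|}')
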
@JJMC89: (pinging Useddenim, Sameboat and AlgaeGraphix) Many thanks, looks good. I don't think #4 is necessary in retrospect, because it would catch some links to rail lines as well and changing those might be counterintuitive. Not sure about #3, but the whole icon name should be changed, if that's what you're saying. For #3 it might be a good idea to have a blacklist of redirects which shouldn't be changed (or a whitelist), because some icons, including (v-BSq), might have been moved to a bad/incorrect name. It might be better to use numbered lists so we could find the length of the lists easily, but I don't mind if they're bulleted. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 02:40, 19 January 2017 (UTC)[reply]
My preference would be for a bulleted list, as it is easier to manipulate with a text editor (vs. stripping out all of the item numbers). If you need the total number of files, it should be trivial to add a count statement to the bot's code. AlgaeGraphix (talk) 18:40, 19 January 2017 (UTC)[reply]
@AlgaeGraphix: Bullets vs. numbered is just * vs. # as the character at the beginning of each line. @Jc86035: There will be an on-wiki configuration that contains a blacklist of sorts. It will initially exclude c:Category:Icons for railway descriptions/Exceptional permanent redirects (recursively including subcategories). Would using {{bsq}} for #1 and/or #2 & #5, instead of or in addition to the linked file name, be beneficial? Also, if desired, for #1 the table could include the last n days of changes. For the BRFA: Have there been any prior discussions for this? What pages would you like to use for #1, #2, and #5? (I'll put them in userspace if you don't have a place for them in projectspace.) Do you have an estimate of edits per day for #3? — JJMC89(T·C) 05:50, 20 January 2017 (UTC)[reply]
@JJMC89: I guess #1, #2 and #5 could go in your Commons userspace, but I don't really mind. For #1, maybe the log pages could be sectioned into 24-hour periods (starting 00:00 UTC), like Chumwa's Commons new file logs but in tabular format. Using {{bsq}} would be great. I'm not aware of any prior discussions (Useddenim, Tuvalkin, Sameboat?), although CommonsDelinker has never worked well with BSicons and I believe the only bot that previously did this was Chrisbot for a few months in 2009. For #3, there's probably going to be a very large number of edits on the first day (possibly as many as 5,000), but very few after that. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 09:29, 20 January 2017 (UTC)[reply]
@Jc86035: The bot actions as requested, and the suggested split between Commons and the Wikipedias, make sense in my opinion. There were several discussions concerning CommonsDelinker in the past, but the matter is as you presented it. Tuvalkin (talk) 11:09, 20 January 2017 (UTC)[reply]

Commons bot request filed. I will file a BRFA here after that has run its course. — JJMC89(T·C) 00:35, 22 January 2017 (UTC)[reply]

Initial redirect-changing blacklist should include all of these icons; convoluted and boring discussion under way on exactly what's wrong with them or if anything's wrong with them. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 14:03, 22 January 2017 (UTC)[reply]
BRFA filed. — JJMC89(T·C) 04:22, 30 January 2017 (UTC)[reply]
@JJMC89: Just one thing – {{bsq|redirect}} (if it's going to be edited by the bot) should be replaced with {{bsq|new name|alt=redirect}}. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 08:51, 31 January 2017 (UTC)[reply]
@Jc86035: The bot is only editing {{Routemap}} and {{BS.*}} route diagram templates. {{BSicon quote}} ({{bsq}}) is a rail routemap template, so it will be ignored. — JJMC89(T·C) 16:38, 31 January 2017 (UTC)[reply]

VeblenBot

User:VeblenBot handles many of the routine chores associated with Peer Review. It was developed by User:CBM, and is currently in my care, but neither of us has the time or inclination to run it. Would someone be able to take it over? If so, please reply here. Thanks, Ruhrfisch ><>°° 19:06, 20 November 2016 (UTC)[reply]

Much of this is just a task-specific archiving job. It would be possible for someone else to rewrite this in a bot framework of their choice without too much work, instead of taking over the existing code. It's an important task for the Peer Review system, but I can't manage it any longer. — Carl (CBM · talk) 13:05, 21 November 2016 (UTC)[reply]
Are there details anywhere of what exactly the bot does? I'm not finding a relevant-looking BRFA for "many routine chores". Anomie 17:27, 26 November 2016 (UTC)[reply]
@Anomie: I'm also approved for this task, and I also can't maintain it, sadly. My implementation was kind of shit anyway. See Wikipedia:Bots/Requests_for_approval/BU_RoBOT_9 for task details, though. ~ Rob13Talk 05:13, 21 January 2017 (UTC)[reply]
One BRFA is Wikipedia:Bots/Requests_for_approval/VeblenBot_5. The PR system worked by having the bot make a page that tracks category contents. This is a relatively straightforward task: given a list of categories and templates, it generates wiki pages which list the category contents using the templates. By looking at subpages of User:VeblenBot/C/, it should be possible to recreate the list of categories that need to be tracked. There is more information at Template:CF and Wikipedia:Peer_review/Tools#Peer_review_process_-_technical_details. — Carl (CBM · talk) 01:41, 22 January 2017 (UTC)[reply]
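For whoever picks this up, the core of the task Carl describes is small; a hedged pywikibot sketch follows. The per-category formatting template is the part that needs care, so it is reduced to a single illustrative {{CF/...}} call here, and the names in the usage comment are only placeholders.

import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def listify(category_name, target_title, entry_template):
    """Write a page that lists a category's members, one template call per member."""
    cat = pywikibot.Category(site, category_name)
    titles = sorted(page.title() for page in cat.members())
    lines = ['{{%s|%s}}' % (entry_template, t) for t in titles]
    target = pywikibot.Page(site, target_title)
    target.text = '\n'.join(lines)
    target.save(summary='Bot: updating category content listing')

# Placeholder names - check Template:CF for the real calling convention:
# listify('Category:Arts peer reviews',
#         'User:VeblenBot/C/Arts peer reviews',
#         'CF/Arts peer reviews')
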
@BU Rob13: BRFA filed Anomie 01:21, 23 January 2017 (UTC)[reply]
@Enterprisey: Didn't realize Anomie had taken this on. It wouldn't be a horrible thing to have multiple bot ops able to do this, but you may prefer to devote time elsewhere. Entirely up to you. Sorry if you've already started development. ~ Rob13Talk 02:29, 23 January 2017 (UTC)[reply]
No problem, and thanks for letting me know. Enterprisey (talk!) 03:40, 23 January 2017 (UTC)[reply]

Replacement Peer review bot - VeblenBot - URGENT

Peer review generally has 30-50 active reviews. We rely on a single bot, VeblenBot, to process new reviews and archive old reviews, otherwise the whole system crumbles. Unfortunately for the last 3 or so years we have had a large number of problems because the bot is not well supported and frequently is inactive.

We and the thousands of Wikipedians who use peer reviews would be very grateful if a functional replacement bot could be created that works consistently. I can supply more technical details about the process later; it is documented at WP:PR. Many thanks if you can solve this!! --Tom (LT) (talk) 12:57, 30 December 2016 (UTC)[reply]

Tom (LT), to understand what it does: the bot takes as input Category:Arts peer reviews and produces as output User:VeblenBot/C/Arts peer reviews. Does it also remove entries? Does it retrieve data from other places? -- GreenC 19:06, 30 December 2016 (UTC)[reply]
Green Cardamom, see WP:PR tab "technical details" --Tom (LT) (talk) 00:07, 31 December 2016 (UTC)[reply]
I've seen that. It's called the "Tools" tab BTW. -- GreenC 00:54, 31 December 2016 (UTC)[reply]
As it's hosted on Labs, I think the easiest thing would be for User:CBM (who I think is inactive as a bot op) or User:Ruhrfisch to add another keen Perl enthusiast to their Labs project to help out from time to time. Unfortunately I don't do Perl but I know plenty of people watching this page do! - Jarry1250 [Vacation needed] 20:13, 30 December 2016 (UTC)[reply]
@LT910001: At one point, I had taken this on I believe, but I petered out on running the task. That was my fault. Unfortunately, I can't run the bot for the next two weeks because I'm out-of-town. I can run it when I get back, but I'm much busier these days than I used to be, so I probably can't do it long-term. ~ Rob13Talk 23:44, 30 December 2016 (UTC)[reply]
Thanks for your offer; even running the bot infrequently would be better than not at all, but we really need a longer-term solution here. --Tom (LT) (talk) 00:07, 31 December 2016 (UTC)[reply]
@LT910001: Totally fell off my radar over the two weeks I was out of town, but I eventually remembered this. The bot is running now. ~ Rob13Talk 05:11, 21 January 2017 (UTC)[reply]
Thank you BU Rob13, much appreciated!--Tom (LT) (talk) 05:43, 21 January 2017 (UTC)[reply]

Can a bot operator have one of their bots fill in for what this bot above is supposed to do, because I've noticed that the peer review nomination pages haven't been updated since November. -- 1989 (talk) 23:38, 21 January 2017 (UTC)[reply]

Coding... Anomie 01:28, 22 January 2017 (UTC)[reply]
The same is true for the two GAR-related pages handled by VeblenBot: one is at User:VeblenBot/C/Wikipedia good article reassessment and controls the community reassessments that are transcluded at WP:GAR, the main Good Article Reassessment page—we've been adding and subtracting these by hand as community GARs show up at Category:Good article reassessment nominees (which includes both community and individual reassessments), or as the GARs are closed and vanish from the category. The other is at User:VeblenBot/C/Good articles in need of review, and is based on the {{GAR request}} templates and their associated category, Category:Good articles in need of review. This last hasn't been updated since November either. I'm not sure what it would take to get these two VeblenBot chores up and working again, but it would be greatly appreciated. Many thanks. BlueMoonset (talk) 01:57, 22 January 2017 (UTC)[reply]
I've gotten code done to start populating those lists as subpages of User:AnomieBOT/C (since edits to the bot's own userspace don't need a BRFA). The categories to listify are configurable on-wiki if more are needed, see the instructions on that page. Going to look at the replacement for Wikipedia:Bots/Requests for approval/BU RoBOT 9 next. Anomie 03:06, 22 January 2017 (UTC)[reply]
@Anomie: You forgot to make this one, User:AnomieBOT/C/List peer reviews, list articles for PR. -- 1989 (talk) 03:23, 22 January 2017 (UTC)[reply]

Anomie, thanks for helping them with this. It is not a hard task, but I needed to move on to other things. — Carl (CBM · talk) 14:55, 22 January 2017 (UTC)[reply]

Anomie, my thanks, too. Do you or CBM know why the User:AnomieBOT/C/Wikipedia good article reassessment page display omits the last several entries (the ones from 2017)? The same thing was happening on the VeblenBot page, and while it doesn't prevent the page from working correctly with WP:GAR, something doesn't seem to be working as it should. Please let me know when you consider these pages ready to be used officially—when the bot runs on a regular schedule—and I'll adjust the GAR pages accordingly. BlueMoonset (talk) 16:09, 22 January 2017 (UTC)[reply]
@BlueMoonset: All 12 articles named like "Wikipedia:Good article reassessment/" in Category:Wikipedia good article reassessment are showing up on User:AnomieBOT/C/Wikipedia good article reassessment; the answer is probably that some change made the last several entries not be in that category anymore. Anomie 19:06, 22 January 2017 (UTC)[reply]
Also, AnomieBOT is running on a regular schedule already (it checks for updates to the lists hourly). Feel free to change things over. Anomie 19:10, 22 January 2017 (UTC)[reply]
Anomie, thanks for letting me know that AnomieBOT is now handling these pages and checking on an hourly basis. I'll update the affected GAR pages in a few minutes. As for the 2017-dated entries not displaying on the User:AnomieBOT/C/Wikipedia good article reassessment page, they're still in the category; this has been an issue since they were first manually added to the User:VeblenBot/C/Wikipedia good article reassessment page starting back on January 7. It doesn't affect the transclusions on the WP:GAR page, but it's odd that the AnomieBOT page, like the VeblenBot page before it, doesn't display them. This may be something down in the weeds of the CF suite workings; I didn't see that any of the peer review pages use this name-only format, so there may be a parameter somewhere that prevents post-2016 entries from displaying in this one case. BlueMoonset (talk) 19:34, 22 January 2017 (UTC)[reply]
@BlueMoonset: Oh, I see what you're referring to now, they're not showing up in the rendered page. Template:CF/GAR/Default (used by Template:CF/Wikipedia good article reassessment) has logic to only show reassessments that are more than 17 days old. The first 2017 assessment, from Jan 6, should start showing up around 16 hours from now. Anomie 20:39, 22 January 2017 (UTC)[reply]
Anomie, thanks for looking in to it. Presumably there's a good historical reason for the 17 day delay; it's good to know what's causing the display to work as it does. BlueMoonset (talk) 20:57, 22 January 2017 (UTC)[reply]
Adding: it looks like 1989 made the switchover at 03:30, so AnomieBOT has been on the job for over 16 hours at GAR. Thanks again. BlueMoonset (talk) 19:39, 22 January 2017 (UTC)[reply]

Missing BLP template

We need a bot that will search for all articles in Category:Living people that have no {{BLP}} template (or an alternative) on the article's talk page, and add the missing template to those pages. --XXN, 21:21, 20 November 2016 (UTC)[reply]
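A minimal sketch of the requested check, using pywikibot and mwparserfromhell; which template to add ({{BLP}} directly or {{WikiProject Biography|living=yes}}, see the replies below) and the exact set of banners that count as "alternatives" are open questions, so both are placeholders here.

import pywikibot
import mwparserfromhell

site = pywikibot.Site('en', 'wikipedia')
living = pywikibot.Category(site, 'Category:Living people')

# Banners that already imply BLP handling (placeholder list - adjust as agreed).
BLP_BANNERS = {'blp', 'wikiproject biography'}

for article in living.articles():
    talk = article.toggleTalkPage()
    text = talk.text if talk.exists() else ''
    names = {str(t.name).strip().lower()
             for t in mwparserfromhell.parse(text).filter_templates()}
    if names & BLP_BANNERS:
        continue  # already tagged one way or another
    talk.text = '{{BLP}}\n' + text
    talk.save(summary='Bot: tagging the talk page of a living person biography')
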

Ideally, not {{BLP}} directly, but indirectly via {{WikiProject Biography|living=yes}}. But we once had a bot that did that, I don't know what happened to it. --Redrose64 (talk) 10:33, 21 November 2016 (UTC)[reply]
{{WikiProject Biography|living=yes}} adds the biography to Category:Biography articles of living people. TheMagikCow (talk) 18:48, 16 January 2017 (UTC)[reply]
Hi @Redrose64:, what was that bot's name? We faced such a need recently during the Wiki Loves Africa photo contest on Commons: hundreds of pictures in a parent category were missing a certain template. I am planning to build a bot, or adapt an existing one, for similar cases.--African Hope (talk) 17:08, 4 February 2017 (UTC)[reply]
I'll code this. Should the condition be: the article has a living-people category, but the talk page has no {{BLP}}, {{WikiProject Banner Shell}} or {{WikiProject Biography}}? I might expand this to also catch pages that have a 'living people' category but whose banner has no living parameter set either way. Dat GuyTalkContribs 17:28, 4 February 2017 (UTC)[reply]
I don't recall. Maybe Rich Farmbrough (talk · contribs) knows? --Redrose64 🌹 (talk) 00:20, 5 February 2017 (UTC)[reply]
I have been fixing members of Category:Biography articles without living parameter along with User:Vami_IV for some time. Menobot ensures that most biographies get tagged. I also did a one-off to tag such biographies a couple of months ago. All the best: Rich Farmbrough, 00:32, 5 February 2017 (UTC).[reply]

IP-WHOIS bot

During vandal hunting I've noticed that IP vandals usually stop in their tracks the moment you add the 'Shared IP' template (with WHOIS info) to their Talk page. I assume they then realise they're not as anonymous as they thought. A bot that would automatically add that WHOIS template to an IP vandal's Talk page, let's say once they've reached warning level 2, would prevent further vandalism in a lot of cases. I don't know if this needs to be a new bot or if it could be added to ClueBot's tasks. I think ClueBot would be the best option since it already leaves warnings on those Talk pages, so adding the Shared/WHOIS template as well would probably be the fastest option. Any thoughts? Mind you, I'm not a programmer so there's no way I could code this thing myself. Yintan  20:27, 30 November 2016 (UTC)[reply]
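A rough sketch of the lookup-and-tag step, using the ipwhois package for the RDAP lookup; the warning-level trigger, the choice between {{Shared IP}} variants and the RDAP field names are all assumptions here and would need checking before any BRFA.

import pywikibot
from ipwhois import IPWhois

site = pywikibot.Site('en', 'wikipedia')

def tag_ip_talk_page(ip):
    """Add {{Shared IP}} with WHOIS details to an IP's talk page, if not already tagged."""
    talk = pywikibot.Page(site, 'User talk:' + ip)
    text = talk.text if talk.exists() else ''
    if '{{Shared IP' in text:
        return  # already carries a shared-IP style template
    rdap = IPWhois(ip).lookup_rdap()
    owner = rdap.get('asn_description') or rdap.get('network', {}).get('name', '')
    talk.text = '{{Shared IP|%s}}\n%s' % (owner, text)
    talk.save(summary='Bot: adding {{Shared IP}} with WHOIS information')
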

This would be fairly easy to do. Coding... Tom29739 [talk] 17:32, 8 December 2016 (UTC)[reply]
Nice idea, Tom29739 what's the status on this? 103.6.159.67 (talk) 08:04, 16 January 2017 (UTC)[reply]
This is still being coded, development has slowed unfortunately due to being very busy in real life. Tom29739 [talk] 22:40, 18 January 2017 (UTC)[reply]

Update WikiWork factors

Hi, per what was discussed at Wikipedia talk:Version 1.0 Editorial Team/Index#Wikiwork factors, I'm asking that the WikiWork factors for WikiProjects be updated. I myself frequently reference them, they're pretty useful overall to scope out a project, so it would be nice to get them working again. Thanks, Icebob99 (talk) 16:16, 13 December 2016 (UTC)[reply]

Hello? Is this request feasible? Icebob99 (talk) 16:17, 16 December 2016 (UTC)[reply]

Just to give more context: at Wikipedia talk:Version 1.0 Editorial Team/Index we are having a discussion regarding the WP 1.0 bot (talk · contribs). One of the functions of the bot is to update the WikiWork factors displayed in the WikiProject assessment tables. However, the bot stopped updating them in July 2015. Requests to the bot owner have not been answered, since they seem to have retired. Can someone here please see what's the issue with the bot and make it run again? The bot in question which updates the WikiWork numbers is called Theo's Little Bot (talk · contribs). It has been doing other jobs, as can be seen, just skipping the WikiWork updates. —IB [ Poke ] 14:35, 17 December 2016 (UTC)[reply]

Just for reference: here's the manual calculator; you can divide the score you get from that tool by the total number of articles in the project to get the relative score. Getting the bot to do this, of course, would be the ideal scenario. Icebob99 (talk) 00:52, 18 December 2016 (UTC)[reply]
Thanks for the URL @Icebob99:, now I can get the progression of each WikiProject. Do I need to update the WikiWork page to reflect this so that it's assimilated in the project assessment table? —IB [ Poke ] 06:33, 19 December 2016 (UTC)[reply]
@IndianBio: I went from User:WP 1.0 bot/Tables/Project/Microbiology to User:WP 1.0 bot/WikiWork and found in the documentation that there are four different pages that the User:Theo's Little Bot used to update: User:WP 1.0 bot/WikiWork/ww, the overall WikiWork score; User:WP 1.0 bot/WikiWork/ar, the total number of articles in the project; User:WP 1.0 bot/WikiWork/om, the relative WikiWork score; and User:WP 1.0 bot/WikiWork/ta, the table that contains the overall and relative scores. If you look at the history of each of those pages, the bot was updating them until 2 July 2015. You can update those pages manually by inputting the numbers by hand and then inputting the score from the calculator into the overall WikiWork score page, the number of articles page, or the relative WikiWork score page. The table generator page used values from those three pages. So to answer your question, yes you do need to update one of those WikiWork pages for it to show up in the project assessment table. (Anyone who is looking at reviving User:Theo's Little Bot could also use this info). Icebob99 (talk) 16:15, 19 December 2016 (UTC)[reply]
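For anyone doing this by hand in the meantime, the arithmetic itself is small; the sketch below assumes the usual WikiWork weighting (FA = 0 through Stub = 6), which should be checked against the WP 1.0 documentation.

# Assumed WikiWork weights per assessment class (verify against the WP 1.0 docs).
WEIGHTS = {'FA': 0, 'A': 1, 'GA': 2, 'B': 3, 'C': 4, 'Start': 5, 'Stub': 6}

def wikiwork(class_counts):
    """Return (total WikiWork, relative WikiWork) from per-class article counts."""
    total = sum(WEIGHTS[cls] * n for cls, n in class_counts.items())
    articles = sum(class_counts.values())
    relative = round(total / articles, 3) if articles else 0
    return total, relative

# Example: 10 FAs, 200 Start-class and 300 Stub-class articles
# -> wikiwork({'FA': 10, 'Start': 200, 'Stub': 300}) == (2800, 5.49)
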
@Icebob99: I can't thank you enough for guiding me in generating the score and updating them. The project templates are finally reflecting the current status. —IB [ Poke ] 04:36, 20 December 2016 (UTC)[reply]
It's my pleasure just as much as yours! Although, it would be nice to have the bot do it rather than individual WikiProject editors... Icebob99 (talk) 04:39, 20 December 2016 (UTC)[reply]
@1989:, I see that you are active on this page, may I ask you to please look through this request once? —IB [ Poke ] 07:36, 20 December 2016 (UTC)[reply]

@IndianBio and Icebob99: The source code is available, and I might be able to take over, especially because of the recent m:Requests for comment/Abandoned Labs tools discussion. Would that be useful? Dat GuyTalkContribs 17:26, 20 December 2016 (UTC)[reply]

@DatGuy: That would be great! I'm not too knowledgeable in the world of bots, but I think that this is one of those where it gets running and goes on for a long time. Thanks! Icebob99 (talk) 18:01, 20 December 2016 (UTC)[reply]
@DatGuy: thanks for your response, did you have any progress with the bot's functionality? Sorry for asking. —IB [ Poke ] 16:11, 26 December 2016 (UTC)[reply]
No problem at all. The "committee" that should oversee the take-overs isn't actually created yet. I'll try and test it on my own computer and fix any minor bugs related to new updates before I start a BRFA. However, the code will still be private to respect Theo's wishes. Dat GuyTalkContribs 10:45, 27 December 2016 (UTC)[reply]
Hi, is there any progress? I'm not familiar with how long bots take to fix (you folks do some arcane sorcery), so I might be asking preemptively. Icebob99 (talk) 03:27, 19 January 2017 (UTC)[reply]
Coding.... Dat GuyTalkContribs 11:41, 1 February 2017 (UTC)[reply]

BRFA filed. Dat GuyTalkContribs 17:09, 1 February 2017 (UTC)[reply]

Thanks! WikiWork works again! Icebob99 (talk) 14:19, 13 February 2017 (UTC)[reply]

MarkAdmin.js

Hello.

I would like to transfer the following script to Wikipedia so users such as myself could identify which users belong to the following groups:

  • Administrators (by default)
  • Bureaucrats (by default)
  • Checkusers (by default)
  • Oversighters (by default)
  • ARBCOM Members (optional)
  • OTRS Members (optional)
  • Edit Filter Managers (optional)
  • Stewards (optional)

https://commons.wikimedia.org/wiki/MediaWiki:Gadget-markAdmins.js

I would like a bot to frequently update the list to make the information accurate.

https://commons.wikimedia.org/wiki/MediaWiki:Gadget-markAdmins-data.js 1989 (talk) 19:30, 18 December 2016 (UTC)[reply]
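The data list itself can be regenerated from the API's allusers query; a rough Python sketch follows. The group names are the standard local MediaWiki ones (stewards are a global group and would need list=globalallusers on Meta instead), and the output structure of Gadget-markAdmins-data.js should be copied from the Commons original rather than from this sketch.

import json
import requests

API = 'https://en.wikipedia.org/w/api.php'
GROUPS = ['sysop', 'bureaucrat', 'checkuser', 'oversight']

def users_in_group(group):
    """Return every username in a local MediaWiki user group via list=allusers."""
    users, params = [], {'action': 'query', 'list': 'allusers', 'augroup': group,
                         'aulimit': 'max', 'format': 'json'}
    while True:
        data = requests.get(API, params=params).json()
        users += [u['name'] for u in data['query']['allusers']]
        if 'continue' not in data:
            return users
        params.update(data['continue'])

data = {group: users_in_group(group) for group in GROUPS}
# The bot would reformat this into whatever structure the gadget's data page
# (MediaWiki:Gadget-markAdmins-data.js) expects, then save it there.
print(json.dumps(data, indent=2))
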

Add importScript('User:Amalthea/userhighlighter.js'); to your "skin".js file to show admins Ronhjones  (Talk) 21:27, 3 January 2017 (UTC)[reply]

add birthdate and age to infoboxes

Here's a thought... How about a bot to add {{birth date and age}}/{{death date and age}} templates to biography infoboxes that just have plain text dates? --Zackmann08 (Talk to me/What I been doing) 18:13, 20 December 2016 (UTC)[reply]

These templates provide the dates in microformat, which follows ISO 8601. ISO 8601 only uses the Gregorian calendar, but many birth and death dates in Wikipedia use the Julian calendar. A bot can't distinguish which is which, unless the date is after approximately 1924, so this is not an ideal task to assign to a bot. (Another problem is that if the birth date is Julian and the death date is Gregorian the age computation could be wrong.) Jc3s5h (talk) 19:07, 20 December 2016 (UTC)[reply]
@Jc3s5h: that is a very valid point... One thought, the bot could (at least initially) focus on only people born after 1924 (or whichever year is decided). --Zackmann08 (Talk to me/What I been doing) 19:13, 20 December 2016 (UTC)[reply]
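A rough sketch of the conversion with that post-1924 guard, using mwparserfromhell; it only recognises plain "day Month year" values and only touches |birth_date=, so it illustrates the guard rather than a full implementation.

import re
import mwparserfromhell

MONTHS = ['January', 'February', 'March', 'April', 'May', 'June', 'July',
          'August', 'September', 'October', 'November', 'December']
DMY = re.compile(r'^(\d{1,2}) (%s) (\d{4})$' % '|'.join(MONTHS))

def convert_birth_date(wikitext):
    """Wrap plain-text |birth_date= values in {{birth date and age}} (post-1924 only)."""
    code = mwparserfromhell.parse(wikitext)
    for tpl in code.filter_templates():
        if not str(tpl.name).strip().lower().startswith('infobox'):
            continue
        if not tpl.has('birth_date'):
            continue
        value = str(tpl.get('birth_date').value).strip()
        m = DMY.match(value)
        if not m:
            continue  # already templated, empty, or an unrecognised format
        day, month, year = m.group(1), MONTHS.index(m.group(2)) + 1, int(m.group(3))
        if year <= 1924:
            continue  # possibly a Julian-calendar date; leave for a human
        tpl.add('birth_date', '{{birth date and age|%d|%d|%s|df=y}}' % (year, month, day))
    return str(code)
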
Without comment on feasibility, I support this as useful for machine-browsing. The ISO 8601 format is useful even if the visual output of the page doesn't change. ~ Rob13Talk 08:22, 30 December 2016 (UTC)[reply]

I'll go for it. I am filing a BRFA after my wikibreak. -- Magioladitis (talk) 08:24, 30 December 2016 (UTC)[reply]

  • When I open the edit window, I just see a bunch of template clutter, so I would like to understand what the template is used for, who on WP uses it, and specifically what the purpose of microformat dates is; it strikes me that the infoboxes are sufficiently well labelled for any party to pull date metadata off them without recourse to additional templates. -- Ohc ¡digame! 23:13, 14 February 2017 (UTC)[reply]

Can a useful bot be taken over and repaired?

(Was posted at WP:VPT; user:Fastily suggested posting here if there were no takers)
User:Theopolisme is fairly inactive (last edit May). He made User:Theo's Little Bot. Of late the bot has not been behaving very well on at least one of its tasks (Task 1 - reduction of non-free images in Category:Wikipedia non-free file size reduction requests). It typically starts at 06:00 and will drop out usually within a minute or two (although occasionally one is lucky and it runs for half an hour). Messages on talk pages and GitHub failed to reach the user. User:Diannaa and I both sent e-mails, and Diannaa did get a reply - he is very busy elsewhere, and hopes to maybe look over Xmas... In view of the important work it does, Diannaa suggested I ask at WP:VPT if there was someone who could possibly take the bot over? NB: See also Wikipedia:Bot requests#Update WikiWork factors Ronhjones  (Talk) 19:44, 25 December 2016 (UTC)[reply]

Now this should be a simple task. Doing... Dat GuyTalkContribs 12:39, 27 December 2016 (UTC)[reply]
@DatGuy: FWIW, I'm very rusty on python, but I tried running the bot off my PC (with all saves disabled of course), and the only minor error I encountered was resizer_auto.py:49: DeprecationWarning: page.edit() was deprecated in mwclient 0.7.0 and will be removed in 0.9.0, please use page.text() instead.. I did note that the log file was filling up, maybe after so long unattended, the log file is too big. Ronhjones  (Talk) 16:24, 28 December 2016 (UTC)[reply]
Are you sure? See [2]. When it tries to upload it, the file is corrupted. However, the file is fine on my local machine. Can you test it on the file? Feel free to use your main account, I'll ask to make it possible for you to upload files. As a side note, could you join ##datguy connect so we can talk more easily (text, no voice). Thanks. Dat GuyTalkContribs 16:33, 28 December 2016 (UTC)[reply]
Well just reading the files is one thing, writing them back is a whole new ball game! Commented out the "theobot.bot.checkpage" bit, changed en.wiki to test.wiki (2 places), managed to login OK, then it goes bad - see User:Ronhjones/Sandbox2 for screen grab. And every run adds two lines to my "resizer_auto.log" on the PC. Bit late now for any more. Ronhjones  (Talk) 01:44, 29 December 2016 (UTC)[reply]
Ah, just spotted the image files in the PC directory - 314x316 pixels, perfect sizing. Does that mean the bot's directory is filling up with thousands of old files? Just a thought. Ronhjones  (Talk) 01:49, 29 December 2016 (UTC)[reply]
See for yourself :). Weird thing for me is, I can upload it manually from the API sandbox on testwiki just fine. When the bot tries to do it via coding? CORRUPT! Dat GuyTalkContribs 10:28, 30 December 2016 (UTC)[reply]
25 GB of temp image files!! - is there a size limit per user on that server? Somewhere (in the back of my mind - I know not where - trouble with getting old..., and I could be very wrong) I read he was using a modified mwclient... My PC fails when it hits the line site.upload(open(file), theimage, "Reduce size of non-free image... and drops to the error routine. I tried to look up the syntax of that command (not a lot of documentation) and it does not seem to fully agree with his format. Ronhjones  (Talk) 23:29, 30 December 2016 (UTC)[reply]
OTOH, I just looked at the test image, have you cracked it? Ronhjones  (Talk) 23:31, 30 December 2016 (UTC)[reply]
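For comparison while debugging, this is roughly what the resize-and-reupload step looks like with Pillow plus mwclient's documented upload(file, filename, description) form; the pixel target and the edit summary are assumptions, not values from Theo's code.

import mwclient
from PIL import Image

MAX_PIXELS = 300000  # rough non-free size target; not taken from Theo's code

def reduce_and_reupload(site, filename, local_path):
    """Scale a local copy of the file down and re-upload it over the original."""
    img = Image.open(local_path)
    scale = (MAX_PIXELS / float(img.width * img.height)) ** 0.5
    if scale >= 1:
        return  # already small enough
    img = img.resize((int(img.width * scale), int(img.height * scale)))
    img.save(local_path)
    with open(local_path, 'rb') as f:
        # ignore=True suppresses the "replacing an existing file" warning
        site.upload(f, filename, 'Bot: reduce size of non-free image', ignore=True)

# site = mwclient.Site('en.wikipedia.org')
# site.login('BotAccount', 'botpassword')
# reduce_and_reupload(site, 'Example non-free cover.jpg', '/tmp/example.jpg')
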

BRFA filed. Dat GuyTalkContribs 09:19, 1 January 2017 (UTC)[reply]

@DatGuy: And approved I see - Is it now running? I'll stop the original running. I see it was that "open" statement that was the issue I had! Ronhjones  (Talk) 00:34, 3 January 2017 (UTC)[reply]

Autoassess redirects

A bot that patrols articles and reassesses WikiProject banners when an article has been redirected. (As far as I can tell, this doesn't exist.) It's an easy place to save editor patrol time. I would suggest that such a bot remove the class/importance parameters altogether (rather than assessing as |class=Redirect) because then the template itself will (1) autoassess to redirect as necessary, and (2) autoassess to "unassessed" (needing editor attention) if/when the redirect is undone. But bot assistance in that first step should be uncontroversial maintenance. Alternatively, the bot could remove WP banners when the project doesn't assess redirects, though I think the better case would be to leave them (let the project banners autoassess as "N/A") rather than not having the page tracked. I am no longer watching this page—ping if you'd like a response czar 14:32, 6 January 2017 (UTC)[reply]
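A minimal sketch of the core step, assuming the redirect-age check discussed in the replies is handled elsewhere; banner detection here is simplified to "any talk-page template with a non-empty |class=", and importance is deliberately left untouched.

import pywikibot
import mwparserfromhell

site = pywikibot.Site('en', 'wikipedia')

def blank_class_params(article_title):
    """If the article is a redirect, blank |class= in its talk-page banners."""
    article = pywikibot.Page(site, article_title)
    if not article.isRedirectPage():
        return
    talk = article.toggleTalkPage()
    if not talk.exists():
        return
    code = mwparserfromhell.parse(talk.text)
    changed = False
    for tpl in code.filter_templates():
        # Simplification: treat any template with a filled |class= as a project banner.
        if tpl.has('class') and str(tpl.get('class').value).strip():
            tpl.add('class', '')  # the banner then autoassesses the redirect itself
            changed = True
    if changed:
        talk.text = str(code)
        talk.save(summary='Bot: clearing class assessment on a redirected article')
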

I have some code for this. Czar and Izno, how do you think the bot should react when the talk page has banners with class set to redirect and importance set to some value? I think there's some value to keeping the importance parameter, but it may be unimportant in the long run. Enterprisey (talk!) 21:01, 19 January 2017 (UTC)[reply]
If the page has been redirected for a week, I'd consider it uncontroversial to wipe both quality and importance parameters, which both should be reassessed by a human if/when the article is restored. (The WikiProject template automatically recategorizes when the redirect is removed for a human to do this.) I see two cases, though, (1) updating the WikiProject templates when the redirecting editor does not, and (2) removing manual assessments as "Redirects" to let the template autoassess on its own. There could be issues with the latter, so I'd focus on the former case, which is the most urgent. I would think some kind of widespread input would be needed for the latter, considering how some project may desire manual assessments across the board and/or keeping their importance params on redirects, for whatever reason. Thanks for your work! Looking forward to the bot. czar 21:06, 19 January 2017 (UTC)[reply]
Perhaps the bot could leave a comment; something like <!-- EnterpriseyBot reassessed this from High Start to no parameters -->. Articles which are un-redirected would then give the editor an opportunity to review the old classification. --Izno (talk) 13:11, 20 January 2017 (UTC)[reply]
I'm strongly, strongly opposed to wiping importance. Some projects may use that to identify targets needing creation which are currently redirects. This should be a project-level decision. As for removing classes, that's uncontroversial. Not sure how you're implementing this, but I'd suggest you want articles that have been redirects for at least 48 hours to avoid wiping classes from articles which are blanked and redirected by vandals, etc. ~ Rob13Talk 05:01, 21 January 2017 (UTC)[reply]
At the moment, the bot skips anything that hasn't been a redirect for a week. I agree that the importance parameter shouldn't be affected by whatever the page contains, so at the moment it isn't touched. The BRFA might also be a good place to discuss this. Enterprisey (talk!) 01:57, 22 January 2017 (UTC)[reply]
BRFA filed Enterprisey (talk!) 02:45, 22 January 2017 (UTC)[reply]
And marking this as Y Done, as I'm continuing to run the task. Enterprisey (talk!) 20:16, 10 February 2017 (UTC)[reply]

Copy coordinates from lists to articles

Virtually every one of the 3000-ish places listed in the 132 sub-lists of National Register of Historic Places listings in Virginia has an article, and with very few exceptions, both lists and articles have coordinates for every place, but the source database has lots of errors, so I've gradually been going through all the lists and manually correcting the coords. As a result, the lists are a lot more accurate, but because I haven't had time to fix the articles, tons of them (probably over 2000) now have coordinates that differ between article and list. For example, the article about the John Miley Maphis House says that its location is 38°50′20″N 78°35′55″W, but the manually corrected coords on the list are 38°50′21″N 78°35′52″W. Like most of the affected places, the Maphis House has coords that differ only a small bit, but (1) ideally there should be no difference at all, and (2) some places have big differences, and either we should fix everything, or we'll have to have a rather pointless discussion of which errors are too little to fix.

Therefore, I'm looking for someone to write a bot to copy coords from each place's NRHP list to the coordinates section of {{infobox NRHP}} in each place's article. A few points to consider:

  • Some places span county lines (e.g. bridges over border streams), and in many of these cases, each list has separate coordinates to ensure that the marked location is in that list's county. For an extreme example, Skyline Drive, a scenic 105-mile-long road, is in eight counties, and all eight lists have different coordinates. The bot should ignore anything on the duplicates list; this is included in citation #4 of National Register of Historic Places listings in Virginia, but I can supply a raw list to save you the effort of distilling a list of sites to ignore.
  • Some places have no coordinates in either the list or the article (mostly archaeological sites for which location information is restricted), and the bot should ignore those articles.
  • Some places have coordinates only in the list or only in the article's {{Infobox NRHP}} (for a variety of reasons), but not in both. Instead of replacing information with blanks or blanks with information, the bot should log these articles for human review.
  • Some places might not have {{infobox NRHP}}, or in some cases (e.g. Newport News Middle Ground Light) it's embedded in another infobox, and the other infobox has the coordinates. If {{infobox NRHP}} is missing, the bot should log these articles for human review, while embedded-and-coordinates-elsewhere is covered by the previous bullet.
  • I don't know if this is the case in Virginia, but in some states we have a few pages that cover more than one NRHP-listed place (e.g. Zaleski Mound Group in Ohio, which covers three articles); if the bot produced a list of all the pages it edits, a human could go through the list, find any entries with multiple appearances, and check them for fixes.
  • Finally, if a list entry has no article at all, don't bother logging it. We can use WP:NRHPPROGRESS to find what lists have redlinked entries.

No discussion has yet been conducted for this idea; it's just something I've thought of. I've come here basically just to see if someone's willing to try this route, and if someone says "I think I can help", I'll start the discussion at WT:NRHP and be able to say that someone's happy to help us. Of course, I wouldn't ask you actually to do any coding or other work until after consensus is reached at WT:NRHP. Nyttend (talk) 00:55, 16 January 2017 (UTC)[reply]
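To give a sense of the per-article step, a rough sketch follows; parsing the county lists themselves is the real work and is not shown, the infobox parameter name and {{coord}} formatting are assumptions to check against {{Infobox NRHP}}'s documentation, and the "log for human review" cases simply return a string here.

import pywikibot
import mwparserfromhell

site = pywikibot.Site('en', 'wikipedia')

def sync_coords(article_title, lat, lon):
    """Copy human-checked list coordinates into the article's {{Infobox NRHP}}."""
    page = pywikibot.Page(site, article_title)
    code = mwparserfromhell.parse(page.text)
    boxes = [t for t in code.filter_templates()
             if str(t.name).strip().lower() == 'infobox nrhp']
    if not boxes:
        return 'no Infobox NRHP (missing or embedded): ' + article_title
    box = boxes[0]
    if not box.has('coordinates') or not str(box.get('coordinates').value).strip():
        return 'coordinates missing on one side: ' + article_title
    # Parameter name and {{coord}} parameters are assumptions; check the infobox docs.
    box.add('coordinates', '{{coord|%s|%s|type:landmark|display=inline,title}}' % (lat, lon))
    page.text = str(code)
    page.save(summary='Bot: syncing coordinates with the human-checked NRHP county list')
    return 'updated: ' + article_title
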

Off-topic discussion
I'm a WikiProject NRHP member and I'd like to support what Nyttend is getting at. I support anyone considering Nyttend's question directly, but want to ask about a variation. Note, it's kind of unfortunate though that the source of coordinates is not identified by WikiProject NRHP editors, neither originally (when the source was probably the NRIS database) nor now. (Marking source of coordinates, going forward, is under discussion at Wikipedia talk:WikiProject National Register of Historic Places#Coordinates conversions, and should we be footnoting coordinates?) Perhaps what Nyttend is getting at, and more, could be done by a bot which would make a three-way comparison of coordinates in A) individual articles to B) coordinates in NRHP county list-articles to C) coordinates in the downloadable NRIS database. The NRIS database is the original source of most of the coordinates that Nyttend has painstakingly improved upon, for places in Virginia. I believe that they have gone through Virginia carefully and that, wherever they have changed coordinates in the (B) county list-articles, they have done so well. In other states it is much more random, and the coordinates might have been improved in an individual article OR in the county list-article. I personally have fixed coordinates in individual articles (A) but not in list-articles (B), working the opposite way from how Nyttend has done. Could a bot be programmed to make a three-way comparison? If A and B are the same as C, then mark them as being sourced from NRIS. If the state is Virginia, and just one out of A and B is different from C, then accept the change at the other place too and mark both A and B as being sourced by Nyttend's evaluation (using {{NRHPcoord}}) with an "improvedby=Nyttend" parameter. If both A and B are different from C, then mark them as discrepancies (using template NRHPcoord with some suitable parameter). If either A or B already has been marked as improved, then improve the other one and copy the sourcing over. If the (C) NRIS coordinates cannot be found for a given site, then mark something else. I wonder, is it possible for someone to consider running this kind of three-way comparison (and would that be easier/better)? --doncram 02:52, 23 January 2017 (UTC)
No, any three-way comparison is a big distraction. What we need is a bot that will copy human-checked coordinates from lists to articles (with exceptions to be provided by me) and nothing else; we can worry about the other stuff at another time. Nyttend (talk) 15:38, 23 January 2017 (UTC)[reply]
What about the coordinates in Virginia lists that were not improved or verified, though? I checked two lists and see that Nyttend changed all 8 sets of coordinates in one, and changed 1 out of 3 sets of coordinates in another. And what about coordinates in individual articles that were improved by another editor? (I don't know how many of these exist, but there will be some within the articles for the 2,995 Virginia NRHP sites.) I think bot editing has to be restricted to cases where the edit will clearly be making an improvement.
A lesser task would be if a bot could mark, using template:NRHPcoord, the specific coordinates in Virginia lists that Nyttend changed recently. If a bot can examine edits and see that the coordinates were changed by Nyttend, that would still be helpful. --doncram 17:03, 25 January 2017 (UTC)
I have confirmed coordinates for every site in the state, aside from a few for which I did not have information, and I logged all of those. Most items on which I changed nothing are items in which the original coordinates were already correct; aside from the items I logged, there's no possibility of the current coordinates being wrong, unless I made a typo or misread a map or something like that. The bot shouldn't worry about whether I changed anything. Nyttend (talk) 20:12, 27 January 2017 (UTC)[reply]

Bot to help with FA/GA nomination process

The process is as follows (pasted from the FA nomination page):

Before nominating an article, ensure that it meets all of the FA criteria and that peer reviews are closed and archived. The featured article toolbox (at right) can help you check some of the criteria. Place a substituted {{FAC}} at the top of the talk page of the nominated article and save the page. From the FAC template, click on the red "initiate the nomination" link or the blue "leave comments" link. You will see pre-loaded information; leave that text. If you are unsure how to complete a nomination, please post to the FAC talk page for assistance. Below the preloaded title, complete the nomination page, sign with ~~~~ and save the page.

Copy this text: Wikipedia:Featured article candidates/name of nominated article/archiveNumber (substituting Number), and edit this page (i.e., the page you are reading at the moment), pasting the template at the top of the list of candidates. Replace "name of ..." with the name of your nomination. This will transclude the nomination into this page. In the event that the title of the nomination page differs from this format, use the page's title instead.

Maybe a bot could automate that process? Thanks. 47.17.27.96 (talk) 13:08, 16 January 2017 (UTC)[reply]
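For what it's worth, only the purely mechanical parts of the process above could be scripted; a minimal sketch against the MediaWiki action API (assuming an already-logged-in requests.Session and a CSRF token, and leaving the nomination statement itself to the human nominator, which is the part a bot cannot supply):
<syntaxhighlight lang="python">
import requests

API = "https://en.wikipedia.org/w/api.php"

def start_fac_nomination(session, token, article, statement, archive_no=1):
    nom_page = f"Wikipedia:Featured article candidates/{article}/archive{archive_no}"
    # 1. Substitute {{FAC}} at the top of the nominated article's talk page.
    session.post(API, data={
        "action": "edit", "title": f"Talk:{article}",
        "prependtext": "{{subst:FAC}}\n", "summary": "Nominating for FA",
        "token": token, "format": "json"})
    # 2. Create the nomination subpage with the nominator's statement.
    session.post(API, data={
        "action": "edit", "title": nom_page, "text": statement,
        "summary": "Creating FAC nomination page", "token": token,
        "format": "json"})
    # 3. Transcluding the subpage at the top of WP:FAC would still need a
    #    read-modify-write of that page; omitted here.
</syntaxhighlight>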

This was apparently copied here from WP:VPT; the original is here. --Redrose64 🌹 (talk) 21:34, 16 January 2017 (UTC)[reply]
I think that at WP:VPT, the IP was directed here - see here. TheMagikCow (talk) 19:40, 17 January 2017 (UTC)[reply]
There is some information that is required from the user, both with the FAC and GAN templates, that can't be inferred by a bot but requires human decision making. I don't think this would be that useful or feasible. BlueMoonset (talk) 21:35, 22 January 2017 (UTC)[reply]

Bot for category history merges

Back in the days when the facility to move category pages wasn't available, Cydebot made thousands of cut-and-paste moves to rename categories per CFD discussions. In the process of the renames, a new category would be created under the new name by the bot, with the edit summary indicating that it was "Moved from CATEGORYOLDNAME" and identifying the editors of the old category to account for the attribution. An example is here.

This method of preserving attribution is rather crude and so it is desirable that the complete editing history of the category page be available for attribution. The process of recovering the deleted page histories has since been taken on by Od Mishehu who has performed thousands of history merges.

I suggest that an adminbot be set up to go through Cydebot's contribs log, identify the categories that were created by it (i.e., the first edit on the page should be by Cydebot) and

  1. undelete the category mentioned in Cydebot's edit summary,
  2. history-merge it into the new category using Special:MergeHistory, and
  3. delete the left-over redirect under CSD G6.

This bot task is not at all controversial. This is just an effort to fill in missing page histories. Obviously, there would be no cases of any parallel histories encountered - and even if there were, it wouldn't be an issue since Special:MergeHistory cannot be used for merging parallel histories - which is to say that there is no chance of any unintended history mess-up. This should be an easy task for a bot. 103.6.159.72 (talk) 10:52, 18 January 2017 (UTC)[reply]
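An illustrative sketch of the three steps above against the MediaWiki action API (the Undelete, Mergehistory and Delete modules); it assumes an admin bot account already logged in through a requests.Session, and leaves out error handling, throttling and the edit summaries a BRFA would actually specify:
<syntaxhighlight lang="python">
import requests

API = "https://en.wikipedia.org/w/api.php"

def restore_and_merge(session, old_cat, new_cat):
    token = session.get(API, params={
        "action": "query", "meta": "tokens", "type": "csrf",
        "format": "json"}).json()["query"]["tokens"]["csrftoken"]
    # 1. Undelete the old category named in Cydebot's edit summary.
    session.post(API, data={
        "action": "undelete", "title": old_cat,
        "reason": f"Restoring history for merge into [[{new_cat}]]",
        "token": token, "format": "json"})
    # 2. History-merge it into the new category. Special:MergeHistory only
    #    moves revisions older than the destination's earliest edit.
    session.post(API, data={
        "action": "mergehistory", "from": old_cat, "to": new_cat,
        "reason": "Merging page history after CFD rename",
        "token": token, "format": "json"})
    # 3. Delete the left-over redirect under CSD G6.
    session.post(API, data={
        "action": "delete", "title": old_cat,
        "reason": "[[WP:CSD#G6|G6]]: cleanup after history merge",
        "token": token, "format": "json"})
</syntaxhighlight>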

There's one thing that I have overlooked above, though it is again not a problem. In some rare cases, it may occur that after the source page has been moved to the destination page, the source page may later have been recreated - either as a category redirect or as a real category. In such cases, just skip step #3 in the procedure described above. There will be edits at the source page that postdate the creation of the destination page, and hence by its design, Special:MergeHistory will not move these edits over - only the old edits that the bot has undeleted would be merged. (It may be noted that the MergeHistory extension turns the source page into a redirect only when all edits at the source are merged into the destination page, which won't be the case here - this means that the source page that someone recreated will remain intact.) It is all that simple. 103.6.159.72 (talk) 19:37, 18 January 2017 (UTC)[reply]
Is this even needed? I would think most if not all edits to category pages do not pass the threshold of originality to get copyright in the first place. Our own guidelines on where attribution is not needed reinforce this notion under US law, stating that duplicating material by other contributors that is sufficiently creative to be copyrightable under US law (as the governing law for Wikipedia) requires attribution. That same guideline also mentions that a list of authors in the edit summary is sufficient for proper attribution, which is what Cydebot has been doing for years. Avicennasis @ 21:56, 20 Tevet 5777 / 21:56, 18 January 2017 (UTC)[reply]
Cydebot doesn't do it any longer. Since 2011 or sometime in 2015, Cydebot renames cats by actually moving the page. So for the sake of consistency, we could do this for the older cats also. The on-wiki practise, for a very long time, has been to do a history merge wherever it is technically possible. The guideline that an edit summary is sufficient attribution is quite dated and something that's hardly ever followed. It's usually left as a worst-case option where a histmerge is not possible. History merge is the preferred method of maintaining attribution. Some categories like Category:Members of the Early Birds of Aviation do have some descriptive creative content. 103.6.159.72 (talk) 02:21, 19 January 2017 (UTC)[reply]
I'm not completely opposed to this, but I do think that we need to define which category pages are in scope for this. I suspect the vast majority of pages wouldn't need attribution, and we should be limiting the number of pointless bot edits. Avicennasis @ 02:49, 21 Tevet 5777 / 02:49, 19 January 2017 (UTC)[reply]
It wasn't 2011 (it can't have been, since the ability to move category pages wasn't available to anybody until 22 May 2014, possibly slightly later, but certainly no earlier). Certainly Cydebot was still making cutpaste moves when I raised this thread on 14 June 2014; raised this thread; and commented on this one. These requests took some months to be actioned: checking Cydebot's move log, I find that the earliest true moves of Category: pages that were made by that bot occurred on 26 March 2015. --Redrose64 🌹 (talk) 12:07, 19 January 2017 (UTC)[reply]
Since we are already talking about using a bot, I think it makes sense to do them all (or else none at all) since that would come at no extra cost. Cherry-picking which ones the bot should do is just a waste of human editors' time. The edits won't be completely "pointless" - it's good to be able to see full edit histories. Talking of pointless edits, I should remind people that there are bots around that perform hundreds of thousands of pointless edits. 103.6.159.84 (talk) 16:14, 19 January 2017 (UTC)[reply]
As to when it became technically possible, I did it on May 26, 2014. עוד מישהו Od Mishehu 05:32, 20 January 2017 (UTC)[reply]
~94,899 pages, by my count. Avicennasis @ 03:36, 23 Tevet 5777 / 03:36, 21 January 2017 (UTC)[reply]
That should keep a bot busy for a week or more. The Usercontribs module pulls the processing queue. Here's the setup in the API sandbox. Click "make request" to see the results of a query to get the first three. Though I've never written an admin-bot before, I may take a stab at this within the next several days. – wbm1058 (talk) 04:28, 21 January 2017 (UTC)[reply]
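For reference, the same query in Python with requests (equivalent to the API-sandbox setup linked above: the first three Category: pages created by Cydebot; continuation handling omitted):
<syntaxhighlight lang="python">
import requests

API = "https://en.wikipedia.org/w/api.php"

resp = requests.get(API, params={
    "action": "query", "list": "usercontribs",
    "ucuser": "Cydebot", "ucnamespace": 14,   # Category:
    "ucshow": "new",                          # page creations only
    "ucdir": "newer", "uclimit": 3, "format": "json"}).json()
for contrib in resp["query"]["usercontribs"]:
    print(contrib["timestamp"], contrib["title"], contrib.get("comment", ""))
</syntaxhighlight>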
The other major API modules to support this are Undelete, Mergehistory and Delete. This would be a logical second task for my Merge bot to take on. The PHP framework I use supports undelete and delete, but it looks like I'll need to add new functions for user-contribs and merge-history. In my RfA I promised to work the Wikipedia:WikiProject History Merge backlog, so it would be nice to take that off my back burner in a significant way. I'm hoping to leverage this into another bot task to clear some of the article-space backlog as well...
Coding... wbm1058 (talk) 13:06, 21 January 2017 (UTC)[reply]
My count is 89,894 pages. wbm1058 (talk) 00:58, 24 January 2017 (UTC)[reply]
@Wbm1058: Did you exclude the pages that have already been histmerged (by Od Mishehu and probably a few by other admins also)?— Preceding unsigned comment added by 103.6.159.67 (talkcontribs) 12:39, 24 January 2017 (UTC)[reply]
I was about to mention that. My next step is to check the deleted revisions for mergeable history. No point in undeleting if there is no mergeable history. Working on that now. – wbm1058 (talk) 14:40, 24 January 2017 (UTC)[reply]
Note this example of a past histmerge by Od Mishehu: Category:People from Stockport
Should this bot do that with its histmerges too? wbm1058 (talk) 21:51, 25 January 2017 (UTC)[reply]
Yes, when there is a list of users present (there were periods when the bot didn't do it, but most of the time it did). עוד מישהו Od Mishehu 22:24, 25 January 2017 (UTC)[reply]

Another issue: Sometimes, a category was renamed multiple times. For example, Category:Georgian conductors->Category:Georgian conductors (music)->Category:Conductors (music) from Georgia (country); this must also be supported for categories where the second rename was recent, e.g. Category:Visitor attractions in Washington (U.S. state)->Category:Visitor attractions in Washington (state)->Category:Tourist attractions in Washington (state). Back-and-forth renames must also be considered, for example, Category:Tornadoes in Hawaii->Category:Hawaii tornadoes->Category:Tornadoes in Hawaii; this also must be handled in cases where the second rename was recent, e.g. Category:People from San Francisco->Category:People from San Francisco, California->Category:People from San Francisco. עוד מישהו Od Mishehu 05:35, 26 January 2017 (UTC)[reply]

Od Mishehu, this is also something I noticed. I'm thinking the best way to approach this is to start with the oldest contributions, and then merge forward so the last merge would be into the newest, currently active, category. Is that the way you would manually do this? So I think I need to reverse the direction that I was processing this, and work forward from the oldest rather than backward from the newest. Category:Georgian conductors was created at 22:56, 23 June 2008 by a human editor; that's the first (oldest) set of history to merge. At 22:38, 7 June 2010 Cydebot moved Category:Conductors by nationality to Category:Conductors (music) by nationality per CFD at Wikipedia:Categories for discussion/Log/2010 May 24#Category:Conductors. At 00:12, 8 June 2010 Cydebot deleted page Category:Georgian conductors (Robot - Moving Category Georgian conductors to Category:Georgian conductors (music) per CFD at Wikipedia:Categories for discussion/Log/2010 May 24#Category:Conductors.) So we should restore both Category:Georgian conductors and Category:Georgian conductors (music) in order to merge the 5 deleted edits of the former into the history of the latter. The new category creation by Cydebot that would trigger this history restoration and merging is
  • 00:11, 8 June 2010 . . Cydebot (187 bytes) (Robot: Moved from Category:Georgian conductors. Authors: K********, E***********, O************, G*********, Cydebot)
However, if you look at the selection set I've been using, you won't find this new category creating edit: 8 June 2010 Cydebot contributions
It should slot in between these:
To find the relevant log item, I need to search the Deleted user contributions
I'm looking for the API that gets deleted user contributions. This is getting more complicated. – wbm1058 (talk) 16:38, 26 January 2017 (UTC)[reply]
OK, Deletedrevs can list deleted contributions for a certain user, sorted by timestamp. Not to be confused with Deletedrevisions. wbm1058 (talk) 17:18, 26 January 2017 (UTC)[reply]
After analyzing these some more, I think my original algorithm is fine. I don't think it should be necessary for the bot to get involved with the deleted user contributions. What this means is that only the most recent moves will be merged on the first pass, as my bot will only look at Cydebot's active contributions history. The first pass will undelete and merge the most recently deleted history, which will expose additional moves that my bot will see on its second pass through the contributions. I'll just re-run until my bot sees no more mergeable items. The first bot run will merge Category:Georgian conductors (music) into Category:Conductors (music) from Georgia (country). The second bot run will merge Category:Georgian conductors into Category:Conductors (music) from Georgia (country). The first bot run will merge Category:Visitor attractions in Washington (U.S. state) into Category:Tourist attractions in Washington (state), and there's nothing to do on the second pass (there is no mergeable history in Category:Visitor attractions in Washington (state)). The first pass would merge Category:Hawaii tornadoes into Category:Tornadoes in Hawaii – I just did that for testing. The second pass will see that Category:Tornadoes in Hawaii should be history-merged into itself. I need to check for such "self-merge" cases and report them (a "self-merge" is actually a restore of some or all of a page's deleted history)... I suppose I should be able to restore the applicable history (only the history that predates the page move). Category:People from San Francisco just needs to have the "self-merge" procedure performed, as Category:People from San Francisco, California has no mergeable history. Thanks for giving me these use-cases, very helpful.
I should mention some more analysis from a test run through the 89,893 pages in the selection set. 2369 of those had no deleted revisions, so I just skip them. HERE is a list of the first 98 of those. Of the remaining 87,524 pages, these 544 pages aren't mergeable, because the timestamp of the oldest edit isn't old enough, so I skip them too. Many of these have already been manually history-merged. That leaves 86,980 mergeable pages that my bot should history-merge on its first pass. An unknown number of additional merges to be done on the second pass, then hopefully a third pass will either confirm we're done or mop up any remaining – unless there are cats that have moved four times... wbm1058 (talk) 22:42, 26 January 2017 (UTC)[reply]
Some of the pages with no deleted revisions are the result of a category rename where the source category was changed into something else (a category redirect or disambiguation), and a history merge in those cases should be done (I just did one such merge, the third on the list of 99). However, this may be too difficult for a bot to handle; I can deal with those over time if you give me a full list. The first 2 on the list you gave are different - the bot didn't delete them (it usually did, but not always), and they were removed without deletion by Jc37 and used as new categories. I believe, based on the link to the CFD discussion at the beginning, that the answer to that would be in Wikipedia:Categories for discussion/Log/2015 January 1#Australian politicians. עוד מישהו Od Mishehu 05:34, 27 January 2017 (UTC)[reply]

This whole thing seems like a waste of time (why do we need to see old revisions of category pages that were deleted years ago?), but if you want to spend your time writing and monitoring a bot that does this, I won't complain; it won't hurt anything. I'm just concerned by the comments up above that point out a lot of not-so-straightforward cases, like the tornadoes in Hawaii and the visitor attractions in Washington. How will the bot know what information is important to preserve and what isn't? Nyttend (talk) 05:28, 27 January 2017 (UTC)[reply]

The reasons for it, in my opinion:
  1. While most categories have no copyrightable information, some do; on these, we legally need to maintain the history. While Cydebot did this well for categories which were renamed once, it didn't for categories which were renamed more than once. Do any of these have copyrightable information? It's impossible to know.
  2. If we nominate a category for deletion, we generally should inform its creator - even if the creation was over 10 years ago, as long as the creator is still active. With deleted history, it's difficult for a human admin to do this, and impossible for automated nomination tools (such as [[WP:TW|Twinkle]]) or non-admins.
עוד מישהו Od Mishehu 05:37, 27 January 2017 (UTC)[reply]
  1. Because writing a bot is fun, isn't it? As only programmers know. And especially if the bot's gonna perform hundreds of thousands of admin actions.
  2. Because m:wikiarchaeologists will go to any lengths to make complete editing histories of pages visible, even if it's quite trivial. Using a bot shows a far more moderate level of eccentricity than doing it manually would. Why do you think Graham87 imported thousands of old page revisions from nostwiki?
103.6.159.76 (talk) 08:59, 27 January 2017 (UTC)[reply]

I think it may be best to defer any bot processing of these on the first iteration of this. Maybe after a first successful run, we can come back and focus on an automated solution for these as well. It's still a lot to be left for manual processing. I'll work on the piece that actually performs the merges later today. – wbm1058 (talk) 13:49, 27 January 2017 (UTC)[reply]

@Wbm1058: For the pages that were copy-pasted without the source category being deleted, you can still merge them. Use of Special:MergeHistory ensures that only the edits that predate the creation of the destination category will be merged. 103.6.159.90 (talk) 08:32, 29 January 2017 (UTC)[reply]

BRFA filed I think this is ready for prime time. wbm1058 (talk) 01:17, 28 January 2017 (UTC)[reply]

Website suddenly took down a lot of its material, need archiving bot!

Per Wikipedia_talk:WikiProject_Academic_Journals#Urgent:_Beall.27s_list, several (if not most) links to https://scholarlyoa.com/ and subpages just went dead. Could a bot help with adding archive links to relevant citation templates (and possibly bare/manual links too)? Headbomb {talk / contribs / physics / books} 00:31, 19 January 2017 (UTC)[reply]

Cyberpower678, could you mark this domain as dead in IABot's database so that it will handle adding archive urls? — JJMC89(T·C) 01:13, 19 January 2017 (UTC)[reply]
@Cyberpower678: ? Headbomb {talk / contribs / physics / books} 10:50, 2 February 2017 (UTC)[reply]
Sorry, I never got the first ping. I'll mark it in a moment.—CYBERPOWER (Chat) 16:52, 2 February 2017 (UTC)[reply]
Only 61 urls were found in the DB with the domain.—CYBERPOWER (Chat) 17:39, 2 February 2017 (UTC)[reply]
@Cyberpower678: Well that's 61 urls that we needed! Would it be possible to have a list of those urls, or is that complicated? It would be really useful to project members to have those centralized in one place. Headbomb {talk / contribs / physics / books} 20:04, 13 February 2017 (UTC)[reply]
I would, but the DB is under maintenance right now.—CYBERPOWER (Be my Valentine) 20:06, 13 February 2017 (UTC)[reply]
I'll ping you next week then. Headbomb {talk / contribs / physics / books} 20:07, 13 February 2017 (UTC)[reply]
@Cyberpower678:. Headbomb {talk / contribs / physics / books} 21:00, 22 February 2017 (UTC)[reply]
The interface link will be made available soon, but...
  1. http://scholarlyoa.com/
  2. http://scholarlyoa.com/2012/09/05/two-print-journals-completely-hijacked-by-online-hoodlums/
  3. http://scholarlyoa.com/2012/11/30/criteria-for-determining-predatory-open-access-publishers-2nd-edition
  4. http://scholarlyoa.com/2012/11/30/criteria-for-determining-predatory-open-access-publishers-2nd-edition/
  5. http://scholarlyoa.com/2013/03/05/new-term-moamj-multidisciplinary-open-access-mega-journal
  6. http://scholarlyoa.com/2013/04/04/hindawis-profits-are-larger-than-elseviers/
  7. http://scholarlyoa.com/2013/04/09/the-epitome-of-predatory-publishers/#more-1525
  8. http://scholarlyoa.com/2013/07/16/recognizing-a-pattern-of-problems-in-pattern-recognition-in-physics/
  9. http://scholarlyoa.com/2013/08/01/article-level-metrics/
  10. http://scholarlyoa.com/2013/10/03/science/
  11. http://scholarlyoa.com/2013/11/05/i-get-complaints-about-frontiers/
  12. http://scholarlyoa.com/2013/11/21/index-copernicus-has-no-value/
  13. http://scholarlyoa.com/2014/01/02/list-of-predatory-publishers-2014/
  14. http://scholarlyoa.com/2014/02/11/bogus-new-impact-factor-appears/
  15. http://scholarlyoa.com/2014/02/18/chinese-publishner-mdpi-added-to-list-of-questionable-publishers/
  16. http://scholarlyoa.com/2014/02/18/chinese-publishner-mdpi-added-to-list-of-questionable-publishers/#comment-46115
  17. http://scholarlyoa.com/2014/03/06/is-the-editor-of-the-springer-journal-scientometrics-indifferent-to-plagiarism/#more-3197
  18. http://scholarlyoa.com/2014/05/02/red-alert-polish-scholarly-journal-is-hijacked/
  19. http://scholarlyoa.com/2014/07/08/cardiology-journals-decline-is-heartbreaking/
  20. http://scholarlyoa.com/2014/08/28/predatory-publisher-organizes-conference-using-same-name-as-legitimate-conference/
  21. http://scholarlyoa.com/2014/10/02/an-editorial-board-mass-resignation-from-an-open-access-journal/
  22. http://scholarlyoa.com/2014/10/14/the-scientific-world-journal-will-lose-its-impact-factor-again/
  23. http://scholarlyoa.com/2014/11/04/google-scholar-is-filled-with-junk-science/
  24. http://scholarlyoa.com/2014/11/20/bogus-journal-accepts-profanity-laced-anti-spam-paper/
  25. http://scholarlyoa.com/2014/12/16/the-chinese-publisher-scirp-scientific-research-publishing-a-publishing-empire-built-on-junk-science
  26. http://scholarlyoa.com/2014/12/18/the-omics-publishing-groups-empire-is-expanding/
  27. http://scholarlyoa.com/2015/01/08/anti-roundup-glyphosate-researchers-use-easy-oa-journals-to-spread-their-views/
  28. http://scholarlyoa.com/2015/01/20/did-a-romanian-researcher-successfully-game-google-scholar-to-raise-his-citation-count/
  29. http://scholarlyoa.com/2015/03/31/berkeley-california-based-journal-is-hijacked/
  30. http://scholarlyoa.com/2015/05/26/watch-out-for-publishers-with-nova-in-their-name/
  31. http://scholarlyoa.com/2015/06/11/guest-editing-a-special-issue-with-mdpi-evidences-of-questionable-actions-by-the-publisher/
  32. http://scholarlyoa.com/2015/07/21/chinese-journal-has-surprise-author-fee-but-gives-refund-if-you-cite-your-article-six-times/
  33. http://scholarlyoa.com/2015/07/30/is-scielo-a-publication-favela/
  34. http://scholarlyoa.com/2015/08/25/more-pseudo-science-from-swiss-chinese-publisher-mdpi/
  35. http://scholarlyoa.com/2015/12/22/jmir-publications-a-model-for-open-access-health-sciences-publishers/
  36. http://scholarlyoa.com/2016/01/19/another-respected-society-journal-victimized-by-title-thief/
  37. http://scholarlyoa.com/about/
  38. http://scholarlyoa.com/individual-journals/
  39. http://scholarlyoa.com/other-pages/hijacked-journals/
  40. http://scholarlyoa.com/publishers/
  41. https://scholarlyoa.com/2016/02/09/canadian-publisher-has-open-access-evil-twin/#more-6815
  42. https://scholarlyoa.com/2016/04/28/the-tr-master-journal-list-is-not-a-journal-whitelist/
  43. https://scholarlyoa.com/publishers/
  44. https://scholarlyoa.com/tag/fake-impact-factors/
  45. https://scholarlyoa.com/2016/04/19/oncotargets-peer-review-is-highly-questionable/
  46. https://scholarlyoa.com/individual-journals/
  47. https://scholarlyoa.com/2014/08/28/predatory-publisher-organizes-conference-using-same-name-as-legitimate-conference/
  48. https://scholarlyoa.com/2016/04/28/the-tr-master-journal-list-is-not-a-journal-whitelist//
  49. https://scholarlyoa.com/2015/01/08/anti-roundup-glyphosate-researchers-use-easy-oa-journals-to-spread-their-views/
  50. https://scholarlyoa.com/2014/12/16/the-chinese-publisher-scirp-scientific-research-publishing-a-publishing-empire-built-on-junk-science/
  51. https://scholarlyoa.com/2016/07/14/more-fringe-science-from-borderline-publisher-frontiers/#more-7720
  52. https://scholarlyoa.com/2016/02/09/canadian-publisher-has-open-access-evil-twin/
  53. https://scholarlyoa.com/2013/01/25/omics-predatory-meetings/
  54. https://scholarlyoa.com/2016/10/13/bogus-british-company-accredits-omics-conferences/#comment-425306
  55. https://scholarlyoa.com/2016/09/29/scam-publisher-omics-international-buying-legitimate-journals/
  56. https://scholarlyoa.com/2016/10/27/reviewer-to-frontiers-your-review-process-is-merely-for-show-i-quit/
The DB maintenance script merged some differently formatted URLs that were identical.
You can find them on the following articles:
  1. Sokal affair
  2. Vanity press
  3. Autocomplete
  4. Institute for Scientific Information
  5. Impact factor
  6. Google Scholar
  7. Epistemologia
  8. SciELO
  9. Hindawi Publishing Corporation
  10. Journal of Medical Internet Research
  11. Nova Science Publishers
  12. Redalyc
  13. MDPI
  14. Pulsus Group
  15. Scientometrics (journal)
  16. List of confidence tricks
  17. Abstract and Applied Analysis
  18. Allied Academies
  19. Journal of Natural Products
  20. The Scientific World Journal
  21. Corruption in Canada
  22. Frontiers in Bioscience
  23. Indian Journal of Pharmaceutical Sciences
  24. Bentham Science Publishers
  25. Index Copernicus
  26. Scientific Research Publishing
  27. Frontiers in Psychology
  28. Salahaddin Khalilov
  29. World Academy of Science, Engineering and Technology
  30. Open Access Scholarly Publishers Association
  31. K. K. Aggarwal
  32. Journal of Cosmology
  33. Canadian Journal of Gastroenterology & Hepatology
  34. Plastic Surgery (journal)
  35. Experimental & Clinical Cardiology
  36. Canadian Journal of Respiratory Therapy
  37. Academic journal publishing reform
  38. OMICS Publishing Group
  39. Frontiers Media
  40. Oncotarget
  41. Altmetrics
  42. Wulfenia (journal)
  43. Sylwan
  44. Polonnaruwa (meteorite)
  45. Predatory open access publishing
  46. Eddie Kohler
  47. Anatole Klyosov
  48. Who's Afraid of Peer Review?
  49. Jeffrey Beall
  50. Pattern Recognition in Physics
  51. Hijacked journal
  52. Mikhail Blagosklonny
  53. Mega journal
  54. International Journal of Advanced Computer Technology
  55. International Archives of Medicine
  56. Michael Silbermann
  57. The Veliger
  58. Stephanie Seneff
  59. Jan Vijg
  60. Chinese Chemical Letters
  61. Aging (journal)
  62. Emerging Sources Citation Index
  63. Neuropsychiatry (journal)
  64. Biomedical Research
  65. Future Medicine
  66. Imaging in Medicine
  67. International Journal of Pharma and Bio Sciences
  68. Predatory conference
  69. International Journal of Clinical Rheumatology
  70. Current Pediatric Research
  71. Clinical Practice
Cheers.—CYBERPOWER (Chat) 06:11, 24 February 2017 (UTC)[reply]

Apply editnotice to talk pages of pages in Category:Wikipedia information pages

Apply editnotice {{Wikipedia information pages talk page editnotice}} to talk pages of pages in Category:Wikipedia information pages, Per unopposed proposal here. (A change to common.js to produce this edit notice was suggested but rejected; a bot task was suggested as the best approach.) Thanks! —swpbT 16:15, 20 January 2017 (UTC)[reply]

(The other route for placing the edit notice is a default-on gadget, which hasn't been put to a discussion yet; bot ops considering doing this task may want to wait until that route has been considered and rejected as well.) Enterprisey (talk!) 18:58, 20 January 2017 (UTC)[reply]
I'm very hesitant to start applying edit notices en masse without some stronger consensus, especially with a bot. ~ Rob13Talk 04:58, 21 January 2017 (UTC)[reply]
To editor BU Rob13: Let another bot operator do it then. The proposal has sat in the appropriate location for 19 days, and the edit notice has appeared on every page currently in the category for 10 days, without a single voice of opposition. "Insufficient consensus" is simply not a valid reason to withhold action in this case. —swpbT 13:18, 23 January 2017 (UTC)[reply]
As these are project pages, and the common.js route has been practically killed, going forward with this as a one-off bot run shouldn't be too bad. If someone files a BRFA it should be able to go to trial quickly. NOTE: The OPERATOR will need to be an admin or template editor. — xaosflux Talk 03:00, 24 January 2017 (UTC)[reply]
This is not meant to be a one-off run – the pages currently in the category are already tagged with the edit notice. The request is for a bot to continuously put the notice on pages that are newly added to the category. —swpbT 15:10, 24 January 2017 (UTC)[reply]
@Xaosflux: Why does the operator need to be an admin or template editor? Everybody can edit Wikipedia information pages, right? PhilrocMy contribs 16:13, 24 January 2017 (UTC)[reply]
To edit the info page itself, yes, but to create editnotices in any namespace besides User/User talk requires the template editor right. —swpbT 18:41, 24 January 2017 (UTC)[reply]
@Swpb: I have made a request to be a template editor. PhilrocMy contribs 19:29, 24 January 2017 (UTC)[reply]
Yes, it is just for the editnotices; the bot's edits are the responsibility of the operator, and the bot will need this extra access to make these notices, so the operator will need to be trusted for that access as well. — xaosflux Talk 21:17, 24 January 2017 (UTC)[reply]
@Xaosflux: @Swpb: @BU Rob13: If the bot gets approved, how often should it run? It can't run continuously because, to my knowledge, AWB can't check if new pages are added to a category. Yes, I want to use AWB. Anyway, how often should the bot run? PhilrocMy contribs 16:02, 26 January 2017 (UTC)[reply]
AWB is not a good option for a bot that should be run automatically on a schedule. — JJMC89(T·C) 18:41, 27 January 2017 (UTC)[reply]
@JJMC89: I have now decided to use DotNetWikiBot instead of AWB. PhilrocMy contribs 20:41, 27 January 2017 (UTC)[reply]
@Xaosflux: @Swpb: @BU Rob13: @JJMC89: I have now made code for the bot which is viewable here. PhilrocMy contribs 22:27, 28 January 2017 (UTC)[reply]
That code is missing namespace restrictions (should only be for Wikipedia and Help pages in the category) and is creating the notice for the subject page instead of the talk page. Does input.text add text to any existing text? The bot needs to do this, not replace anything that is currently there. Also it should only add the editnotice if it is not already there. — JJMC89(T·C) 22:50, 28 January 2017 (UTC)[reply]
@JJMC89: I figured that since most of the pages in the category are Wikipedia and Help already, there was no need for any restrictions. Also, the code checks if each page has a corresponding {{Editnotices}} page. If that page exists, it checks if it is empty (which might happen). If it is, it adds the notice. If not, it goes to the next page. If the page doesn't exist in the first place, it creates the page by adding the editnotice. If you had looked at the code more thoroughly, I wouldn't have to explain this to you. Please look at the if statement inside the foreach if you want proof. PhilrocMy contribs 23:15, 28 January 2017 (UTC)[reply]
Your attempted ping did not work. Assuming there is no need for any restrictions is not a good idea. It is looking at the wrong editnotice page. It should be the editnotice page for the talk page of each page in the category. It shouldn't just skip editnotices that exist. It should append the one in this task if it is not present. — JJMC89(T·C) 23:58, 28 January 2017 (UTC)[reply]
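A sketch of the behaviour being asked for here (illustrative only; it assumes per-page editnotices live at Template:Editnotices/Page/&lt;full talk page title&gt;, and that get_text/save_text wrap whatever API client the bot actually uses):
<syntaxhighlight lang="python">
NOTICE = "{{Wikipedia information pages talk page editnotice}}"
TALK_OF = {"Wikipedia": "Wikipedia talk", "Help": "Help talk"}

def process(page_title, get_text, save_text):
    ns, _, rest = page_title.partition(":")
    if ns not in TALK_OF:                 # namespace restriction
        return
    notice_page = f"Template:Editnotices/Page/{TALK_OF[ns]}:{rest}"
    current = get_text(notice_page) or "" # "" if the editnotice page is missing
    if NOTICE in current:                 # already tagged - nothing to do
        return
    new_text = (current.rstrip() + "\n" if current.strip() else "") + NOTICE
    save_text(notice_page, new_text,
              summary="Adding talk page editnotice for information pages")
</syntaxhighlight>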
@JJMC89: I have made a new paste. PhilrocMy contribs 16:51, 30 January 2017 (UTC)[reply]
Well, the TE request failed. Guess I can't do the bot. Another admin or TE can make this proposal a reality if they want to. PhilrocMy contribs 16:17, 1 February 2017 (UTC)[reply]

BRFA filed. — JJMC89(T·C) 03:47, 2 February 2017 (UTC)[reply]

Thanks! —swpbT 15:08, 2 February 2017 (UTC)[reply]

Non-free images used excessively

I'd like a few reports, if anyone's able to generate them.

1) All images in Category:Fair use images, defined recursively, which are used outside of the mainspace.

2) All images in Category:Fair use images, defined recursively, which are used on more than 10 pages.

3) All images in Category:Fair use images, defined recursively, which are used on any page whose title does not appear anywhere in the text of the file description page. For example, "File:Image1.jpg" would be listed if it was used on the page "Abraham Lincoln" but the text "Abraham Lincoln" appeared nowhere on the file page.

If anyone can handle all or some of these, it would be much appreciated. Feel free to write to a subpage in my userspace. ~ Rob13Talk 20:29, 21 January 2017 (UTC)[reply]
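For report 3, a rough sketch of the per-file check (Python with requests against the action API; walking Category:Fair use images recursively and handling pagination are left out):
<syntaxhighlight lang="python">
import requests

API = "https://en.wikipedia.org/w/api.php"

def unexplained_usages(file_title):
    # Pages using the file...
    usage = requests.get(API, params={
        "action": "query", "list": "imageusage", "iutitle": file_title,
        "iulimit": "max", "format": "json"}).json()
    pages = [p["title"] for p in usage["query"]["imageusage"]]
    # ...and the wikitext of the file description page.
    desc = requests.get(API, params={
        "action": "query", "prop": "revisions", "rvprop": "content",
        "titles": file_title, "format": "json"}).json()
    page = next(iter(desc["query"]["pages"].values()))
    text = page["revisions"][0]["*"]
    # Report any using page whose title never appears in the description.
    return [p for p in pages if p not in text]
</syntaxhighlight>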

No bot needed for tasks 1 and 2:
  1. https://tools.wmflabs.org/betacommand-dev/nfcc/NFCC9.html
  2. https://tools.wmflabs.org/betacommand-dev/nfcc/high_use_NFCC.html
Task 3 was done by User:BetacommandBot (or User:FairuseBot, I think), but the bot and its master have since been blocked. I'd very much like to see this task being done by a bot. – Finnusertop (talkcontribs) 20:42, 21 January 2017 (UTC)[reply]

Replace Donald Trump image with presidential portrait

Per Talk:Donald_Trump/Archive_38#Trump_Photo_2_Rfc, the image File:Donald_Trump_August_19,_2015_(cropped).jpg and variants such as File:Donald_Trump_August_19,_2015_3_by_2.jpg should be replaced with an official presidential portrait, which is at File:Donald_Trump_official_portrait.jpg. Thanks. - CHAMPION (talk) (contributions) (logs) 00:12, 22 January 2017 (UTC)[reply]

It looks like the old images linked at the top of the RFC are used in only about 200 pages, at the most, not including sandbox pages. Someone could probably do this with AWB. Make sure to wash your hands afterwards.
I read the RFC closure, and it didn't say which types of pages the closure applies to. Should User pages have the image replaced? Archive pages? Talk pages? Pinging EvergreenFir in case this might need clarification. – Jonesey95 (talk) 02:45, 22 January 2017 (UTC)[reply]
Another couple of notes: Some of these images are used on pages where multiple images are being discussed, like Talk:List of Presidents of the United States. Some of the images are used specifically on pages discussing the campaign for president, and illustrate Donald Trump during the campaign. Replacing those photos with a post-campaign official White House photo may not be editorially valid. – Jonesey95 (talk) 04:11, 22 January 2017 (UTC)[reply]
This should be applied only to the mainspace. I'd do this with AWB, but AWB is technically part of the admin toolkit, and I'm technically involved with regard to American politics, so it's not clear that I should do so. ~ Rob13Talk 07:18, 22 January 2017 (UTC)[reply]
Doing... per discussion with Rob13 over IRC. --JustBerry (talk) 20:35, 22 January 2017 (UTC)[reply]
Y Done within mainspace for files relating to File:Donald Trump August 19, 2015.jpg. Log available upon request. --JustBerry (talk) 21:07, 22 January 2017 (UTC)[reply]
@JustBerry: Appears File:Donald Trump August 19, 2015 (cropped).jpg still needs doing. ~ Rob13Talk 05:23, 23 January 2017 (UTC)[reply]

That discussion was about the infobox portrait on that article, not the whole of Wikipedia! All the best: Rich Farmbrough, 00:38, 5 February 2017 (UTC).[reply]

Move GA reviews to the standard location

There are about 3000 Category:Good articles that do not have a GA review at the standard location of Talk:<article title>/GA1. This is standing in the way of creating a list of GAs that genuinely do not have a GA review. Many of these pages have a pointer to the actual review location in the article milestones on the talk page, and these are the ones that could potentially be moved by bot.

There are two cases, the easier one is pages that have a /GA1 page but the substantive page has been renamed. An example is 108 St Georges Terrace whose review is at Talk:BankWest Tower/GA1. This just requires a page move and the milestones template updated. Note that there may be more than one review for a page (sometimes there are several failed reviews before a pass). GA reviews are identified in the milestones template with the field actionn=GAN and the corresponding review page is found at actionnlink=<review>. Multiple GA reviews are named /GA1, /GA2 etc but note that there is no guarantee that the review number corresponds to the n number in actionn.

The other case (older reviews, example 100,000-year problem) is where the review took place on the article talk page rather than a dedicated page. This needs a cut-and-paste to a /GA1 page and the review then being transcluded back onto the talk page. This probably needs to be semi-automatic with some sanity checks by a human, at least for a test run (has the bot actually captured a review, is it a review of the target article, did it capture all of the review?). SpinningSpark 08:30, 22 January 2017 (UTC)[reply]
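For the identification step, a minimal sketch of pulling the GAN review pages out of the milestones template on a talk page (illustrative; talk_wikitext is assumed to be the raw wikitext of the article's talk page):
<syntaxhighlight lang="python">
import re

def gan_review_pages(talk_wikitext):
    actions = dict(re.findall(r"\|\s*action(\d+)\s*=\s*(\w+)", talk_wikitext))
    links = dict(re.findall(r"\|\s*action(\d+)link\s*=\s*([^|\n}]+)", talk_wikitext))
    # Review pages for every actionN=GAN that has a matching actionNlink.
    return [links[n].strip() for n, act in actions.items()
            if act.upper() == "GAN" and n in links]
</syntaxhighlight>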

Discussion at Wikipedia talk:Good articles#Article incorrectly listed as GA here? and Wikipedia:Village pump (technical)/Archive 152#GA reviews SpinningSpark 08:37, 22 January 2017 (UTC)[reply]

For some time the Russian soccer stats website Klisf.info has been inactive and unavailable. There are many links to this website (either as inline references or simple external links like the one from this article) and they should be tagged as dead links (at least). --XXN, 12:54, 22 January 2017 (UTC)[reply]

Is no bot operator interested in this task? This is an important thing: there are a lot of articles based on only one Klisf.info dead link, and WP:VER is problematic. I'm not requesting (yet) that these links be removed - just that they be tagged as dead, and another bot will try to update them with a link to an archived version, if possible. The FOOTY wikiproject was notified some time ago, but there is nothing controversial. XXN, 13:55, 10 February 2017 (UTC)[reply]
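A rough sketch of what such a tagging run could look like (Python with requests; it only handles plain bracketed external links — links inside citation templates would need the templates' archive parameters instead, which is more involved):
<syntaxhighlight lang="python">
import re
import requests

API = "https://en.wikipedia.org/w/api.php"

def pages_linking_to_klisf(limit=500):
    # Pages in the main namespace with external links to klisf.info.
    resp = requests.get(API, params={
        "action": "query", "list": "exturlusage", "euquery": "klisf.info",
        "eunamespace": 0, "eulimit": limit, "format": "json"}).json()
    return sorted({hit["title"] for hit in resp["query"]["exturlusage"]})

DEAD = "{{dead link|date=February 2017}}"
LINK = re.compile(r"(\[https?://[^\s\]]*klisf\.info[^\]]*\])")

def tag_dead_links(wikitext):
    # Append {{dead link}} after each bracketed klisf.info link that is not
    # already followed by a dead-link tag.
    def repl(match):
        after = wikitext[match.end():match.end() + 30].lstrip().lower()
        return match.group(1) + ("" if after.startswith("{{dead link") else DEAD)
    return LINK.sub(repl, wikitext)
</syntaxhighlight>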

Add https to ForaDeJogo.net

Please change foradejogo.net links to https. I have already updated its templates. SLBedit (talk) 19:24, 22 January 2017 (UTC)[reply]

@SLBedit: see User:Bender the Bot and its contribs. You can probably contact the bot operator directly. XXN, 13:59, 10 February 2017 (UTC)[reply]
I'll keep it in mind. --bender235 (talk) 15:40, 10 February 2017 (UTC)[reply]

User's recognized content list

A list like Wikipedia:WikiProject Physics/Recognized content, generated by User:JL-Bot/Project content, seems very neat. Is it possible to generate and maintain the same kind of list, tied to a user instead of a WikiProject? For example, I could use it to have a list of the DYK/GA/FAs credited to me on my user page. HaEr48 (talk) 03:51, 23 January 2017 (UTC)[reply]

Let's ping JLaTondre (talk · contribs) on this. Headbomb {talk / contribs / physics / books} 04:43, 23 January 2017 (UTC)[reply]
He replied at User talk:JL-Bot#Generating User-centric recognized content and said that he doesn't have time to add this new feature right now, and that the way such a thing could be implemented is a bit different from JL-Bot's existing implementation. So we probably need a new bot. HaEr48 (talk) 06:50, 9 February 2017 (UTC)[reply]

Bot to delete emptied monthly maintenance categories

I notice that we have a bot, AnomieBOT, that automatically creates monthly maintenance categories (Femto Bot used to do it earlier). Going by the logs for a particular category, I find that it has been deleted and recreated about 10 times. While all recreations are by bots, the deletions are done by human administrators. Why so? Mundane, repetitive tasks like the deletion of such categories (under CSD G6) when they get emptied should be done by bots. This bot task is obviously non-controversial and absolutely non-contentious, since AnomieBOT will recreate the category if new pages appear in the category. 103.6.159.93 (talk) 14:21, 23 January 2017 (UTC)[reply]

Needs wider discussion. It should be easy enough for AnomieBOT III to do this, but I'd like to hear from the admins who actually do these deletions regularly whether the workload is enough that they'd want a bot to handle it. Anomie 04:54, 24 January 2017 (UTC)[reply]
Are these already being tagged for CSD by a bot? I don't work CAT:CSD as much as I used to, but rarely see these in the backlog there now. — xaosflux Talk 14:08, 24 January 2017 (UTC)[reply]
I think they are tagged manually by editors. Anyway, this discussion has now shifted to WP:AN#Bot to delete emptied monthly maintenance categories, for the establishment of consensus as demanded by Anomie. 103.6.159.67 (talk) 14:12, 24 January 2017 (UTC)[reply]
Thanks for taking it there, 103.6.159.67. It looks like it's tending towards "support"; if that keeps up I'll write the code once the discussion there is archived. I also see some good ideas in the comments: I had already thought of the "only delete if there are no edits besides AnomieBOT" condition, but I hadn't thought of "... but ignore reverted vandalism" or "don't delete if the talk page exists". Anomie 03:21, 25 January 2017 (UTC)[reply]
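A sketch of the eligibility check being discussed (only delete if every revision is by AnomieBOT and the talk page does not exist; the "ignore reverted vandalism" refinement and checking that the category is actually empty are left out):
<syntaxhighlight lang="python">
import requests

API = "https://en.wikipedia.org/w/api.php"

def safe_to_delete(cat_title):
    # All revisions of the category page must be by AnomieBOT.
    revs = requests.get(API, params={
        "action": "query", "prop": "revisions", "rvprop": "user",
        "rvlimit": "max", "titles": cat_title, "format": "json"}).json()
    page = next(iter(revs["query"]["pages"].values()))
    if "missing" in page:
        return False
    if any(rev.get("user") != "AnomieBOT" for rev in page.get("revisions", [])):
        return False
    # ... and the corresponding talk page must not exist.
    talk_title = cat_title.replace("Category:", "Category talk:", 1)
    talk = requests.get(API, params={
        "action": "query", "titles": talk_title, "format": "json"}).json()
    return "missing" in next(iter(talk["query"]["pages"].values()))
</syntaxhighlight>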
@Xaosflux: No, {{Monthly clean-up category}} (actually {{Monthly clean-up category/core}}) automatically applies {{Db-g6}} if the category contains zero pages. Anomie 03:12, 25 January 2017 (UTC)[reply]
My experience shows this is safe to delete. They can even be recreated when needed (usually a delayed reversion in a page edit history). -- Magioladitis (talk) 23:09, 24 January 2017 (UTC)[reply]
The question isn't if they're safe to delete, that's obvious. The question is whether the admins who actually process these deletions think it's worth having a bot do it since there doesn't seem to be any backlog. Anomie 03:12, 25 January 2017 (UTC)[reply]
Category:Candidates for uncontroversial speedy deletion is almost always empty when I drop by it. — xaosflux Talk 03:39, 25 January 2017 (UTC)[reply]

The AN discussion is archived now, no one opposed. I put together a task to log any deletions such a bot would make at User:AnomieBOT III/DatedCategoryDeleter test‎, to see if it'll actually catch anything. If it logs actual deletions it might make I'll make a BRFA for actually doing them. Anomie 14:45, 31 January 2017 (UTC)[reply]

Bot to remove old warnings from IP talk pages

There is consensus for removing old warnings from IP talk pages. See Wikipedia_talk:Criteria_for_speedy_deletion/Archive_9#IP_talk_pages and Wikipedia:Village_pump_(proposals)/Archive_110#Bot_blank_and_template_really.2C_really.2C_really_old_IP_talk_pages.. This task has been done using AWB by BD2412 for several years now. Until around 2007, it was also being done by Tawkerbot.

I suggest that a bot should be coded up to remove all sections from IP talk pages that are older than 2 years, and add the {{OW}} template to the page if it doesn't already exist (placed at the top of the page, but below any WHOIS/sharedip templates). There are many reasons why this should be done by a bot. (i) Bot edits marked as minor do not cause the IPs to get a "You have new messages" notification when the IP talk page is edited. (ii) Blankings done using AWB also remove any WHOIS/sharedip templates, for which there is no consensus. (iii) This is a type of mundane task that should be done by bots. Human editors should not waste their time with this, but rather spend it on tasks that require some human intelligence. 103.6.159.93 (talk) 14:41, 23 January 2017 (UTC)[reply]
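A sketch of the blanking rule proposed above (illustrative; real section handling would want a proper wikitext parser, and placement of {{OW}} relative to WHOIS/sharedip templates is simplified to "append to whatever is above the first section heading"):
<syntaxhighlight lang="python">
import re
from datetime import datetime, timedelta

SIG = re.compile(r"(\d{2}:\d{2}, \d{1,2} \w+ \d{4}) \(UTC\)")

def blank_old_sections(wikitext, now=None, max_age=timedelta(days=2 * 365)):
    now = now or datetime.utcnow()
    parts = re.split(r"(?m)^(==[^=\n].*?==)[ \t]*\n", wikitext)
    lead, rest = parts[0], parts[1:]
    if len(rest) % 2:            # a trailing heading with no body
        rest.append("")
    kept = []
    for heading, body in zip(rest[0::2], rest[1::2]):
        stamps = [datetime.strptime(ts, "%H:%M, %d %B %Y")
                  for ts in SIG.findall(body)]
        if stamps and now - max(stamps) > max_age:
            continue             # even the newest signature is stale - drop it
        kept.append(heading + "\n" + body)
    if "{{OW}}" not in lead:     # keep it below any WHOIS/sharedip templates
        lead = lead.rstrip() + "\n{{OW}}\n"
    return lead + "".join(kept)
</syntaxhighlight>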

Needs wider discussion. These are pretty old discussions to support this sort of mass blanking of talk pages. If I recall correctly, an admin deleted a bunch of IP user talk pages a while back and this proved controversial. This needs a modern village pump discussion. ~ Rob13Talk 20:21, 24 January 2017 (UTC)[reply]
Here is one such discussion that I initiated. I think that two years is a bit too soon. Five years is reasonable. When I do these blankings with AWB, I typically go back seven, just because it is easy to skip any page with a date of 2010 or later on the page. I think some flexibility could be built in based on the circumstances. An IP address from which only one edit has ever been made, resulting in one comment or warning in response, is probably good for templating after no more than three years. I would add that I intentionally remove the WHOIS/sharedip templates, because, again, these are typically pages with nothing new happening in the past seven (and sometimes ten or eleven) years. We are not a permanent directory of IP addresses. bd2412 T 01:01, 25 January 2017 (UTC)[reply]
@BU Rob13: don't be silly. There has been consensus for this since 2006. Tawkerbot did it till 2007 and BD2412 has been doing it for years, without anyone disputing the need for doing it on his talk page. You correctly remember that MZMcBride used an unapproved bot to delete over 400,000 IP talk pages in 2010. That was obviously controversial since there is consensus only for blankings, not for deletions. Any new discussion on this will only result in repetition of arguments. The only thing that needs discussion is the approach. 103.6.159.84 (talk) 04:29, 25 January 2017 (UTC)[reply]
  • I wrote above "remove all sections from IP talk pages that are older than 2 years". I realise that this was misunderstood. What I meant was to remove the sections in which the last comment is over 2 years old. This is a more moderate proposal. Do you agree with this, BD2412? 103.6.159.84 (talk) 04:29, 25 January 2017 (UTC)[reply]
    I have two thoughts on that. First, I think that going after individual sections, as opposed to 'everything but the top templates' is a much harder task to program. I suppose it would rely on the last date in a signature in the section, or on reading the page history. Secondly, I think that there are an enormous number of pages to deal with that would have all sections removed even under that criteria, so we may as well start with the easy task of identifying those pages and clearing everything off of them. If we were to go to a section-by-section approach, I would agree with a two year window. bd2412 T 04:35, 25 January 2017 (UTC)[reply]
As mentioned, deletion should NOT be done (and is also not requested): deletion results in hiding tracks that may be of interest - for example, discussions on the talkpage of an IP used by an editor years ago that have relevance to edits to mainspace pages (every now and then there are edits with a summary 'per discussion on my talk') - and it hides that certain IPs that behaved badly were actually warned (a company spams in 2010, gets several warnings, sits still for 7 years, then someone else spams again - we might consider blacklisting with the reasoning 'you were warned in 2010, and now you are at it again'; it may be a different person behind a different IP, and the current editor may not even be aware of the situation of 2 1/2 years ago, but it is the same organisation that is responsible). If the talkpage 'exists', and we find the old IP that showed the behaviour, it is easy to find the warnings again; if it involves 15 IPs of which 2 were heavily warned, and those two pages are now also redlinks, we need someone with the admin bit to check deleted revisions on 15 talkpages - in other cases, anyone can do it.
Now, regarding blanking: what would be the arguments against archiving threads on talkpages where:
  1. the thread is more than X old (2 years?)
  2. the IP did not edit in the last Y days (1 year?)
We would just insert a custom template in the header, like {{IPtalkpage-autoarchive}}, pointing to the automatically created archives and providing a lot of explanation, and we would have a dedicated bot that archives these pages as long as the conditions are met. The only downside is that it would preserve utterly useless warnings (though some editors reply to warnings and go into discussion, and are sometimes right, upon which the perceived vandalism is re-performed); the upside is that it also preserves constructive private discussions.
(@BD2412: regarding your "I think that going after individual sections, as opposed to 'everything but the top templates' is a much harder task to program" - the former is exactly what our archiving bots do). --Dirk Beetstra T C 05:51, 25 January 2017 (UTC)[reply]
As far as I am aware, editors have previously opposed archiving of IP talk pages and so this would require wider discussion at an appropriate forum first. Regarding removal of warnings by section, I don't think there is any need to bother about the time the IP last edited -- the whole point of removing old warnings is to ensure that the current (or future) users of the IPs don't see messages that were intended for someone who used the IP years ago. Ideally, a person who visits their IP talk page should see only the messages intended for that person. 103.6.159.84 (talk) 06:19, 25 January 2017 (UTC)[reply]
That consensus could have changed - it may indeed need a wider community consensus. As I read the above thread, however, removal is not restricted to warnings only; it mentions removing the sections in which the last comment is over 2 years old, which would also include discussions. Now, archiving is not a must; one is allowed to simply delete old threads on one's 'own' talkpage.
Whether you archive, or delete - in both cases the effect is the same: the thread that is irrelevant to the current user of the IP is not on the talkpage itself anymore. And with highly fluxional IPs, or with IPs that are used by multiple editors at the same time it is completely impossible to address the 'right editor', you will address all of them. On the other hand, some IPs stay for years with the same physical editor, and the messages that are deleted will be relevant to the current user of the page, even if they did not edit for years. And that brings me to the point whether the editor has been editing in the last year (or whichever time period one choses) - if the IP is continuously editing there is a higher chance that the editor is the same, as when an IP has not been editing for a year (though in both cases, the IP can be static or not static, thwarting that analysis and making it needful to check on a case-by-case basis, which would preclude bot-use). --Dirk Beetstra T C 10:53, 25 January 2017 (UTC)[reply]
I favour archiving using the {{wan}} template rather than blanking. It alerts future editors that there have been previous warnings. If the IP belongs to an organisation, they might just possibly look at the old warnings and discover that the things they are about to do were done before and were considered bad. SpinningSpark 12:07, 25 January 2017 (UTC)[reply]
I think that archiving for old IP talk pages is very problematic. One of the main reasons I am interested in blanking these pages is to reduce link load - the amount of dross on a "What links here" page that obscures the links from that page to other namespaces, which is particularly annoying when a disambiguator is trying to see whether relevant namespaces (mainspace, templates, modules, files) are clear of links to a disambiguation page. All archiving does for IP talk pages is take a group of random conversations - link load and all - and disassociate them from their relevant edit history, which is what a person investigating the IP address is most likely to need. This is very different from archiving article talk pages or wikispace talk pages, where we may need to look back at the substance of old discussions. bd2412 T 14:08, 25 January 2017 (UTC)[reply]
Agree with that. I also don't think archiving of IP talk pages is useful. In any case, it needs to be discussed elsewhere (though IMO it's unlikely to get consensus). There is no point in bringing it up within this bot request. 103.6.159.89 (talk) 15:59, 25 January 2017 (UTC)[reply]
I see the point of that, but that is also the reason why some people want to see what links to a page - where the discussions were. The thread above is rather unspecific, and suggests blanking ALL discussions, not only warnings. And those are the things that are sometimes of interest: plain discussions regarding a subject, or even discussions following a warning. If the talkpage-discussions obscure your view, then you can choose to select incoming links per namespace.
@103.6.159.89: if there is no consensus to blank, but people are discussing whether it should be blanking or archiving or nothing, then there is no need for a discussion here - bots should simply not be doing this. I agree that the discussion about what should be done with it should be somewhere else. --Dirk Beetstra T C 03:29, 26 January 2017 (UTC)[reply]
You can not choose to select incoming links per namespace if you need to see multiple namespaces at once to figure out the source of a problem. For example, sometimes a link return appears on a page that can not actually be found on that page, but is transcluding from another namespace (a template, a portal, a module, under strange circumstances possibly even a category or file), and you need to look at all the namespaces at once to determine the connection. It would be nice if the interface would allow that, but that would be a different technical request. bd2412 T 17:06, 28 January 2017 (UTC)[reply]
I agree that that is a different technical request. But the way this request is now written (to remove all sections from IP talk pages that are older than 2 years) I am afraid that important information could be wiped. I know the problems with the Wikimedia Development team (regarding feature requests etc., I have my own frustrations about that), but alternatives should be implemented with extreme care. I would be fine with removal of warnings (but not if those warnings result in discussion), but not with any other discussions, and I would still implement timing restrictions (not having edited for x amount of time, etc.). --Dirk Beetstra T C 07:32, 29 January 2017 (UTC)[reply]
If there is a really useful discussion on an IP talk page that has otherwise gone untouched for half a decade or more, then that discussion should be moved to a more visible location. We shouldn't be keeping important matters on obscure pages, and given the hundreds of thousands of existing IP talk pages, there isn't much that can be more obscure than the random set of numbers designating one of those. (Yes, I know they are not really random numbers, but for purposes of finding a particular one, they may as well be). bd2412 T 17:02, 5 February 2017 (UTC)[reply]
@BD2412: and how are you going to see that (and what is the threshold of importance)? When you archive it is at least still there, with blanking any discussion is 'gone'. --Dirk Beetstra T C 03:54, 9 February 2017 (UTC)[reply]
Then people will learn not to leave important discussions on IP talk pages with no apparent activity after half a decade or more. bd2412 T 04:13, 9 February 2017 (UTC)[reply]
You're kidding, right? Are we here to collaboratively create an encyclopedia, or are we here to teach people a lesson? --Dirk Beetstra T C 05:49, 9 February 2017 (UTC)[reply]
We are not here to create a permanent collection of random IP talk page comments. bd2412 T 00:33, 14 February 2017 (UTC)[reply]

Trump 2016 election image by state

Can you change the images in Category:United States presidential election, 2016 by state to the cropped version File:Donald Trump official portrait (cropped).jpg so they all match, as some are cropped and others aren't? It will also match the main United States presidential election, 2016 article. 80.235.147.186 (talk) 06:21, 25 January 2017 (UTC)[reply]

Maybe JustBerry could take also this? --Edgars2007 (talk/contribs) 08:28, 25 January 2017 (UTC)[reply]
See above. – Jonesey95 (talk) 14:56, 25 January 2017 (UTC)[reply]
That doesn't address the issues mentioned. 80.235.147.186 (talk) 15:27, 25 January 2017 (UTC)[reply]
@Jonesey95: This issue refers to change between official photos, rather than changing non-official photos to official photos. --JustBerry (talk) 18:43, 25 January 2017 (UTC)[reply]
@80.235.147.186: @Edgars2007: Has consensus been established regarding which image should be used (cropped versus non-cropped)? If so, please link to the discussion. --JustBerry (talk) 18:43, 25 January 2017 (UTC)[reply]

 On hold until the proposal has achieved WP:CONSENSUS. If proposal demonstrates consensus, please link to the corresponding discussion. --JustBerry (talk) 05:21, 26 January 2017 (UTC)[reply]

I propose to change the images in Category:United States presidential election, 2016 by state to a single campaign photo, not a post-election photo. Must illustrate Donald Trump during the campaign, not after. --Frodar (talk) 04:05, 27 January 2017 (UTC)[reply]
This is not the right place for a discussion. I suggest Wikipedia talk:WikiProject Donald Trump. Once consensus has been reached there, post here with a link to the discussion's outcome. Thanks. – Jonesey95 (talk) 04:11, 27 January 2017 (UTC)[reply]
The proposal doesn't need a consensus, and no one has disputed the changes to the cropped version on the other articles. It should be cropped to match the main United States presidential election, 2016 article, and an admin has said here you can be WP:BOLD. 80.235.147.186 (talk) 18:20, 27 January 2017 (UTC)[reply]

Migrate from deprecated WikiProject Central America country task forces

All seven of the task forces for WikiProject Central America have graduated to full-fledged WikiProjects (for example, WikiProject Costa Rica) and the task force parameters have been deprecated. We need a bot to go through all of the existing transclusions of {{WikiProject Central America}} and perform the following changes:

  • If there are no country task forces assigned, leave the {{WikiProject Central America}} template.
  • If there are 1 or 2 country task forces assigned, replace the task force entries with full WikiProject templates for those countries (replicating any relevant assessment data) and delete the {{WikiProject Central America}} template.
  • If there are 3 or more country task forces assigned, leave the {{WikiProject Central America}} template and remove the task force entries (as the scope of the topic is unlikely to be country-specific).

The only parameters supported by the country-specific templates are class, importance, small, and listas, so you don't have to worry about replicating any other parameters (like attention or needs-infobox). Kaldari (talk) 00:25, 26 January 2017 (UTC)[reply]
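For illustration only, a minimal sketch of the replacement rules above, assuming pywikibot and mwparserfromhell are available; the task-force parameter names and replacement banner names are guesses, and importance/assessment handling is omitted (the real task ended up using a Lua module, per below):

# A rough sketch only; the live task used Module:WikiProject Central America/convert instead.
# Assumes pywikibot and mwparserfromhell; the task-force parameter names below are guesses.
import pywikibot
import mwparserfromhell

TASKFORCES = {
    'belize': 'WikiProject Belize', 'costa-rica': 'WikiProject Costa Rica',
    'el-salvador': 'WikiProject El Salvador', 'guatemala': 'WikiProject Guatemala',
    'honduras': 'WikiProject Honduras', 'nicaragua': 'WikiProject Nicaragua',
    'panama': 'WikiProject Panama',
}

def convert(talk_page):
    code = mwparserfromhell.parse(talk_page.text)
    for tpl in code.filter_templates():
        if not tpl.name.matches('WikiProject Central America'):
            continue
        assigned = [tf for tf in TASKFORCES
                    if tpl.has(tf) and str(tpl.get(tf).value).strip() == 'yes']
        if not assigned:
            return                              # no task forces: leave the banner alone
        if len(assigned) >= 3:                  # 3+: keep the banner, drop task-force params
            for tf in assigned:
                tpl.remove(tf)
        else:                                   # 1 or 2: replace with full project banners
            cls = str(tpl.get('class').value).strip() if tpl.has('class') else ''
            banners = '\n'.join('{{%s|class=%s}}' % (TASKFORCES[tf], cls) for tf in assigned)
            code.replace(tpl, banners)
        talk_page.text = str(code)
        talk_page.save('Migrating deprecated Central America task forces (sketch)')
        return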

Kaldari why do you need listas? The namespace is automatically omitted. -- Magioladitis (talk) 00:32, 26 January 2017 (UTC)[reply]

@Magioladitis: I was just thinking it might be good to copy it if it exists. If listas isn't migrated, that's fine with me though (same for small). Kaldari (talk) 00:54, 26 January 2017 (UTC)[reply]
AFAIK all WikiProject banner templates support |listas= - it's one of the three "universal" parameters along with |category= and |small=. I should qualify that as "... WikiProject banner templates built upon {{WPBannerMeta}} support ..."; that is to say, I don't know of any that don't support these params, apart from Mathematics and Military history, which have their own peculiar banners that are not built upon {{WPBannerMeta}}.
My recommendation would be that if {{WikiProject Central America}} has |listas= and it is non-blank, copy that to the replacement template - but if there are two or more replacement templates, copy it to just one of them, since it is not required if another WikiProject template on the same page has its own |listas= set: it not only affects categories used by the banner in which it is set, but it also affects the sortkey of all other banners and templates. --Redrose64 🌹 (talk) 19:08, 26 January 2017 (UTC)[reply]
@Kaldari: Could you direct me toward a list of what taskforces get what templates, etc.? Might be able to do this. ~ Rob13Talk 15:53, 31 January 2017 (UTC)[reply]
@BU Rob13::
Kaldari (talk) 18:16, 31 January 2017 (UTC)[reply]
 Doing... ~ Rob13Talk 21:58, 31 January 2017 (UTC)[reply]
@BU Rob13: Are you still interested in doing this task? No rush. Just wanted to check in. Kaldari (talk) 03:25, 10 February 2017 (UTC)[reply]
Yes, I've just been both very busy and very low on motivation recently due to some things going on. I'll definitely get to it, but you might need to be a tad patient with me. I'll try to get a BRFA submitted this weekend or next week. ~ Rob13Talk 04:36, 10 February 2017 (UTC)[reply]
@Kaldari: Written and mostly tested at Module:WikiProject Central America/convert. This will need an AWB bot run to substitute the module (including some alterations to a couple of parameters going into the substitution to correct an annoying quirk of using template args in Lua modules). Note that the output from my method of implementation would be something like this: [3] with each parameter on a new line. Are you fine with that? The output of the page isn't changed at all by doing it that way instead of on one line. ~ Rob13Talk 15:12, 10 February 2017 (UTC)[reply]
@BU Rob13: That looks fine with me. Kaldari (talk) 16:09, 10 February 2017 (UTC)[reply]
@Kaldari: BRFA filed as Task 33 of BU RoBOT. ~ Rob13Talk 11:08, 11 February 2017 (UTC)[reply]
@Kaldari: Do you want the country projects to inherit the default importance, the country-specific importance, or the country if it's available and the default if not? ~ Rob13Talk 16:28, 12 February 2017 (UTC)[reply]
@BU Rob13: The country-specific importance if it's available and the default if not. Kaldari (talk) 17:11, 12 February 2017 (UTC)[reply]
@Kaldari: Thanks for the clarification. Trial results are available at Wikipedia:Bots/Requests for approval/BU RoBOT 33 if you want to take a look. ~ Rob13Talk 22:59, 12 February 2017 (UTC)[reply]

Fix duplicate references in mainspace

Hi. Apologies if this is malformed. I'd like to see a bot that can do this without us depending on a helpful human with AWB chancing across the article. --Dweller (talk) Become old fashioned! 19:11, 26 January 2017 (UTC)[reply]

As a kind of clarification, if an article doesn't use named references because the editors of that article have decided not to, we don't want to require the use of named references to perform this kind of merging. In particular, AWB does not add named references if there are not already named references, in order to avoid changing the citation style. This is mentioned in the page linked above (which is an AWB subpage), but it is an important point for bot operators to keep in mind. — Carl (CBM · talk) 19:27, 26 January 2017 (UTC)[reply]
Been here bonkers years and never come across that, thanks! Wikipedia:Citing_sources#Duplicate_citations suggests finding other ways to fix duplicates. I don't know what those other ways are, but if that makes it too difficult, maybe the bot could only patrol articles that already make use of the refname parameter. --Dweller (talk) Become old fashioned! 19:57, 26 January 2017 (UTC)[reply]
It's easy enough for a bot to limit itself to articles with at least one named ref; a scan for that can be done at the same time as a scan for duplicated references, since both require scanning the article text. — Carl (CBM · talk) 20:26, 26 January 2017 (UTC)[reply]
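As a rough illustration of the detection step only (not the merging itself), a sketch that flags articles which already use at least one named ref and also contain two identical <ref>...</ref> bodies:

# Sketch: flag articles that already use named refs and also contain duplicated ref bodies.
import re
from collections import Counter

NAMED_REF = re.compile(r'<ref\s+name\s*=', re.I)
REF_BODY = re.compile(r'<ref(?:\s[^>/]*)?>(.*?)</ref>', re.I | re.S)

def has_mergeable_duplicates(wikitext):
    # Skip articles with no named refs at all, so the citation style is not changed.
    if not NAMED_REF.search(wikitext):
        return False
    bodies = Counter(body.strip() for body in REF_BODY.findall(wikitext))
    return any(count > 1 for body, count in bodies.items() if body)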
Smashing! Thanks for the expertise. --Dweller (talk) Become old fashioned! 20:44, 26 January 2017 (UTC)[reply]
Note: this is not what is meant by CITEVAR. It is perfectly fine to add names to references. All the best: Rich Farmbrough, 00:48, 5 February 2017 (UTC).[reply]

NB Your chart, above, is reading my signature as part of my username. Does that need a separate Bot request ;-) --Dweller (talk) Become old fashioned! 12:00, 1 February 2017 (UTC)[reply]

Create a simple TemplateData for all Infoboxes

Request
Check whether the /doc page of each template listed in WP:List of infoboxes contains <templatedata> or <templatedata /> or <templatedata/>:

If none exists add to the bottom:

<templatedata>
{
	"params": {},
	"paramOrder": [],
	"format": "block"
}
</templatedata>

— አቤል ዳዊት?(Janweh64) (talk) 17:05, 27 January 2017 (UTC)[reply]
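If a bot did take this on, the check itself would be simple; a minimal sketch assuming pywikibot, using the skeleton above as the text to append:

# Sketch: append an empty TemplateData skeleton to infobox /doc pages that lack one.
import re
import pywikibot

SKELETON = '''<templatedata>
{
	"params": {},
	"paramOrder": [],
	"format": "block"
}
</templatedata>'''

def ensure_templatedata(site, template_title):
    # template_title is the full title including the Template: namespace
    doc = pywikibot.Page(site, template_title + '/doc')
    if not doc.exists():
        return
    if re.search(r'<templatedata\s*/?>', doc.text, re.I):
        return                                  # already has a <templatedata> tag of some kind
    doc.text = doc.text.rstrip() + '\n\n' + SKELETON + '\n'
    doc.save('Adding empty TemplateData skeleton (sketch)')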

Why? How is the editor's or reader's experience improved if this is done? – Jonesey95 (talk) 16:15, 27 January 2017 (UTC)[reply]
It is only step one of a series of bot operations I have in mind to systematically create a baseline TemplateData for all biography infoboxes by importing data from Infobox person. Maybe it is too small a step. Sorry, this is my first bot request. The next step would be to check if the template contains a "honorific_prefix" parameter and, if so, add between {}:
"honorific_prefix": {"description": "To appear on the line above the person's name","label": "Honorific prefix","aliases": ["honorific prefix"]},
and
"honorific_prefix",
inside []. Step by step, we could accomplish the goals set out by this daunting task. The same idea could be used to create TemplateData for other infoboxes or even many other templates by inheriting the data from their parents.— አቤል ዳዊት?(Janweh64) (talk) 17:05, 27 January 2017 (UTC)[reply]
It sounds like this idea needs more development. I suggest having a discussion at that TemplateData talk page, coming up with a plan that could be executed by a bot, and then coming back here with that plan. – Jonesey95 (talk) 17:31, 27 January 2017 (UTC)[reply]
This discussion is proof that discussing things accomplishes nothing. Nevermind, I will just learn how to build a bot myself and get it approved. — አቤል ዳዊት?(Janweh64) (talk) 23:10, 27 January 2017 (UTC)[reply]
Sorry to disappoint you. So that you don't waste your time and get more frustrated, here's one more piece of information: you will find that when you make a bot request, you will also be asked for a link to the same sort of discussion. If you take a look at Wikipedia:Bots/Requests for approval, you will see these instructions: If your task could be controversial (e.g. most bots making non-maintenance edits to articles and most bots posting messages on user talk pages), seek consensus for the task in the appropriate forums. Common places to start include WP:Village pump (proposals) and the talk pages of the relevant policies, guidelines, templates, and/or WikiProjects. Link to this discussion from your request for approval. This is how things work. – Jonesey95 (talk) 00:51, 28 January 2017 (UTC)[reply]
I was rude. You were kind. — አቤል ዳዊት?(Janweh64) (talk) 13:46, 28 January 2017 (UTC)[reply]

WP:UAAHP

Hi, is it possible for a bot, such as DeltaQuadBot, to remove stale reports at the UAA holding pen (those blocked and those with no action in seven days), like it does with blocked users and declined reports at WP:UAAB? If this is not possible I would be happy to create my own bot account and have it do this task instead. Thanks! Linguisttalk|contribs 22:23, 28 January 2017 (UTC)[reply]

You should ask DeltaQuad if she would consider adding it to her bot. Also, is this something that the UAA admins want? — JJMC89(T·C) 22:39, 28 January 2017 (UTC)[reply]
I haven't asked the UAA regulars but I'm sure this would be helpful. In fact, I'm almost the only one who cleans up the HP and it would be helpful to me. Linguisttalk|contribs 22:41, 28 January 2017 (UTC)[reply]
This would certainly be helpful if it could remove any report that is more than seven days old, where the account has not edited at all. This is the bulk of what gets put in the holding pen, so keeping it up to date would be quite simple if these type of reports were removed automatically. Beeblebrox (talk) 22:10, 19 February 2017 (UTC)[reply]
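For what it's worth, a sketch of the staleness test only, assuming pywikibot and that the username and report timestamp have already been parsed out of each holding-pen entry (this is one reading of the criteria above):

# Sketch: decide whether a holding-pen report is stale.  Parsing the username and the
# report timestamp out of each entry is assumed to have happened already.
from datetime import datetime, timedelta
import pywikibot

def is_stale(site, username, reported_at, days=7):
    if datetime.utcnow() - reported_at < timedelta(days=days):
        return False                            # report is younger than the threshold
    last = next(pywikibot.User(site, username).contributions(total=1), None)
    if last is None:
        return True                             # account has never edited
    _page, _revid, timestamp, _comment = last
    return timestamp < reported_at              # no edits since the report was filed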

This backlog is really crowded (it is near 200K). Legobot used to tag images that have rationales, but they have not edited in the File namespace since May 2014 [4] . Is there a way that someone could set up their bot to take over this bot's previous responsibility? Thanks. -- 1989 (talk) 23:32, 31 January 2017 (UTC)[reply]

@1989: See Wikipedia:Bots/Requests_for_approval/BU_RoBOT_32. I filed a BRFA to help address this a few days ago. ~ Rob13Talk 01:55, 1 February 2017 (UTC)[reply]

MOS Bot

A bot to perform certain simple edits to make articles comply with the MOS. One feature would be to italicize foreign words (the bot could have a list of words that are commonly used but need to be italicized), and also to remove the italics from words that don't need them but commonly are italicized. It would also add a {{nbsp}} between any integer and AD, BC, CE, or BCE, and capitalize AD, BC, CE, or BCE when next to a number. Iazyges Consermonor Opus meum 17:30, 1 February 2017 (UTC)[reply]

This isn't a suitable task for a bot. See WP:CONTEXTBOT for details. Headbomb {talk / contribs / physics / books} 17:39, 1 February 2017 (UTC)[reply]
To give a concrete example, look at this article to see the phrase "2014 ad campaign", which should obviously not be changed by a bot to "2014{{nbsp}}AD campaign". The word "ad" in this context means "advertisement", not "anno domini". There is no way for a bot to tell the difference. – Jonesey95 (talk) 18:13, 1 February 2017 (UTC)[reply]
However a good NLP system would realise this, because there are three glaring clues: firstly the lower case, secondly the presence of the word "campaign", and thirdly that AD is rarely used with years as late as 2014. The combination would disambiguate "ad" to "advertisement" rather than its alternative meanings. All the best: Rich Farmbrough, 00:57, 5 February 2017 (UTC).[reply]
In the meantime, if anyone is interested, I have a formatting script that I use for making MOS-related changes. -- Ohc ¡digame! 23:20, 14 February 2017 (UTC)[reply]

One-off bot to ease archiving at WP:RESTRICT

This isn't urgent, or even 100% sure to be needed, but it looks likely based on this discussion that we will be moving listings at WP:RESTRICT to an archive page if the user in question has been inactive for two years or more. Some of the restrictions involve more than one user and would require a human to review them, but it would be awesome if a bot could determine that a user listed there singly had not edited at all in two or more years, and automatically transfer their listing to the archive. There are also some older restrictions that involved a whole list of users (I don't think arbcom does that anymore), and in several of those cases all of the users are either blocked or otherwise totally inactive. This would only be needed once, just to reduce the workload to get the archive started. (The list is extremely long, which is why this was proposed to begin with.) Is there a bot that could manage this? Beeblebrox (talk) 18:46, 4 February 2017 (UTC)[reply]

Ongoing would be better, and even bringing back "resurrected" users might be helpful too. All the best: Rich Farmbrough, 01:01, 5 February 2017 (UTC).[reply]
 Doing... All the best: Rich Farmbrough, 23:25, 13 February 2017 (UTC).[reply]
Awesome, the discussion was archived without a formal close, but consensus to do this is pretty clear. Beeblebrox (talk) 20:57, 15 February 2017 (UTC)[reply]

Addition of {{Authority control}}

There are very many biographies - tens of thousands - for which Wikidata has authority control data (example Petscan), but where such data is not displayed in the article. {{Authority control}}, with no parameters, displays authority control data from Wikidata. It is conventionally placed immediately before the list of categories at the foot of an article. It is used on 510,000+ articles and appears to be the de facto standard for handling authority control data on Wikipedia.

Would this make a good bot task: use the PetScan lists, like the one above, to identify articles where {{Authority control}} can be placed at the foot of the article? --Tagishsimon (talk) 11:51, 6 February 2017 (UTC)[reply]
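A rough sketch of the edit itself, assuming pywikibot, a PetScan-derived list of titles, and the conventional placement immediately above the first category:

# Sketch: add {{Authority control}} above the first category link if it is not already present.
import re
import pywikibot

def add_authority_control(page):
    text = page.text
    if re.search(r'\{\{\s*Authority control', text, re.I):
        return                                    # already transcluded
    m = re.search(r'^\[\[\s*Category\s*:', text, re.I | re.M)
    if m:
        text = text[:m.start()] + '{{Authority control}}\n' + text[m.start():]
    else:
        text = text.rstrip() + '\n\n{{Authority control}}\n'
    page.text = text
    page.save('Adding {{Authority control}} (sketch)')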

I think there is already a bot doing this. -- Magioladitis (talk) 18:48, 11 February 2017 (UTC)[reply]

Tagishsimon User:KasparBot adds Authority Control. -- Magioladitis (talk) 09:24, 12 February 2017 (UTC)[reply]

Thanks Magioladitis. I've added to User talk:T.seppelt. --Tagishsimon (talk) 17:34, 12 February 2017 (UTC)[reply]

Doing... I'm taking care of it. --19:01, 12 February 2017 (UTC) — Preceding unsigned comment added by T.seppelt (talkcontribs)

Requesting bot for wikisource

I'm not sure exactly what to say here, at least in part because I'm not sure exactly what functions we are seeking a bot to perform. But there is currently a discussion at wikisource:Wikisource:Scriptorium#Possible bot about trying to get some sort of bot which would be able to generate an output page roughly similar to Wikipedia:WikiProject Christianity#Popular pages, and similar pages for the portals, authors, and categories over at Wikisource. I, as an individual, am not among the most knowledgeable editors there. On that basis, I think it might be useful to get input from some of the more experienced editors there regarding any major issues which might occur to either a bot developer or them, but not me. Perhaps the best way to do this would be to respond at the first linked section above, and for the developer to announce himself, perhaps in a separate subsection of the linked thread there, to iron out any difficulties. John Carter (talk) 14:31, 6 February 2017 (UTC)[reply]

How about a bot to update (broken) sectional redirects?

When a section heading is changed, it breaks all redirects targeting that heading. Those redirects then incorrectly lead to the top of the page rather than to the appropriate section.

Is this desirable and feasible? If so, how would such a script work? The Transhumanist 22:14, 6 February 2017 (UTC)[reply]

This may turn out to be a WP:CONTEXTBOT. How often do people delete the section entirely, or split the section into two (then which should the bot pick?), or revise the section such that the redirect doesn't really apply anymore? Can the bot correctly differentiate these cases from cases where it can know what section to change the target to?
Such a script would presumably work by watching RecentChanges for edits that change a section heading, and then would check all redirects to the article to see if they targeted that section. It would probably want to delay before actually making the update in case the edit gets reverted or further edited. Anomie 22:29, 6 February 2017 (UTC)[reply]
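A sketch of just the repair step described above, assuming pywikibot; the RecentChanges watching and the delay in case the edit is reverted are left out:

# Sketch of the repair step only: retarget redirects that still point at the old heading.
import re
import pywikibot

def fix_section_redirects(article, old_heading, new_heading):
    pattern = re.compile(
        r'(#REDIRECT\s*\[\[\s*' + re.escape(article.title()) + r'\s*)#' +
        re.escape(old_heading) + r'(\s*\]\])', re.I)
    for page in article.backlinks():
        if not page.isRedirectPage():
            continue
        new_text, n = pattern.subn(r'\g<1>#' + new_heading + r'\g<2>', page.text)
        if n:
            page.text = new_text
            page.save('Retargeting section redirect after heading rename (sketch)')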

Script works intermittently

Hi guys, I'm stuck.

I forked the redlink remover script above, with an eye toward possibly developing a bot from it in the future, after I get it to do what I need on pages one-at-a-time. But first I need to get it to run. Sometimes it works and sometimes it doesn't (mostly doesn't).

For example, the original worked on Chrome for AlexTheWhovian but not for me. But later, it started working for no apparent reason. I also had the fork I made (from an earlier version) working on two machines with Firefox. But I turned one off for the night. And in the morning, it worked on one machine and not the other.

The script I'm trying to get to work is User:The Transhumanist/OLUtils.js.

I'm thinking the culprit is a missing resource module or something.

Is there an easy way to track down what resources the script needs in order to work? Keep in mind I'm a newb. The Transhumanist 01:41, 11 February 2017 (UTC)[reply]

After some trial and error, I learned the following: in Firefox, if I run the Feb 28 2016 version of User:AlexTheWhovian/script-redlinks.js and if I use it to strip redlinks from a page (I didn't save the page), then I can load the 15:05, December 26, 2016 version and it works.

Does anyone have any idea why using one script (not just loading it) will cause another script to work? I'm really confused. The Transhumanist 05:33, 11 February 2017 (UTC)[reply]

Maybe one has dependencies that it doesn't load itself, instead relying on other scripts to load them. --Redrose64 🌹 (talk) 21:55, 11 February 2017 (UTC)[reply]
The author said it was stand alone. (They are both versions of the same script). I now have them both loaded, so I can more easily use the first one (User:The Transhumanist/redlinks Feb2016.js) to enable the other (User:The Transhumanist/OLUtils.js). Even the original author doesn't know why it isn't working.
What's the next step in solving this? The Transhumanist 06:46, 12 February 2017 (UTC)[reply]
You've changed the outer part: that's what I would suspect, maybe not loading the mw library properly. Possibly the best way is to make the changes step-by-step, with a browser restart between. (Or better still, binary chop.) All the best: Rich Farmbrough, 22:32, 13 February 2017 (UTC).[reply]

Creating a list of red-linked entries at Recent deaths

I request a bot to create and maintain a list consisting of red-linked entries grabbed from the Deaths in 2017 page, as and when they get added there. These entries, as you may know, are removed from the "Deaths in ... " pages if an article about the subject isn't created within a month. It would be useful to maintain a list consisting of just the red entries (from which they are not removed on any periodic basis) for editors to go through. This would increase the chances of new articles being created. Preferably at Wikipedia:WikiProject Biography/Recent deaths red list, or in the bot's userspace to begin with. (In the latter case, the bot wouldn't need any BRFA approval.) 103.6.159.71 (talk) 12:54, 15 February 2017 (UTC)[reply]
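A rough sketch of the core update step, assuming pywikibot and the proposed list page name above:

# Sketch: collect red-linked people from Deaths in 2017 and append new ones to a list page.
import pywikibot

def update_red_list(site):
    deaths = pywikibot.Page(site, 'Deaths in 2017')
    red = [p.title() for p in deaths.linkedPages() if p.namespace() == 0 and not p.exists()]
    target = pywikibot.Page(site, 'Wikipedia:WikiProject Biography/Recent deaths red list')
    existing = target.text
    new_lines = ['* [[%s]]' % t for t in red if '[[%s]]' % t not in existing]
    if new_lines:
        target.text = existing.rstrip() + '\n' + '\n'.join(new_lines) + '\n'
        target.save('Adding new red-linked entries from Deaths in 2017 (sketch)')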

Check book references for self-published titles

This is in response to this thread at VPT. So here's the problem. We have a list of vanity publishers whose works should be used with extreme caution, or never (some of these publishers exclusively publish bound collections of Wikipedia articles). But actually checking if a reference added to Wikipedia is on this list is time consuming. However, it occurs to me that in some cases it should be simple to automate. At any Amazon webpage for a book, there is a line for the publisher, marked "publisher". On any GoogleBooks webpage, there is a similar line to be found in the metadata hiding in the page source. If an ISBN is provided in the reference, it can be searched on WorldCat to identify the publisher.

So it seems to me like a bot should be able to do the following:

1) Watch recent changes for anything that looks like a link or reference to a book, such as a "cite book" template, a number that looks like an ISBN, or a link to a website like Amazon or GoogleBooks
2) Follow the link (if to Amazon or GoogleBooks), or search the ISBN (if provided), to identify the publisher
3) Check the publisher against the list of vanity publishers
4) Any positive hits could then be automatically reported somewhere on Wikipedia. There could even be blacklisted publishers (such as those paper mirrors of Wikipedia I mentioned) that the bot could automatically revert, after we're sure there are few/no false positives

What do people think? Doable? Someguy1221 (talk) 00:13, 16 February 2017 (UTC)[reply]
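A sketch of steps 1 and 3 only, using plain regular expressions; the publisher lookup in step 2 (Amazon, Google Books or WorldCat) is left as a stub, and the blacklist entries shown are only placeholders:

# Sketch: steps 1 and 3 only. Find cited publishers in wikitext and check them against a blacklist.
# The actual publisher lookup (Amazon / Google Books / WorldCat, step 2) is left as a stub.
import re

VANITY_PUBLISHERS = {'lulu.com', 'authorhouse', 'xlibris'}   # placeholder example entries

CITE_BOOK = re.compile(r'\{\{\s*cite book\s*\|(.*?)\}\}', re.I | re.S)
PUBLISHER = re.compile(r'\|\s*publisher\s*=\s*([^|}]+)', re.I)

def lookup_publisher_by_isbn(isbn):
    raise NotImplementedError('step 2: query a bibliographic service for the publisher')

def suspicious_citations(wikitext):
    hits = []
    for cite in CITE_BOOK.findall(wikitext):
        m = PUBLISHER.search(cite)
        if m and m.group(1).strip().lower() in VANITY_PUBLISHERS:
            hits.append(m.group(1).strip())
    return hits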

Bot to move files to Commons

Surely files that a trusted user like Sfan00 IMG has reviewed and marked as suitable for moving to Commons can be moved by a bot without any further review? All these files are tagged with {{Copy to Commons|human=Sfan00 IMG}} and appear in Category:Copy to Wikimedia Commons reviewed by Sfan00 IMG. There are over 11,000 files. I personally have no experience in dealing with files and so can't talk about the details, but I reckon something like CommonsHelper would be useful? I have asked Sfan about this but they have been inactive for 3 days.

If the process is likely to be error-free, I suppose that instead of marking the transferred files as {{NowCommons}} (which creates more work for admins in deleting the local copy), the bot could outright delete them under CSD F8. 103.6.159.65 (talk) 05:14, 16 February 2017 (UTC)[reply]

Technical details: such a bot would need to operate on a user-who-tagged-the-file basis; I'd envision using a parameter with the tagging user's username, combined with some setup comparable to {{db-u1}} to detect if the last user to edit the page was someone other than the user whose name appears in the parameter. On eligibility for specific user participation, I'm hesitant with Sfan00 IMG, basically because it's a semiautomated script account, and I'd like to ensure that every such file be checked manually first; of course, if ShakespeareFan00 is checking all these images beforehand and then tagging the checked images with the script, that's perfectly fine. Since you asked my perspective as a dual-site admin: on the en:wp side, the idea sounds workable, and bot-deleting the files sounds fine as long as we're programming it properly. On the Commons side, I hesitate. We already have several bots that do the Commons side of things, and they tend to do a rather poor-quality job; they can accurately copy the license and description, but they often mess up with the date and sometimes have problems with copying template transclusion properly, and they're horrendous with categories (which are critical for Commons images) — basically the only options with such bots are leaving the images entirely uncategorised, or ending up with absolute junk, e.g. "Companies of [place]" on portraits because the subject was associated with a company from that place; you can see one bad example in this revision of File:Blason Imbert Bourdillon de la Platiere.svg. If we deem it a good idea, adding another Commons bot would be fine; the issue is whether having a bot do this at all is a good idea on the Commons side. Nyttend (talk) 05:55, 16 February 2017 (UTC)[reply]
phew, there are 200k files in Category:Copy to Wikimedia Commons (bot-assessed)‎ which need further review. But what is surprising is that there are over 12,000 in Category:Copy to Wikimedia Commons maincat, which must all have been tagged by humans (because the bot-tagged ones are in the former cat). I wonder whether it would be a good idea to have a bot identify the tagger from the page history and add the |human= parameter. I also note that there are some files like File:Ambyun official.jpg that were tagged by Sfan without the human parameter. 103.6.159.65 (talk) 15:17, 16 February 2017 (UTC)[reply]
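On the idea of back-filling |human= from the page history, a minimal sketch assuming pywikibot; it simply looks for the earliest revision containing the tag:

# Sketch: find who added {{Copy to Commons}} to a file page, for back-filling |human=.
import pywikibot

def who_tagged(file_page):
    tagger = None
    for rev in file_page.revisions(reverse=True, content=True):
        text = rev.text or ''
        if '{{Copy to Commons' in text or '{{Copy to Wikimedia Commons' in text:
            tagger = rev.user            # earliest revision containing the tag
            break
    return tagger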
I've written a tool, Wikipedia:MTC!, which does have an option for mass-transfer. While it is reasonably accurate, I still think it is important for human to review each file before and after transfer. -FASTILY 00:10, 18 February 2017 (UTC)[reply]


I would echo Nyttend's concerns, especially as a few I'd tagged in good faith were subsequently found to have copyright concerns despite my best efforts.
There is also a category that I created for files inline-assessed as being based on a PD licence.

Those might also be amenable to semi-automated transfer with the above caveats. Sfan00 IMG (talk) 10:50, 20 February 2017 (UTC)[reply]

Linkfix: www.www. to www.

We have 87 links in the form www.www.foo.bar which should really be www.foo.bar - the form www.www. is normally a fault. A simple link checker with text replacement would help.--Oneiros (talk) 13:49, 16 February 2017 (UTC)[reply]
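The replacement itself is close to a one-liner; a hedged sketch of the text transformation (a human check of each link would still be sensible, in case any site genuinely resolves www.www.):

# Sketch: collapse the doubled "www.www." prefix in external links.
import re

def fix_double_www(wikitext):
    return re.sub(r'(https?://)www\.www\.', r'\1www.', wikitext, flags=re.I)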

OK - I will have a look into this. TheMagikCow (talk) 17:19, 16 February 2017 (UTC)[reply]
BRFA filed TheMagikCow (talk) 10:33, 18 February 2017 (UTC)[reply]

Could someone have one of their bots update this page frequently? A bot once updated it, but stopped in 2014. MCMLXXXIX 16:59, 16 February 2017 (UTC)[reply]

Hi 1989. We have Special:LongPages. Is that insufficient? --MZMcBride (talk) 05:30, 17 February 2017 (UTC)[reply]
Yes. The list only shows articles while the page I referenced has talk pages. MCMLXXXIX 09:21, 17 February 2017 (UTC)[reply]
I have popped up a page on tool labs that lists the fifty longest talk pages that are not user talk pages or sub-pages. Hope this helps. - TB (talk) 12:35, 17 February 2017 (UTC)[reply]

Bot to update Alexa ranks

OKBot apparently used to do this, but blew up in April 2014 and has never been reactivated. It would be quite handy, as there are a lot of articles that contain Alexa ranks and they do change frequently. Triptothecottage (talk) 05:35, 18 February 2017 (UTC)[reply]

WP 1.0 bot

Hi, there is a problem with WP 1.0 bot, which produces the various project assessment tables and logs of changes to article assessments. The bot has not been working since early February and both of the stated maintainers are no longer active. We need someone to get the bot operating again and possibly a couple of people who could take over the maintenance of the bot. Any offers? Keith D (talk) 23:51, 19 February 2017 (UTC)[reply]

@Keith D: Oh no ... that's a bit of a mess. Have you tried reaching out to the maintainers via email? In the absence of a handing over of existing code, a bot operator would need to start from scratch. Someone could definitely do it (not me ... but someone), but it would take longer. ~ Rob13Talk 00:13, 20 February 2017 (UTC)[reply]
Thanks, sorry for not putting my brain in gear; I had forgotten about e-mail. Keith D (talk) 13:08, 20 February 2017 (UTC)[reply]
@Keith D and BU Rob13: There is a formal procedure for adding maintainers/taking over an abandoned project: Tool Labs Abandoned tool policy. --Bamyers99 (talk) 14:47, 20 February 2017 (UTC)[reply]

template:Geographic location

Hi! I am a Hungarian Wiki user. I speak only a little English... I would like to import a lot of items for template:Geographic location. My bot is not working; there are a lot of errors in the English Wikipedia. Please replace: Geographic Location --> Geographic location. Thank you! --B.Zsolt (talk) 20:37, 20 February 2017 (UTC)[reply]

N Not done Replacing a redirect with its target is cosmetic. — JJMC89(T·C) 00:31, 21 February 2017 (UTC)[reply]
@B.Zsolt: You appear to be indicating you've attempted running a bot from your main account without approval. Am I correct in assuming edits like these [5] [6] were automated? If you wish to run a bot, you must seek approval at WP:BRFA before doing so, or you'll likely be quickly blocked. It is not permitted to run a bot from your main account or to do so at all without approval. Further, per WP:COSMETICBOT, the bot you suggest would not be approved. ~ Rob13Talk 02:06, 21 February 2017 (UTC)[reply]
It's also against WP:NOTBROKEN. --Redrose64 🌹 (talk) 21:19, 21 February 2017 (UTC)[reply]
@Redrose64: Disagree; NOTBROKEN applies only to piped links that bypass redirects. ―Mandruss  21:24, 21 February 2017 (UTC)[reply]
No, it applies to any redirect, piped or not - the very first example ([[Franklin Roosevelt]] to [[Franklin D. Roosevelt]]) does not involve a pipe. --Redrose64 🌹 (talk) 21:44, 21 February 2017 (UTC)[reply]
That's a more complicated situation as changing the visible link text to match the article title is often legitimate, and too often reverted incorrectly per NOTBROKEN. The piped-link fix does not change the visible text. The two very different situations should not be addressed in the same guideline, but it appears they currently are anyway, so you're technically correct. ―Mandruss  22:00, 21 February 2017 (UTC)[reply]
@Mandruss: I'm quite confused. How are you suggesting that replacing a template redirect with its target will improve the encyclopedia? ~ Rob13Talk 23:49, 21 February 2017 (UTC)[reply]
@BU Rob13: I'm not. It was a tangential discussion of WP:NOTBROKEN, now resolved. ―Mandruss  23:57, 21 February 2017 (UTC)[reply]

I do not want to edit the English Wikipedia! I am only a Hungarian Wiki user. I would only like to import data from the English wiki to Wikidata, but the Pltools software does not work. Why? Pltools does not like template redirects.

My plan:

hi, any help (or suggestion) is appreciated

from the help desk .....

I'm a member of WikiProject Medicine; basically this happened to us [7], and so we have the source code, but we need someone's help to do a 2016 version (of the 2015 one [8]). I can assist with whatever is needed. ...thank you--Ozzie10aaaa (talk) 17:56, 17 February 2017 (UTC)[reply]

@Ozzie10aaaa: To track down this kind of expert, try posting at Wikipedia:Request a query. -- John of Reading (talk) 19:49, 17 February 2017 (UTC)[reply]
will do, and thanks--Ozzie10aaaa (talk) 20:59, 17 February 2017 (UTC)[reply]
as indicated above I'm posting there(with no response)...if anyone can help here it would be greatly appreciated, thank you--Ozzie10aaaa (talk) 13:20, 20 February 2017 (UTC)[reply]
A couple of days is nothing - so don't give up just yet. But to be honest neither here nor there is the best place for this kind of request - people here are mostly "consumers" of reports rather than report-makers. The best place to ask would be Wikipedia:Bot requests, that's where the coders are. Le Deluge (talk) 02:04, 21 February 2017 (UTC)[reply]


per the above suggestion I'm posting here, thank you--Ozzie10aaaa (talk) 02:23, 21 February 2017 (UTC)[reply]

I request a bot to remove any wikilink on a page that redirects back to the same page or section. Thanks. Anythingyouwant (talk) 03:08, 22 February 2017 (UTC)[reply]

It isn't clear to me exactly what is being asked for. If a link points to a redirect that points to a different section in the original article, the link should not be removed. If anything, it should be replaced with a section link, see Wikipedia:Manual of Style/Linking#Section links (second paragraph). Anyway, in all cases care is needed for redirects that have recently been created from existing articles. Sometimes such redirects are controversial and will be reverted. Thincat (talk) 08:55, 22 February 2017 (UTC)[reply]
I said the same section, not a different section. Here is an example of what the bot would do. In that example, the redirect has existed for years (since 2013). Anythingyouwant (talk) 17:22, 22 February 2017 (UTC)[reply]
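A sketch of the detection half, assuming pywikibot; actually unlinking would still need care with piped links and section anchors, as noted above:

# Sketch of the detection half: find links on a page whose redirect target is the page itself.
import pywikibot

def self_redirecting_links(page):
    bad = []
    for linked in page.linkedPages():
        if linked.isRedirectPage():
            target = linked.getRedirectTarget()
            if target.title().split('#')[0] == page.title():
                bad.append(linked.title())
    return bad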

This is already part of CHECKWIKI. I can do them semi-automatically. -- Magioladitis (talk) 17:51, 22 February 2017 (UTC)[reply]