Wikinews:Bots/Archive 5
This is an archive of past discussions from Wikinews:Bots. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current page.
Contents
- The following discussion is preserved as an archive. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
- Closed as unsuccessful. Unfortunately this request has been open for almost two years with little progress. Please feel free to open a fresh request if and when needed. [24Cr][talk] 19:34, 29 August 2021 (UTC)[reply]
- Operator: DannyS712 (talk · contribs)
- Bot name: DannyS712 bot (Talk • contribs • bot status log • actions • block log • other log)
- Programming language: Javascript
- Already used on: None for this task
- Task: Update use-counts regularly (weekly, biweekly, monthly / up to discussion)
The SQL query below retrieves, for each template that has a "use count" sub-template, the template/module name, the full number of transclusions, the number that would be put in the template (not the exact number, but some number below it to say "over xxx"), and the name of the use-count page. I've only included some of the results here; the full query retrieves 61 rows.
SQL (the accompanying "Current result" table was flattened in archiving and is omitted):
USE enwikinews_p;
SELECT
CONCAT(
CASE
WHEN tl.tl_namespace = 10 THEN 'Template'
WHEN tl.tl_namespace = 828 THEN 'Module'
END,
':',
tl.tl_title
) AS 'Page',
COUNT(*) AS 'Transclusions',
CASE
WHEN COUNT(*) < 1000 AND MOD(COUNT(*), 100) < 60
THEN CONCAT( SUBSTR(COUNT(*), 1, LENGTH(COUNT(*)) - 2), '00' )
WHEN COUNT(*) < 1000
THEN CONCAT( SUBSTR(COUNT(*), 1, LENGTH(COUNT(*)) - 2), '50' )
WHEN COUNT(*) < 10000 AND MOD(COUNT(*), 100) < 60
THEN CONCAT( SUBSTR(COUNT(*), 1, 1), ',', SUBSTR(COUNT(*), 2, 1), '00' )
WHEN COUNT(*) < 10000
THEN CONCAT( SUBSTR(COUNT(*), 1, 1), ',', SUBSTR(COUNT(*), 2, 1), '50' )
WHEN COUNT(*) < 1000000 AND MOD(COUNT(*), 1000) < 600
THEN CONCAT( SUBSTR(COUNT(*), 1, LENGTH(COUNT(*)) - 3), ',000' )
WHEN COUNT(*) < 1000000
THEN CONCAT( SUBSTR(COUNT(*), 1, LENGTH(COUNT(*)) - 3), ',500' )
ELSE
CONCAT( SUBSTR(COUNT(*), 1, 1), '.', SUBSTR(COUNT(*), 2, 1), ' million' )
END AS 'Over',
CONCAT(
CASE
WHEN tl.tl_namespace = 10 THEN 'Template'
WHEN tl.tl_namespace = 828 THEN 'Module'
END,
':',
tl.tl_title,
'/doc/use-count/doc'
) AS 'Counter'
FROM templatelinks tl
WHERE tl.tl_title IN (
SELECT
SUBSTR(pg.page_title, 1, LENGTH(pg.page_title) - 18)
FROM
page pg
WHERE
pg.page_title LIKE '%/doc/use-count/doc'
AND pg.page_id IN (
SELECT cl.cl_from
FROM categorylinks cl
WHERE cl.cl_to = 'Possible-risk/use-count'
)
)
AND tl.tl_namespace IN (10, 828)
GROUP BY CONCAT(tl.tl_title, tl.tl_namespace)
HAVING COUNT(*) > 100
ORDER BY COUNT(*) DESC;
The bot would, at a set duration, go and update each counter template with the current transclusions, to keep them (relatively) up to date. The bot will only edit pre-existing templates. Let me know if there are any questions. Thanks, --DannyS712 (talk) 21:37, 13 October 2019 (UTC)[reply]
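For reference, the "over xxx" figure produced by the query's CASE expression can be mirrored outside SQL. This is an illustrative Python sketch of the same branch logic (the bot itself was to be written in JavaScript; the function name here is mine):

```python
def over_label(count: int) -> str:
    """Mirror the query's CASE: an 'over N' figure rounded down."""
    s = str(count)
    if count < 1000:
        # Floor to the hundred; use the 50 step when the remainder is >= 60.
        return s[:-2] + ("00" if count % 100 < 60 else "50")
    if count < 10000:
        return s[0] + "," + s[1] + ("00" if count % 100 < 60 else "50")
    if count < 1000000:
        return s[:-3] + (",000" if count % 1000 < 600 else ",500")
    return s[0] + "." + s[1] + " million"
```

So, for example, a template with 2,375 transclusions would be labelled "over 2,350".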
Comments
- Comment This doesn't seem needed. I of course have long freely admitted I'm unfond of bots. So I'm left with vaguely negative feelings but, atm, not strong enough to actively oppose. --Pi zero (talk) 22:11, 13 October 2019 (UTC)[reply]
- @DannyS712: Is this still a current request? --Green Giant (talk) 03:32, 7 August 2021 (UTC)[reply]
- I mean, if the community would support this, I can still write it, but I have no objections to closing as expired or withdrawn if not --DannyS712 (talk) 03:47, 7 August 2021 (UTC)[reply]
- @DannyS712: Thank you for the prompt reply. My main concern was that it’s been open for so long. I know that you are less active here than you used to be. I think it would be unfair to ask you to commit to this unless you actively wanted to do it. Otherwise I’m not opposed to the proposal. --Green Giant (talk) 03:56, 7 August 2021 (UTC)[reply]
- Well since I already did the hard part of writing the query, it should be fairly simple to implement if this gets approved and whenever I have time --DannyS712 (talk) 03:59, 7 August 2021 (UTC)[reply]
- The normal process of bot approvals requires a test run, analysis of that run, and then approval if deemed to be done correctly. When you are ready, please go ahead and do a test run of 10 edits. Then we can look at it. Please keep the edit rate at no more than two edits per minute. --Green Giant (talk) 04:22, 7 August 2021 (UTC)[reply]
- I don't think it is *that* kind of bot which has multiple edits, but then, I am not too entirely convinced we require tables like these, do we?
•–• 04:26, 7 August 2021 (UTC)[reply]
@DannyS712: It’s been a fortnight since the last comment. Please could you update us on any progress? If not, shall we close it for now? [24Cr][talk] 12:51, 21 August 2021 (UTC)[reply]
- Sorry, I don't have enwikinews in my global watchlist so I didn't see it. The table above is just for demonstration of the data that the bot gets, it doesn't edit such a table but rather as I explained above "[t]he bot would, at a set duration, go and update each counter template with the current transclusions" so yes, it has multiple edits. An initial run would probably update most of the existing templates because I assume they haven't been updated in a while. --DannyS712 (talk) 03:15, 22 August 2021 (UTC)[reply]
- OK, could you do a test run of maybe 10-20 edits so we can see it in operation? Cheers. [24Cr][talk] 16:38, 22 August 2021 (UTC)[reply]
Votes
- The above discussion is preserved as an archive. Please do not modify it. Subsequent comments should be made on the appropriate discussion page, such as the current discussion page. No further edits should be made to this discussion.
- The following discussion is preserved as an archive. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
- Closed as successful. [24Cr][talk] 07:08, 30 October 2021 (UTC)[reply]
- Operator: Cyberpower678 (talk · contribs) and Harej (talk · contribs)
- Bot name: InternetArchiveBot (Talk • contribs • bot status log • actions • block log • other log)
- Programming language: PHP
- Already used on: Operates on dozens of additional Wikimedia wikis
- Task: InternetArchiveBot identifies dead links and adds links to archived versions where available. Per request on Phabricator. Harej (talk) 22:32, 20 January 2021 (UTC)[reply]
Comments
Task 1
- Comment @Harej: Please note, I would like to have all links (not just the dead links) archived. There are roughly 22k articles in the CAT:Published category. And the bot just needs to archive the links, it should NOT edit the articles. After all the links are archived in that category, IABot needs to monitor the same category for any new additions to the category. Can you please confirm IABot can do that, Harej? Thanks.
•–• 06:31, 21 January 2021 (UTC)[reply]
- Why should the bot not edit the articles? Providing working alternatives to non-functioning links is the entire point. Harej (talk) 17:17, 21 January 2021 (UTC)[reply]
- @Harej: We use {{source}} for adding the sources as well as the external links. That template has parameters to mark if the link is broken or to provide archived link. However, there are articles which do not use that template. We do not remove/replace sources after the article is published. However, while cleaning up, if a source has broken URL, admins generally find the archived version and add it. Giving a bot access to edit and review archived pages is scary -- as it could do irreversible damage. We wish to use the bot to save the source before the link rot -- once it is saved, one could add the archived version. Moreover, if an archive already exists, there is a sapient decision to make which version of the archive to use. There are articles which are not sighted (maybe because the file/template used for that page has been modified) -- if the bot were to add archived links to that article and sights, such important changes would disappear unnoticed forever. That is why the bot should not be directly editing the article, just merely archiving all the hyperlinks.
•–• 17:45, 21 January 2021 (UTC)[reply]
- InternetArchiveBot operates on 50 Wikimedia wikis without causing "irreversible damage" to them. In the event the bot malfunctions on a given wiki, mistakes can be undone and the bot can be stopped by any logged in user through the bot interface. The bot is very sophisticated, can be configured to parse templates as well as fix plain links, and automatically picks archives closest to when the link was added. The Wayback Machine has been automatically saving outbound links on Wikimedia projects for years. The benefit of proactively, and automatically, fixing broken links vastly outweighs the risks. Linkrot is endemic to wikis at such a scale that automated solutions are required. Harej (talk) 17:59, 21 January 2021 (UTC)[reply]
- With respect, acagastya is correct: there is danger of irreversible damage. There's history here, between Wikinews and the Foundation, involving a WON'T FIX closure of a Bugzilla request (from before my time, but I learned about it from bawolff). 'Nuff said on that point, I hope. --Pi zero (talk) 18:18, 21 January 2021 (UTC)[reply]
- I don't know what the Wikimedia Foundation has to do with this; InternetArchiveBot is a service of the Internet Archive. The bot is operated in many diverse contexts and can be highly customized for a given use case, and I am more than happy to work with the community on that. If you do not want InternetArchiveBot to fix broken links, which is its primary function, then I am not sure what you are requesting at all. If you want the Wayback Machine to preserve outgoing links, it is already doing that. Harej (talk) 18:22, 21 January 2021 (UTC)[reply]
- What you may be interested in is not automated operation, but the option to scan a page and add archives with the element of human review. If so the Analyze a Page feature may be useful for you. (Make sure you have "English Wikinews" as your selected project.) With this you can enter a page title, have it retrieve archive links for all links (not just dead ones), and make the edits conveniently while giving you the ability to review. This may be a more workable option. Harej (talk) 02:11, 22 January 2021 (UTC)[reply]
@Harej: I don't think it works the way I expected. Could you please try with US Republicans query Linux Foundation about open-source security? I think it requires <ref></ref> and does not detect URLs.
•–• 05:54, 22 January 2021 (UTC)[reply]
- Acagastya I was not able to run the bot on that page as it is locked. Harej (talk) 18:43, 25 January 2021 (UTC)[reply]
- Alternatively, @Harej:, (this is a less likely scenario), if there was a list of source URLs to be archived, can the tool take care of those? I could write a script to extract all URLs, if there exists a way to automate the archival of those URLs.
•–• 11:13, 22 January 2021 (UTC)[reply] - There are 82k sources in a 6MB file which needs to be archived <https://0x0.st/-iR0.txt> -- is some way I could use this list, run it across the bot/tool, and archive all the links?
•–• 15:33, 22 January 2021 (UTC)[reply]
- User:Acagastya the API documentation is available at meta:InternetArchiveBot/API. Harej (talk) 20:22, 22 January 2021 (UTC)[reply]
- Thanks, @Harej:. Also, there is some context sensitive information I need to discuss, re IABot. Will you be available on IRC?
103.48.105.246 (talk) 20:36, 22 January 2021 (UTC)[reply]
- Sure, I am in #IABot on Freenode as "harej". Harej (talk) 20:55, 22 January 2021 (UTC)[reply]
- Thanks, @Harej:. Also, there is some context sensitive information I need to discuss, re IABot. Will you be available on IRC?
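For batch-archiving a plain list of URLs like the 82k-source dump mentioned above, one workable approach (a sketch only, not IABot's API, for which see the meta:InternetArchiveBot/API documentation) is to hit the Wayback Machine's public "Save Page Now" endpoint, https://web.archive.org/save/<url>, once per line:

```python
SAVE_PREFIX = "https://web.archive.org/save/"

def save_requests(url_lines):
    """Turn lines from a URL dump into Wayback 'Save Page Now' GET URLs.

    Blank lines are skipped. Actually issuing the requests (and rate
    limiting them) is left out of this sketch.
    """
    for line in url_lines:
        url = line.strip()
        if url:
            yield SAVE_PREFIX + url
```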
- I'm confused as to what the opposition is to this bot. It has been useful on countless wikis and is not likely to cause any damage. Even if it did, it could be easily reverted. It is useful for dead links. --IWI (talk) 14:01, 25 January 2021 (UTC)[reply]
- Actually, as noted above some damage that can be done is irreversible. --Pi zero (talk) 14:50, 25 January 2021 (UTC)[reply]
- Can you explain how irreversible damage happens on a wiki when all versions of a page are stored and can be restored at any time? I don't think this was ever clearly explained. Harej (talk) 17:54, 25 January 2021 (UTC)[reply]
- It's not readily fixable; as I remarked before, the bugzilla request was closed as WON'T FIX. Given which, I'm not really seeing any cause to dwell on the details (unless you know of someone considering attempting a rewrite of the DPL extension, in which case please let me know and perhaps I'll contact them). --Pi zero (talk) 20:29, 25 January 2021 (UTC)[reply]
We needed IABot to take care of two different tasks. Upon careful discussion, it was agreed to split the tasks: the former being taking snapshots of all the sources already in the archives (currently in progress, overseen by user:acagastya); the other being taking snapshots of all the sources for new articles. Since the first task is in progress, let's focus the discussion on just the latter task.
•–• 20:34, 25 January 2021 (UTC)[reply]
Task 2
- Comment @Harej: All the sources used in a mainspace article are expected to be listed in the url parameter of {{source}}. {{source}} also accepts archiveurl parameter. Could you configure IABot in such a way that it archives the sources from mainspace articles (which are NOT archived), from url and update archiveurl? @Pi zero: does this sound safe? I hope I am not overlooking something, please let me know if I am.
•–• 20:41, 25 January 2021 (UTC)[reply]
- I have configured the bot to recognize the "archiveurl" parameter. Harej (talk) 20:50, 25 January 2021 (UTC)[reply]
- Comment Seems to me one ought to wait until publication-plus-24-hours before archiving sources. Other than that, I have no strong objection; the most technical harm that could possibly be done is quite limited. --Pi zero (talk) 21:05, 25 January 2021 (UTC)[reply]
- Question @Harej: Does IABot have specific triggers (like working at the time of new page creation)? Or does it work in a said time-interval? Re what @Pi zero: has added Harej, I think IABot can just check, if(wikitext.categories.includes('Published')) { run(); }. Will that work? Additionally, can you control which snapshot instance will be added to archive URL? The snapshot with timestamp closest after category:published was added will be ideal.
•–• 07:17, 26 January 2021 (UTC)[reply]
- (Yeah, published will likely do, which would admittedly be much easier.) --Pi zero (talk) 17:45, 26 January 2021 (UTC)[reply]
- Acagastya, IABot does not have triggers. However, the bot will use the archive corresponding to the stated access time in the citation, or the closest one to when the URL was added. Harej (talk) 17:20, 1 February 2021 (UTC)[reply]
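acagastya's pseudocode above amounts to a wikitext check. A hypothetical Python version of that trigger condition is below; a production bot would more likely ask the MediaWiki API for the page's categories than regex the wikitext:

```python
import re

def is_published(wikitext: str) -> bool:
    """True when the page wikitext carries [[Category:Published]]."""
    return re.search(
        r"\[\[\s*Category\s*:\s*Published\s*(\|[^\]]*)?\]\]",
        wikitext,
        re.IGNORECASE,
    ) is not None
```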
- Question @Harej: Would IABot be able to fill in all three parameters on {{source}} for pages with deadlinks? Namely also, brokenURL and archivedescription when links are being recovered? —chaetodipus (talk · contribs) 04:46, 29 July 2021 (UTC)[reply]
- Question @Cyberpower678, Harej: I see the Phabricator note says this is stalled. Could you briefly outline what needs to be done to unstall? --Green Giant (talk) 03:38, 7 August 2021 (UTC)[reply]
- @Cyberpower678, Harej: It’s been a fortnight with no response. Please could you provide an update? If not, shall we close this request? [24Cr][talk] 12:54, 21 August 2021 (UTC)[reply]
- @Cromium: I think that is because no one here is voting -- we need to vote it to pass or fail for them to proceed.
•–• 12:59, 21 August 2021 (UTC)[reply]
- @Acagastya: Is that why it is listed as stalled? If so, we can vote but I was hoping to see a test run of 10-20 edits first. However, if the same task is being done on another wiki, I guess we can move to approval. [24Cr][talk] 13:10, 21 August 2021 (UTC)[reply]
- I will test the bot for 20 edits. After around 20 edits I will stop the bot, and let you assess. Harej (talk) 22:08, 13 September 2021 (UTC)[reply]
- This may be delayed as the bot is currently down for maintenance. Harej (talk) 23:44, 13 September 2021 (UTC)[reply]
We have been trying to do test edits on the bot and have not been succeeding. Basically the bot goes through AllPages in alphabetical order and all the pages it has come across are protected. The bot will work for “single page” runs via the Management Interface (makes edits on your user account’s behalf, so it will work on the pages you normally can edit) and also multi-page runs for unprotected pages. If you want the bot to make background edits, it will need to be promoted to admin. Harej (talk) 19:08, 27 September 2021 (UTC)[reply]
- @Harej: I’ve promoted the bot to admin for a month to help the test run. Please advise if anything else is needed. [24Cr][talk] 12:53, 2 October 2021 (UTC)[reply]
- @Harej, I saw the bot made a single edit so far (diff). If the bot is rescuing a dead link in sources, can it also add "brokenURL = true" so that it displays the archived link in the article? —chaetodipus (talk · contribs) 04:39, 20 October 2021 (UTC)[reply]
- chaetodipus, it will add that parameter on future edits. Harej (talk) 18:30, 20 October 2021 (UTC)[reply]
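The edit this task describes — filling archiveurl (and brokenURL) on {{source}} — can be sketched roughly. This is not IABot's actual parser, just an illustration using the Wayback Machine's standard web/<timestamp>/<url> scheme:

```python
import re

def add_archiveurl(template: str, timestamp: str) -> str:
    """Append |archiveurl= to one {{source}} call when it is missing."""
    if "archiveurl" in template or not template.endswith("}}"):
        return template  # already archived (or malformed): leave it alone
    m = re.search(r"\|\s*url\s*=\s*([^|\s}]+)", template)
    if not m:
        return template
    archive = "https://web.archive.org/web/" + timestamp + "/" + m.group(1)
    return template[:-2] + "|archiveurl=" + archive + "}}"
```

An existing archiveurl is left untouched, matching the point made above that choosing between existing archive versions is a sapient decision.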
Votes
Support given the limitations of the bot (the way it was designed, to serve a specific purpose), I think it achieves a part of the task, and I am okay with the compromise.
•–• 17:34, 1 February 2021 (UTC)[reply]
- And thinking about it, if the bot does not edit semi-protected pages, we won't royally screw up. :D
•–• 17:36, 1 February 2021 (UTC)[reply]
Support if it gets the process moved to implementation. [24Cr][talk] 00:01, 27 August 2021 (UTC)[reply]
Support I think this would definitely be useful in recovering the many dead sources in our archives. —chaetodipus (talk · contribs) 05:27, 21 October 2021 (UTC)[reply]
Support I am familiar with IAB and find it extremely useful on other projects. I am surprised to learn in viewing this request that it isn't already approved. I think including this is a no-brainer. --TheSandDoctor (talk) 21:29, 22 October 2021 (UTC)[reply]
- The above discussion is preserved as an archive. Please do not modify it. Subsequent comments should be made on the appropriate discussion page, such as the current discussion page. No further edits should be made to this discussion.
- The following discussion is preserved as an archive. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Neriah bot (talk · contribs)
Operator: user:Neriah
Bot tasks for which permission is being sought: Treatment of Double Redirects (450+).
Automatic or manually assisted: Automatic.
Edit type: one time.
Maximum edit rate (e.g. edits per minute):
Bot flag requested: (Y/N): I need a bot flag for a week.
Programming language(s): python.
Neriah (talk) 13:42, 21 December 2022 (UTC)[reply]
Comments
- Can you briefly describe/explain the problem (with a bit more detail) you are intending to address, please? --Bddpaux (talk) 19:28, 4 January 2023 (UTC)[reply]
- Bddpaux, I would like to run a bot that fixes the double redirects (mw:Manual:Pywikibot/redirect.py). The problem with double redirects is that when we go to a certain page (like Wikinews Shorts: January 14, 2009), the site only follows the first redirect, leaving the reader on the intermediate page rather than the final target. This bot runs regularly on many wikis, but I saw that it does not run here. Usually there are not many double redirects on a wiki, so it is possible to fix them without a bot account, but here there are several hundred, so for this run I need a bot account; later on I won't, because they will be individual edits. Neriah (talk) 10:13, 6 January 2023 (UTC)[reply]
- @Bddpaux? Neriah (talk) 10:58, 13 January 2023 (UTC)[reply]
- Bureaucrats Can you perform? Neriah (talk) 13:29, 20 January 2023 (UTC)[reply]
- ? Neriah (talk) 20:27, 30 January 2023 (UTC)[reply]
- The 'crats are not active very often. Sorry, SVTCobra 20:46, 30 January 2023 (UTC)[reply]
- Yes. Perhaps try messaging Gryllida
or Acagastya? Sorry, I did not realize Acagastya is not one (a CU but not a bureaucrat?). Heavy Water (talk) 20:51, 30 January 2023 (UTC)[reply]
- I don't think it's appropriate to give it to your main account, as then malfunctions in the bot may lead to your main account being blocked. Could you make a separate account for the bot please? Then I will be able to give it the bot flag. Gryllida (talk) 21:09, 30 January 2023 (UTC)[reply]
- User:Neriah bot. Neriah (talk) 21:04, 31 January 2023 (UTC)[reply]
- I updated this page to reflect this. SVTCobra 22:16, 31 January 2023 (UTC)[reply]
- Ta. Bot flag added. Gryllida (talk) 22:54, 31 January 2023 (UTC)[reply]
- (Note that I have set it to expire in a month, just in case there are some unforeseen tasks that might take more than a week, as requested above.) Gryllida (talk) 22:55, 31 January 2023 (UTC)[reply]
- As only a week was requested, a month seems fine. Cheers, SVTCobra 23:21, 31 January 2023 (UTC)[reply]
- Done Neriah bot (talk) 06:02, 1 February 2023 (UTC)[reply]
- @Gryllida Since this is at the bot's expense, I think it's worth flagging a bot indefinitely. Do we need a new vote for this or not? Neriah (talk) 09:06, 10 February 2023 (UTC)[reply]
- Hi,
- I would like to understand the first part of the sentence, "this is at the bot's expense", better. Could you please rephrase it?
- Do you wish to continue running the bot next month?
- Note that I do not mind flagging the bot indefinitely. The only reason I did not do it was because there was a time limit, one week, in the request.
- Regards, -- Gryllida (talk) 02:20, 14 February 2023 (UTC)[reply]
@Gryllida: Can you close this if the bot was approved? Also, please update the active bot list. I don't know anything about the tech aspects of our beloved Wikis. Cheers, --SVTCobra 02:05, 14 February 2023 (UTC)[reply]
- I will keep it open for another two days to finalize the discussion of the expiry date on the bot's account, above. I will update the active bot list soon. Gryllida (talk) 02:20, 14 February 2023 (UTC)[reply]
- Keep it open? Didn't User:Neriah bot run on February 1 with nearly 500 edits? I am quite certain you granted the bot user rights on January 31. Maybe there's something I don't understand.. Cheers, SVTCobra 02:28, 14 February 2023 (UTC)[reply]
- I mean, in the comments above, Neriah asked to extend the bot flag duration from 'one month' to 'indefinite'. If this request is closed, then this section will not be editable, and that discussion will need to stop (and start elsewhere). This would make it fragmented. That's why I am trying to finish that discussion here before closing this request. Gryllida (talk) 03:03, 14 February 2023 (UTC)[reply]
- @Gryllida At first I asked for it on my account, but if it's on the bot's account I'd love it to stay forever, so I can run the bot every month. Thanks! Neriah (talk) 10:52, 14 February 2023 (UTC)[reply]
- Thank you both. I will adjust the bot flag expiry and close this section within about a day. Gryllida (talk) 11:27, 14 February 2023 (UTC)[reply]
- @Gryllida Can you handle and close this? Neriah (talk) 15:57, 2 March 2023 (UTC)[reply]
- Bot flag expiry set to 'Does not expire'; closing the discussion now. Gryllida (talk) 01:49, 3 March 2023 (UTC)[reply]
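What redirect.py does here can be pictured with a toy model: follow each redirect chain to its end and repoint the first hop. This is a conceptual sketch, not Pywikibot's code; the real script also edits pages via the API and skips cases it cannot fix:

```python
def resolve_double_redirects(redirects: dict) -> dict:
    """For each redirect, follow the chain to its final target.

    `redirects` maps a redirect page to the page it currently points at.
    Chains that loop back on themselves are left untouched, much as
    redirect.py skips circular redirects.
    """
    fixed = {}
    for page, target in redirects.items():
        seen = {page}
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        # A loop means there is no sensible final target; keep the original.
        fixed[page] = redirects[page] if target in seen else target
    return fixed
```

For example, with A pointing to B and B pointing to C, the fix repoints A straight at C.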
Votes
- Support --SVTCobra 22:37, 22 December 2022 (UTC)[reply]
- Support I don't know much about running bots, but you seem to be a trusted contributor on the Hebrew Wikipedia and this is a task that needs doing. Heavy Water (talk) 22:54, 22 December 2022 (UTC)[reply]
- Support The recent run seems to be successful. Double redirect clean up needs to be done and I know several projects (including those in my native language) do that by bots. Estimating by global contributions, I think this is someone we can trust. MathXplore (talk) 08:32, 23 December 2022 (UTC)[reply]
- See my comment above.--Bddpaux (talk) 19:29, 4 January 2023 (UTC)[reply]
- Support Sounds good to me.--Bddpaux (talk) 15:47, 13 January 2023 (UTC)[reply]
- Support --DRC-B5 (talk) 14:42, 5 February 2023 (UTC)[reply]
- The above discussion is preserved as an archive. Please do not modify it. Subsequent comments should be made on the appropriate discussion page, such as the current discussion page. No further edits should be made to this discussion.