Wikinews:Bots
All bots must be approved here before running, regardless of whether they need a bot flag. Non-approved bots may be blocked at an administrator's discretion, for example for flooding recent changes. The bot flag will not be granted to any bot that does not have the {{botblock}} template on its userpage. A bot's userpage should clearly state the bot's intended purpose and other relevant details. See also Wikinews:Global Bots.
For archived discussions, see the Archive.
To add a new bot request, click here.
Bots on Wikinews
Active bots
These bots have made at least one page edit or logged action within the last 30 days:
- BOT-Superzerocool (Talk • contribs • bot status log • actions • block log • other log)
- Revibot I (Talk • contribs • bot status log • actions • block log • other log)
Inactive bots
These bots have not made any page edits or logged actions within the last 30 days:
- Bawolff bot (Talk • contribs • bot status log • actions • block log • other log)
- GrondinBot (Talk • contribs • bot status log • actions • block log • other log)
- MikemoralBot (Talk • contribs • bot status log • actions • block log • other log)
- Trigonidiida (Talk • contribs • bot status log • actions • block log • other log)
Defunct bots
Discussion about current bots
- General question regarding the 'inactive bots' list above
- Block, remove bot flag, or ...? --Brian McNeil / talk 08:51, 13 October 2012 (UTC)
- I'd think, in general when a bot goes inactive for, say, a year, remove the flag. Eventually block; just to name a figure, maybe that's two years? --Pi zero (talk) 14:11, 26 December 2012 (UTC)
- Would support the idea that if inactive for a year, remove the flag. --LauraHale (talk) 06:43, 23 March 2013 (UTC)
Discussion about new bots
DannyS712 bot (talk · contribs)
- Operator: DannyS712 (talk · contribs)
- Bot name: DannyS712 bot (Talk • contribs • bot status log • actions • block log • other log)
- Programming language: JavaScript
- Already used on: None for this task
- Task: Update use-counts regularly (weekly, biweekly, or monthly; up for discussion)
The SQL query below retrieves, for each template that has a "use count" sub-template, the template/module name, the full number of transclusions, the number that would be put in the template (not the exact count, but a figure below it, to say "over xxx"), and the name of the use-count page. I've only included some of the results here, but the full query retrieves 61 rows.
SQL (the accompanying "Current result" table is not reproduced here):
USE enwikinews_p;
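-- For each template or module with a use-count subpage tracked in
-- Category:Possible-risk/use-count, report its name, its transclusion count,
-- a rounded-down "over N" figure, and the counter page the bot would update.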
SELECT
CONCAT (
CASE
WHEN tl.tl_namespace = 10 THEN 'Template'
WHEN tl.tl_namespace = 828 THEN 'Module'
END,
':',
tl.tl_title
) AS 'Page',
COUNT(*) AS 'Transclusions',
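  -- Truncate the count to a conservative "over N" string: a lower multiple of
  -- 50 below 10,000, a lower multiple of 500 below one million, and
  -- "x.y million" above that (with thousands separators added where needed).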
CASE
WHEN COUNT(*) < 1000 AND MOD(COUNT(*), 100) < 60
THEN CONCAT( SUBSTR(COUNT(*), 1, LENGTH(COUNT(*)) - 2), '00' )
WHEN COUNT(*) < 1000
THEN CONCAT( SUBSTR(COUNT(*), 1, LENGTH(COUNT(*)) - 2), '50' )
WHEN COUNT(*) < 10000 AND MOD(COUNT(*), 100) < 60
THEN CONCAT( SUBSTR(COUNT(*), 1, 1), ',', SUBSTR(COUNT(*), 2, 1), '00' )
WHEN COUNT(*) < 10000
THEN CONCAT( SUBSTR(COUNT(*), 1, 1), ',', SUBSTR(COUNT(*), 2, 1), '50' )
WHEN COUNT(*) < 1000000 AND MOD(COUNT(*), 1000) < 600
THEN CONCAT( SUBSTR(COUNT(*), 1, LENGTH(COUNT(*)) - 3), ',000' )
WHEN COUNT(*) < 1000000
THEN CONCAT( SUBSTR(COUNT(*), 1, LENGTH(COUNT(*)) - 3), ',500' )
ELSE
CONCAT( SUBSTR(COUNT(*), 1, 1), '.', SUBSTR(COUNT(*), 2, 1), ' million' )
END AS 'Over',
CONCAT (
CASE
WHEN tl.tl_namespace = 10 THEN 'Template'
WHEN tl.tl_namespace = 828 THEN 'Module'
END,
':',
tl.tl_title,
'/doc/use-count/doc'
) AS 'Counter'
FROM templatelinks tl
WHERE tl.tl_title IN (
SELECT
SUBSTR(pg.page_title, 1, LENGTH(pg.page_title) - 18)
FROM
page pg
WHERE
pg.page_title LIKE '%/doc/use-count/doc'
AND pg.page_id IN (
SELECT cl.cl_from
FROM categorylinks cl
WHERE cl.cl_to = 'Possible-risk/use-count'
)
)
AND tl.tl_namespace IN (10, 828)
GROUP BY CONCAT(tl.tl_title, tl.tl_namespace)
HAVING COUNT(*) > 100
ORDER BY COUNT(*) DESC;
The bot would, at a set interval, update each counter template with the current transclusion count, to keep them (relatively) up to date. The bot will only edit pre-existing templates. Let me know if there are any questions. Thanks, --DannyS712 (talk) 21:37, 13 October 2019 (UTC)
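A minimal sketch of what one such update run could look like (this is not the bot's actual code; updateUseCounts, editPage and getCsrfToken are illustrative helpers written for this example, and the row fields correspond to the column aliases in the query above):

// Minimal sketch, not the bot's actual implementation. Assumes the query above
// has already been run against the database replica and its rows are available
// as objects keyed by the query's column aliases (Page, Transclusions, Over,
// Counter). Also assumes an already-authenticated API session.
async function updateUseCounts(rows) {
  for (const row of rows) {
    // e.g. row.Counter = 'Template:Example/doc/use-count/doc', row.Over = '1,500'
    await editPage(row.Counter, row.Over,
      'Bot: updating use count (over ' + row.Over + ' transclusions)');
  }
}

// Illustrative helper: one action=edit POST to the Action API per counter page.
async function editPage(title, text, summary) {
  const body = new URLSearchParams({
    action: 'edit', format: 'json', bot: '1',
    title, text, summary,
    token: await getCsrfToken()
  });
  const res = await fetch('https://en.wikinews.org/w/api.php', { method: 'POST', body });
  return res.json();
}

// Illustrative helper: fetch a CSRF token via action=query&meta=tokens.
async function getCsrfToken() {
  const res = await fetch('https://en.wikinews.org/w/api.php?action=query&meta=tokens&type=csrf&format=json');
  const data = await res.json();
  return data.query.tokens.csrftoken;
}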
Comments
- Comment This doesn't seem needed. I of course have long freely admitted I'm unfond of bots. So I'm left with vaguely negative feelings but, atm, not strong enough to actively oppose. --Pi zero (talk) 22:11, 13 October 2019 (UTC)
Votes
InternetArchiveBot (talk · contribs)
- Operator: Cyberpower678 (talk · contribs) and Harej (talk · contribs)
- Bot name: InternetArchiveBot (Talk • contribs • bot status log • actions • block log • other log)
- Programming language: PHP
- Already used on: Operates on dozens of additional Wikimedia wikis
- Task: InternetArchiveBot identifies dead links and adds links to archived versions where available. Per request on Phabricator. Harej (talk) 22:32, 20 January 2021 (UTC)
Comments
- Comment @Harej: Please note, I would like to have all links (not just the dead links) archived. There are roughly 22k articles in the CAT:Published category. The bot just needs to archive the links; it should NOT edit the articles. After all the links in that category are archived, IABot needs to monitor the same category for any new additions. Can you please confirm IABot can do that, Harej? Thanks.
•–• 06:31, 21 January 2021 (UTC)
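A rough sketch of the behaviour being requested above (illustrative only: IABot itself is written in PHP and this is not its code; the helper name is made up for the example). It fetches an article's external links via the Action API and asks the Wayback Machine's Save Page Now endpoint to capture each one, without editing the page:

// Illustrative only; not IABot's code. Snapshot every external link of an
// article via the Wayback Machine's Save Page Now endpoint, without editing it.
async function archiveArticleLinks(title) {
  const api = 'https://en.wikinews.org/w/api.php';
  const params = new URLSearchParams({
    action: 'query', prop: 'extlinks', titles: title,
    ellimit: 'max', format: 'json'
  });
  const data = await (await fetch(api + '?' + params)).json();
  const page = Object.values(data.query.pages)[0];
  for (const link of page.extlinks || []) {
    // Requesting web.archive.org/save/<url> asks the Wayback Machine to capture that URL.
    await fetch('https://web.archive.org/save/' + link['*']);
  }
}

Iterating this over the members of CAT:Published (via action=query&list=categorymembers) would cover the existing articles, and re-running it on new additions to the category would cover future ones.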
- @Harej: We use {{source}} for adding the sources as well as the external links. That template has parameters to mark a link as broken or to provide an archived link. However, there are articles which do not use that template. We do not remove or replace sources after the article is published; however, while cleaning up, if a source has a broken URL, admins generally find the archived version and add it. Giving a bot access to edit and review archived pages is scary -- it could do irreversible damage. We wish to use the bot to save the sources before the links rot -- once a link is saved, one could add the archived version. Moreover, if an archive already exists, there is a sapient decision to make about which version of the archive to use. There are articles which are not sighted (maybe because a file or template used on that page has been modified) -- if the bot were to add archived links to such an article and sight it, those important changes would disappear unnoticed forever. That is why the bot should not be directly editing the articles, merely archiving all the hyperlinks.
•–• 17:45, 21 January 2021 (UTC)
- InternetArchiveBot operates on 50 Wikimedia wikis without causing "irreversible damage" to them. In the event the bot malfunctions on a given wiki, mistakes can be undone and the bot can be stopped by any logged in user through the bot interface. The bot is very sophisticated, can be configured to parse templates as well as fix plain links, and automatically picks archives closest to when the link was added. The Wayback Machine has been automatically saving outbound links on Wikimedia projects for years. The benefit of proactively, and automatically, fixing broken links vastly outweighs the risks. Linkrot is endemic to wikis at such a scale that automated solutions are required. Harej (talk) 17:59, 21 January 2021 (UTC)
- With respect, acagastya is correct: there is danger of irreversible damage. There's history here, between Wikinews and the Foundation, involving a WON'T FIX closure of a Bugzilla request (from before my time, but I learned about it from bawolff). 'Nuff said on that point, I hope. --Pi zero (talk) 18:18, 21 January 2021 (UTC)
- I don't know what the Wikimedia Foundation has to do with this; InternetArchiveBot is a service of the Internet Archive. The bot is operated in many diverse contexts and can be highly customized for a given use case, and I am more than happy to work with the community on that. If you do not want InternetArchiveBot to fix broken links, which is its primary function, then I am not sure what you are requesting at all. If you want the Wayback Machine to preserve outgoing links, it is already doing that. Harej (talk) 18:22, 21 January 2021 (UTC)
- What you may be interested in is not automated operation, but the option to scan a page and add archives with the element of human review. If so the Analyze a Page feature may be useful for you. (Make sure you have "English Wikinews" as your selected project.) With this you can enter a page title, have it retrieve archive links for all links (not just dead ones), and make the edits conveniently while giving you the ability to review. This may be a more workable option. Harej (talk) 02:11, 22 January 2021 (UTC)
@Harej: I don't think it works the way I expected. Could you please try it with US Republicans query Linux Foundation about open-source security? I think it requires <ref></ref> tags and does not detect plain URLs.
•–• 05:54, 22 January 2021 (UTC)