[12:17:10] !log tools running `aborrero@tools-sgegrid-master:~$ sudo grid-configurator --all-domains` after merging a few patches to the script to handle dead config
[12:17:13] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/SAL
[17:03:46] So... really nothing to do about abusing PAWS? https://es.wikipedia.org/wiki/Especial:Contribuciones/Conejo_Navide%C3%B1o
[17:10:18] LuchoCR: what do you suggest?
[17:31:16] LuchoCR: if you can explain how that attack was specific to PAWS and not something that anyone with access to a computer and pywikibot could do, that would help us understand what defenses to add to PAWS.
[17:38:01] At least not make it easier for them to have a launch and vandalise?
[17:38:16] There's no reason PAWS should be used by non-autoconfirmed users
[17:38:48] in the alternative, maybe we should limit the rate limits
[17:38:55] *edit rate limits
[17:49:05] hauskatze: well, I'd say there is no reason for the wikis to allow automated edits from non-autoconfirmed users, and that can be handled on the wikis by the wikis and not pushed down to any service that makes it easier to actually contribute.
[17:50:34] the place where access controls can reasonably be applied is on the inbound API calls, rather than pretending that every client should implement its own controls
[18:01:37] !log tools deploying calico v3.21.0 (T292698)
[18:01:41] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/SAL
[18:01:41] T292698: Upgrade calico to 3.21 - https://phabricator.wikimedia.org/T292698
[18:06:37] bd808: I agree that a shared system to control this type of behaviour/restrictions would be good. I was thinking of local abuse filters too, but AF doesn't currently support the variables needed to filter that kind of edit
[18:07:33] hauskatze: *nod* I was surprised when I looked for autoconfirmed in AF and didn't find it when the first PAWS abuse complaints came in. Maybe we should fix that?
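[Editor's note] The "edit rate limits" idea mentioned above maps onto MediaWiki's `$wgRateLimits` setting in `LocalSettings.php`. A minimal sketch; the numbers are illustrative only, not Wikimedia's actual production values:

```php
// Sketch of MediaWiki's $wgRateLimits: throttle edits from
// non-autoconfirmed accounts more tightly than established users.
// Each entry is [ number of actions, window in seconds ].
// These numbers are made up for illustration.
$wgRateLimits['edit'] = [
    // 'newbie' applies to accounts that are not yet autoconfirmed.
    'newbie' => [ 8, 60 ],
    // 'user' applies to all logged-in users.
    'user'   => [ 90, 60 ],
];
```

Since the limit keys on the account's group rather than the client, it would apply equally to PAWS, pywikibot, or any other API client, which fits the argument that controls belong on the inbound API calls.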
[18:07:42] how would you use PAWS in hackathons for new users without this? (re @wmtelegram_bot: There's no reason PAWS should be used by non-autoconfirmed)
[18:08:13] AbuseFilter can filter on autoconfirmed status by looking at user_groups...
[18:08:34] legoktm: I don't think it can filter OAuth CIDs
[18:08:39] right.
[18:08:48] That's what I think is missing
[18:08:55] or tags, that would work as well.
[18:09:20] I can't find any phab requests for that
[18:09:23] The solution proposed in the phab task I filed some time ago is that I block all IPs used by PAWS
[18:09:42] but that'd be kind of killing flies with a sledgehammer
[18:10:00] tags are annoying because MediaWiki adds them after the edit is saved, which is too late for AF. But we really should fix that.
[18:10:40] oof
[18:10:51] would it be doable to add them before?
[18:12:34] I think filtering on OAuth CID is going to be much faster/easier to do
[18:12:51] filtering on tags is definitely possible, but requires more refactoring
[18:13:27] also it would be a best-effort thing, since you can always add/edit/remove tags on an edit after it's saved
[18:13:31] CID works for me I think :)
[18:15:31] hauskatze: why does the CID matter?
[18:16:33] The thing y'all seem to hate about PAWS is that you can't just IP range block it and walk away.
[18:18:09] OAuth CID seems a little more annoying to maintain: AF rules would need updating everywhere if the PAWS OAuth consumer is ever moved for whatever reason.
[18:18:48] but, whatever works faster.
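[Editor's note] The `user_groups` check mentioned above would look roughly like this in AbuseFilter's rule syntax. A sketch only; the edit-summary condition is an illustrative placeholder, not a real vandalism heuristic:

```
/* AbuseFilter rule sketch: flag automated-looking edits from accounts
   that are not yet autoconfirmed (or manually confirmed). The summary
   pattern below is a made-up placeholder condition. */
!("autoconfirmed" in user_groups) &
!("confirmed" in user_groups) &
summary irlike "\bbot\b"
```

The `in` operator tests array membership for `user_groups`, which is why autoconfirmed status is already filterable even though, as noted above, the OAuth consumer ID is not.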
[18:18:55] bd808: I don't hate anything about PAWS, nor the people abusing it
[18:20:07] !log paws deploying calico v3.21.0 (T292698)
[18:20:10] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Paws/SAL
[18:20:10] T292698: Upgrade calico to 3.21 - https://phabricator.wikimedia.org/T292698
[18:20:34] but if we're going to suffer from this kind of childish (massive) vandalism I think we should have some tools to prevent it or reduce it
[18:21:16] if AF could be used for this, it'd be great; it would also give us local control for our needs, without having to patch PAWS or OAuth
[18:22:33] the problem with patching PAWS is that there is an abundance of other ways of achieving the same vandalism, and all proposed changes to PAWS come at a significant cost to its mission
[18:22:45] ^ that
[18:22:47] Yup
[18:23:15] So I think we agree the best path would be to allow AbuseFilter to see past OAuth?
[18:23:19] Assuming the AF+CID rules would be used just as an extra variable in gauging likelihood of vandalism, and not for an outright block, I think it would be helpful
[18:24:18] Yep, I don't think we want to block PAWS altogether
[18:25:27] like chicocvenancio said, I do think it's important to recognize that PAWS is used for hackathons/new users, and people don't always remember or even know to mark accounts as manually "confirmed"
[18:26:22] Agreed
[18:54:00] legoktm: https://gerrit.wikimedia.org/r/c/mediawiki/extensions/OAuth/+/748784/
[19:01:20] hmm, is User::getRequest() any less evil these days?
[19:05:55] reviewed :)
[19:06:26] good question, I'm trying to look at other examples
[19:06:43] can you look at https://gerrit.wikimedia.org/r/748788 (to fix phan) too?
[19:21:22] legoktm: responded!
[20:11:37] I read an article on the Wikimedia Tech Blog about how a batch process was improved.
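[Editor's note] Once the Gerrit change above exposes the OAuth consumer to AbuseFilter, a filter could combine it with the autoconfirmed check in the spirit of "extra signal, not outright block". A sketch only; the variable name `oauth_consumer` and the consumer ID `1234` standing in for PAWS are both assumptions:

```
/* Sketch: "oauth_consumer" and the ID 1234 are assumed names/values,
   meant as one signal among others rather than a blanket block. */
oauth_consumer == 1234 &
!("autoconfirmed" in user_groups)
```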
https://techblog.wikimedia.org/2021/10/29/how-we-improved-performance-of-a-batch-process-from-two-days-to-five-minutes/ Is there a public list somewhere of scripts running regularly on servers, like the script explained in the text? I'm currently thinking about
[20:11:38] how much scripts' runtimes could be reduced, and maybe when I find a script I can try to understand it and make suggestions on how to optimize it.
[20:25:54] Hogue: if you can read puppet, most of the mediawiki jobs are defined in https://gerrit.wikimedia.org/r/plugins/gitiles/operations/puppet/+/refs/heads/production/modules/profile/manifests/mediawiki/maintenance/
[20:35:48] https://wikitech.wikimedia.org/wiki/Cron_jobs
[20:36:26] well, that is very old but some parts are still true
[21:15:02] Thank you for the help. I read some of the jobs defined there. I found some of the scripts mentioned in the jobs in MediaWiki CodeSearch. How can I access the script directly by pasting its path into the browser? For example here:
[21:15:03] https://gerrit.wikimedia.org/r/plugins/gitiles/operations/puppet/+/refs/heads/production/modules/profile/manifests/mediawiki/maintenance/update_flaggedrev_stats.pp
[21:34:35] Hogue: https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/FlaggedRevs/+/refs/heads/master/maintenance/updateStats.php
[22:55:31] I'm remembering that someone rigged up a Cloud VPS instance to automatically pull new Docker containers for the project and deploy them. Was that you legoktm?
[22:55:42] * bd808 wants this magic for a project he is working on
[22:55:43] yep!
[22:56:13] it's using podman-auto-update
[22:56:20] https://manpages.debian.org/bullseye/podman/podman-auto-update.1.en.html
[22:56:36] oh snazzy.
I will read things
[22:58:02] and then you can run `sudo podman auto-update` manually to do the pull+restart, or there's a systemd timer that does it every 24h automatically, which could be sped up depending on what you want
[22:59:48] hmmm. yeah, I really want "continuous delivery" for the thing I'm looking into. Basically it's a pipelinelib project that builds an image after each merge, and I would like that to trigger an update of the demo server
[23:00:15] * bd808 may get cheeky and listen to events from the github mirror...
[23:00:44] if the timer was every 5m, would that be good enough?
[23:00:57] yeah, in practice that would be fine
[23:04:10] at a work project a few years ago I did “every five minutes during working hours” ^^
[23:04:52] I don’t think I have the exact OnCalendar= anymore, but something like: every five minutes 9h-19h Mon-Fri (a rough approximation of several team members’ overlapping working hours), and maybe every two hours or something outside of that
[23:08:20] heh, that's neat
[23:08:59] my assumption is that these requests are pretty cheap for our docker-registry, since most of the time it's just checking whether the digest is up to date and not actually pulling
[23:56:51] phab spammer, any admins around? https://phabricator.wikimedia.org/p/BilalShirwani/
[23:57:26] banned
[23:57:30] thanks
[23:58:12] blocked the user on wikitech too
[23:58:47] huh, why can he still do stuff, or is phab just lagging a bit?
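[Editor's note] The 24h schedule discussed above can be tightened with a systemd drop-in. A sketch, assuming the timer unit is named podman-auto-update.timer as in the Debian manpage linked earlier; create the override with `sudo systemctl edit podman-auto-update.timer`:

```ini
# Drop-in override for podman-auto-update.timer
# (e.g. /etc/systemd/system/podman-auto-update.timer.d/override.conf)
[Timer]
# An empty OnCalendar= clears the packaged daily schedule.
OnCalendar=
# Check for updated images every 5 minutes.
OnCalendar=*:0/5
# Or, roughly "every five minutes, 9h-19h, Mon-Fri" (untested sketch):
# OnCalendar=Mon..Fri *-*-* 09..18:0/5
```

Validate a calendar expression with `systemd-analyze calendar '*:0/5'` before relying on it; as noted above, the frequent checks should mostly be cheap digest comparisons rather than full pulls.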