[03:13:42] Could I get someone to make a quick sanity check of ? I want to give it a try on test151 tomorrow, though I don't know if I'll have the time with all the other things I have going on at the moment.
[05:33:52] Regarding the TODOs, all exit codes should cause a failure unless you're otherwise handling them; passing it to exitcodes is the best way to do that, as the script will just handle it.
[05:34:07] And yes, world should probably reapply all
[05:34:37] I will do a proper CR later
[05:37:04] I would also rather someone worked on getting the version from pypi deployed
[05:37:30] Than keep changing puppet
[15:25:00] @rhinosf1 you on?
[15:27:24] What did I break?
[15:28:33] Wanted to talk to you about a potential issue I noticed with the VIGIL architecture (working on getting Docker up to start development on the handling part, since I don’t have a test instance of MediaWiki unless someone wants to volunteer theirs and add the exporter to its settings)
[15:28:39] Nothing this time actually
[15:28:50] *that I know of
[15:28:50] Good, go ahead then
[15:29:21] Let me pull up the docs to help explain my reference
[15:29:51] https://m.mediawiki.org/wiki/Manual:$wgRCFeeds
[15:30:05] The exporter takes a URI, a formatter to shape the data, and some parameters
[15:30:38] Assuming we use HTTPS to send something like a POST to the server running vig, there’s not really anything we can do for authentication
[15:30:50] Which means people can send fake edits to the feed and clog it up
[15:31:04] There are a few ways I thought of to fix this, wanna hear what you think
[15:31:37] First is simply configuring V to only accept connections from the IPs of the mw servers
[15:31:54] Or if it’s hosted on Miraheze’s network, firewall external traffic
[15:32:29] Second, we could keep the URI (or a key sent in the request from the formatter) in our private secrets repo
[15:33:11] More esoterically, computing a sort of hash/signature of the sent request using a secret in the private repo to validate the request origin
[15:33:13] Thoughts?
[15:35:12] You didn’t break anything, right?
[15:49:55] Docker is stubborn :(
[16:00:29] Page Forms upgraded to 5.8 now
[16:00:41] just not deployed yet
[16:05:53] You're smart
[16:06:05] Yes we can limit to mw or bast
[16:06:25] And having a secret auth key or signature is good
[17:25:21] [1/6] I see some issues with Commons
[17:25:21] [2/6] It seems that some wikis are still listed, but I'm not able to reach them as they lack the subdomain in the URL:
[17:25:21] [3/6] See image
[17:25:22] [4/6] So I cannot see whether images are in use, whether a wiki is closed, or even on what wiki they are used.
[17:25:22] [5/6] Can you tell me what is going on?
[17:25:22] [6/6] https://cdn.discordapp.com/attachments/1006789349498699827/1268620599522103308/image.png?ex=66ad1680&is=66abc500&hm=487f13f0a8b2995f054184ee6cbb0a39a81832d2a72e3e7d320b82467d967948&
[17:26:20] I think that's dodgy data
[17:32:00] I've seen something like that happen with deleted wikis on CentralAuth
[17:33:00] Yeah, you mean the global accounts attached to wikis, right? You click on those wikis and they are deleted.
[17:33:22] I saw that too, had not connected that together :ThinkerMH:
[23:12:19] why is everything so slow jesus
[23:12:33] it's so slow my 2FA code keeps expiring before the login form has submitted
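
A minimal sketch of the signature idea raised at 15:32–15:33 and endorsed at 16:06: the RC feed exporter would attach an HMAC of the POST body computed with a secret kept in the private repo, and VIGIL would recompute and compare it before accepting the event. The header name, secret value, and function names below are illustrative assumptions, not the actual VIGIL or $wgRCFeeds API.

```python
# Sketch (assumptions): shared secret from the private secrets repo, signature
# carried in a hypothetical "X-RC-Signature" header, HMAC-SHA256 over the raw
# request body. Not VIGIL's real interface.
import hmac
import hashlib

SHARED_SECRET = b"example-secret-from-private-repo"  # assumed to come from the secrets repo
SIGNATURE_HEADER = "X-RC-Signature"  # hypothetical header name


def sign_body(body: bytes, secret: bytes = SHARED_SECRET) -> str:
    """Hex HMAC-SHA256 the sender would attach alongside the RC feed payload."""
    return hmac.new(secret, body, hashlib.sha256).hexdigest()


def verify_request(headers: dict[str, str], body: bytes,
                   secret: bytes = SHARED_SECRET) -> bool:
    """Accept the request only if the claimed signature matches the recomputed one.

    compare_digest does a constant-time comparison, so an attacker can't use
    response timing to guess the signature byte by byte.
    """
    claimed = headers.get(SIGNATURE_HEADER, "")
    expected = sign_body(body, secret)
    return hmac.compare_digest(claimed, expected)


if __name__ == "__main__":
    payload = b'{"type": "edit", "title": "Main Page", "wiki": "examplewiki"}'
    good_headers = {SIGNATURE_HEADER: sign_body(payload)}
    forged_headers = {SIGNATURE_HEADER: "0" * 64}
    print(verify_request(good_headers, payload))    # True: genuine mw-origin edit
    print(verify_request(forged_headers, payload))  # False: fake edit rejected
```

Compared with keeping only a secret URI or key in the request, signing the body also prevents replaying or tampering with intercepted payloads, and it composes cleanly with the IP/firewall restriction to mw or bast hosts mentioned at 16:06.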