[00:40:04] The andest and, the xorest xor
[00:40:52] use double-xor encryption for the utmost security
[00:41:29] Nah just use md5
[00:42:26] i was about to say "you can't decrypt it", but i mean... you sorta can?
[00:42:44] Oh, BlankEclair, not sure if the error I encountered with externaldata enabling on rainversewiki is in your remit to poke at, but will definitely require @MediaWiki Specialists attention
[00:42:56] ah yep
[00:43:04] I think you can
[00:43:15] I think the bigger issue with md5 is it isn't collision safe
[00:43:21] Which you would expect it to be
[00:43:39] yeah but you wouldn't be getting the same data as you put in
[00:43:48] for context to mw specialists: i requested ExternalData to be enabled on rainversewiki, but when NA tried, it isn't enabled and the logs say that NA changed settings for ""
[00:44:03] I won't be in a good spot to file a Phorge ticket on your behalf until the weekend, just give me a ping if you need me to do so
[00:44:17] https://rainverse.wiki/wiki/Special:Log/managewiki?offset=20240926223259&limit=3
[00:44:24] oh i guess i can file it myself then
[00:44:34] If you do so I'll dump in screenshots when I can
[00:44:36] No you wouldn't get the same result back
[00:44:50] But two different inputs can equal one single md5, which is the bigger issue
[00:47:25] filed: https://issue-tracker.miraheze.org/T12653
[00:50:04] cosmicalpha: can i ask a question privately just in case?
[00:50:11] (not related to the bug we were discussing above)
[00:50:35] Sure one sec
[01:02:56] The log doesn’t show the setting changed
[01:03:06] Just shows an empty string
[01:03:59] amazing
[01:04:09] i wonder if that happens elsewhere or on beta
[01:51:05] Yeah, when I attempted enabling through the front-end it gave a pretty broken error about not enabling successfully
[01:53:41] Is it helpful to repro and put a screenshot of the generic broken error or does the team have what they need?
[03:43:55] It finally happened
[03:43:58] I can't login
[03:44:04] :waaa:
[03:48:04] waaa
[03:54:33] https://tenor.com/view/dance-waluigi-mario-meme-gif-5329543
[03:56:44] seems like some session issue
[03:56:48] oh well
[05:28:34] BlankEclair: thanks for the review here https://issue-tracker.miraheze.org/T12651#253763
[05:28:44] reception123: you're welcome ^_^
[05:28:49] I wonder if there's an easy way to make it more 'accessible' to select new languages?
[05:29:01] since having to do /language isn't nice at all
[05:29:07] oh i didn't see you replied, guess i'll answer on task
[05:29:32] _is kind of excited about this extension as translation on Meta remains quite messy and unsatisfactory IMO_
[05:43:05] I have big and positive feelings about this. DeepL is imperfect, but a big step forward to our global audience
[05:44:32] oh btw i replied
[05:46:07] who or what made translations on Miraheze home page tho?
[05:47:15] people?
[05:49:02] BlankEclair: replied. I think the best thing is to try to test it out on Beta now. But I do wonder if there's some config for restricting some pages from using it?
[05:49:39] And as a side note for anyone worried here, there will certainly be an RfC on this for everyone to have their voice heard so this isn't something that'll be imposed. We're just testing it out so we can present the alternative on the RfC and everyone knows what they're voting for
[05:49:46] you can partially restrict by namespaces
[05:50:02] but no, there's currently no option to specify what pages to do
[05:50:21] eh, not ideal but that'll do I guess. Perhaps if it's our fork we could work on something like that if we see excessive usage
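To illustrate the page-level restriction floated above: a purely hypothetical sketch of how a fork could layer a per-page exclusion on top of the existing namespace restriction. Neither `$wgSubTranslateNamespaces` nor `$wgSubTranslateExcludedPages` is an existing SubTranslate setting, and `shouldAutoTranslate()` is a made-up helper name used only for illustration.

```php
<?php
// Hypothetical sketch only: these settings and this helper do not exist in
// SubTranslate; they show one way a fork could add page-level restrictions
// on top of the existing namespace restriction.
$wgSubTranslateNamespaces = [ NS_MAIN, NS_PROJECT ];  // hypothetical setting
$wgSubTranslateExcludedPages = [ 'Main Page' ];       // hypothetical setting

function shouldAutoTranslate( Title $title ): bool {
	global $wgSubTranslateNamespaces, $wgSubTranslateExcludedPages;

	// Only translate pages in the allowed namespaces.
	if ( !in_array( $title->getNamespace(), $wgSubTranslateNamespaces, true ) ) {
		return false;
	}

	// Skip pages explicitly excluded by name.
	return !in_array( $title->getPrefixedText(), $wgSubTranslateExcludedPages, true );
}
```

A check like this would be consulted wherever the extension decides whether to fetch a translation for the requested page, so excluded pages simply fall back to the untranslated text.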
[05:50:35] but my understanding is right that it doesn't automatically translate everything right? it only does when someone accesses that language
[05:50:37] i mean we'll have to fork it anyway for proper proxy support
[05:50:52] unless the mw* servers can talk outbound directly via ipv6
[05:50:54] and for mediawiki-repos as we can't use downloaded zips there
[05:51:15] I don't think so, a proxy has always been needed since bast
[05:51:39] > [27/09/2024 15:50] but my understanding is right that it doesn't automatically translate everything right? it only does when someone accesses that language
[05:51:49] yeah, there's also a cache but idt the cache expires when the page is edited
[05:52:44] i somehow glossed over this lol
[05:52:46] strcmp( $basepage, $subpage ) === 0
[05:52:50] C, is that you?
[05:54:18] > idt the cache expires when the page is edited
[05:54:25] oh wait it does, bad documentation >:(
[05:55:01] if it's people, then they're probably using google translate without fixes, I'm afraid
[05:55:27] oh :(
[05:55:53] i mean, to my knowledge miraheze doesn't currently knowingly employ auto translations
[05:57:19] No but many translators have unfortunately used them
[05:57:29] which is one reason why I (and many others) support a transition to deepl
[05:58:37] oh hmm, we might struggle with using templates for auto-linking since the english link will always be bold
[05:59:09] probably could use either external links, or some tag surgery
[06:01:10] but only 'staff' can edit home page? are we allowed to send corrections?
[06:01:20] actually, it might depend, easier to test at runtime than to read the docs tbh
[06:01:45] i'm sick of reading docs for like four days straight lol
[06:08:36] I think anyone can translate any page, it doesn't depend on protection
[06:09:16] I'm talking about non-wiki home page, w/ pink/orange bg
[06:10:02] oh, the landing page on miraheze.org?
[06:11:53] omg is it really translatewiki?
[06:13:28] https://translatewiki.net/w/i.php?title=Special:Translate&group=mwgithub-mirahezelanding
[06:13:29] yes it is
[06:13:42] also, it's apparently in php, i always thought it was a static site
[06:17:08] :ThinkingHardMH:
[06:29:08] BlankEclair: you can do your proxy patch now if you want https://github.com/miraheze/SubTranslate 🙂
[06:33:23] uhh what should i do about the versioning?
[06:40:56] ^ @cosmicalpha
[06:41:10] I'm not sure if we have any rules/conventions about versioning for the extensions we fork/host
[06:41:28] just to be sure I'll let CA answer but I don't think it would be a big deal to just leave the version as it is for now
[06:41:42] That is fine yeah
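On the proxy patch mentioned above: since the mw* appservers can only reach external services through a proxy, one plausible shape for the patch is to route the DeepL call through MediaWiki's HttpRequestFactory, which (unless a proxy option is passed explicitly) falls back to the site-wide $wgHTTPProxy. A minimal sketch under those assumptions; the endpoint URL, parameter names, and the function name are illustrative rather than SubTranslate's actual code.

```php
<?php
// Sketch only: route the DeepL request through MediaWiki's HTTP layer so the
// configured $wgHTTPProxy is used; this is not the actual SubTranslate patch.
use MediaWiki\MediaWikiServices;

function fetchDeepLTranslation( string $text, string $targetLang, string $apiKey ): ?string {
	$req = MediaWikiServices::getInstance()->getHttpRequestFactory()->create(
		// Endpoint and parameters are illustrative.
		'https://api-free.deepl.com/v2/translate',
		[
			'method' => 'POST',
			'postData' => [
				'auth_key' => $apiKey,
				'text' => $text,
				'target_lang' => $targetLang,
			],
			'timeout' => 10,
			// No explicit 'proxy' option: MWHttpRequest falls back to
			// $wgHTTPProxy when one is configured site-wide.
		],
		__METHOD__
	);

	$status = $req->execute();
	if ( !$status->isOK() ) {
		return null;
	}

	$data = json_decode( $req->getContent(), true );
	return $data['translations'][0]['text'] ?? null;
}
```

The design point is simply to avoid raw cURL in the extension so that outbound traffic automatically follows whatever proxy the wiki farm configures.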
[07:10:52] uh, what?
[07:10:53] [1/3] ```rsync: [sender] send_files failed to open "/srv/mediawiki-staging/1.42/maintenance/motorsportanthologywiki.php": Permission denied (13)
[07:10:54] [2/3] rsync: [sender] send_files failed to open "/srv/mediawiki-staging/1.42/maintenance/pilgripediawiki.php": Permission denied (13)
[07:10:54] [3/3] ```
[07:11:01] (when trying to run mwdeploy --world)
[07:20:01] hmmm
[07:20:12] did I break something else with my patches?
[07:20:31] maybe, I thought it was weird to see wiki names in maintenance scripts
[07:21:20] ah it's because they're set to root
[07:21:46] oh wait, don't use --world to deploy an extension though, that takes longer anyway and could more likely cause something else to take us down; do `mwdeploy --upgrade-extensions= --force-upgrade --versions=all --servers=all`
[07:21:51] mainten- what
[07:22:49] ohh, that's what I've been doing wrong. For a long time now I never understood why --upgrade-extensions didn't work for new extensions but it's because I didn't know --force-upgrade was a thing heh
[07:23:03] I already started it now but will remember next time
[07:23:07] and yeah, chown to www-data fixed it
[07:23:32] ah lol yeah I added --force-upgrade to skip the check so it can 'update' even if it is already updated lol
[07:23:47] we should probably add it to https://meta.miraheze.org/wiki/Tech:MediaWiki_appserver#mwscript
[07:23:59] Yes we should.
[07:25:12] before --force-upgrade it was mighty inconvenient to deploy world every time I needed to test a patch lol, like my current one with 150+ commits and a lot of testing would be super inconvenient without it lol
[07:28:47] it actually seems to have gone pretty fast
[07:28:49] it's already done
[07:29:18] interestingly enough though, SubTranslate is nowhere to be found in /extensions
[07:29:46] I tried your suggestion and that doesn't fix it either
[07:29:47] mwdeploy: error: invalid extension choice(s): SubTranslate
[07:30:10] Did puppet run first?
[07:30:25] Maybe it never got into staging lol
[07:30:32] oh heh, that's another thing I didn't know was necessary
[07:30:46] I thought mwdeploy took care of that too like it does for config when you do --pull config
[07:31:00] funny enough I don't think I've run puppet on a mw server in ages
[07:31:45] Nope, because it first has to pull to mediawiki-repos on puppet181, then puppet has to install it into staging. We definitely need better docs for this.
[07:31:49] but nope, even doing that doesn't work
[07:32:01] and it's also not in staging
[07:32:05] run puppet on puppet181?
[07:32:27] oh, you have to run it there? So I guess mw-admins can't easily install extensions then unless they wait for puppet to run?
[07:33:20] Well it was that or mediawiki-repos being actually in puppet. Maybe I will rework it so it can all be on MediaWiki.
[07:33:30] You can deploy individual files and folders
[07:33:42] --world should only be used when you really want to deploy everything
[07:33:56] Or if you're deleting an extension
[07:34:29] Didn't I add a delete command so you don't even have to do that to delete? Or maybe just in my head I wanted to...
[07:34:47] well either way it's not urgent in this case as the extension wouldn't work until BlankEclair does the proxy patch anyway
[07:34:56] I guess that was in my head
[07:34:59] lol
[07:35:01] but I just wanted to figure out how to quickly do it for the next extension since until now I've just waited a while after merging
[07:35:22] We definitely need better docs yeah
[07:35:29] I think it's more an rsync thing tbh
[07:35:37] but no, even after running on puppet it still doesn't work
[07:35:47] I want to try and make world faster at some point
[07:35:55] it's not that slow anymore I feel
[07:36:12] oh mediawiki-repos is wrong
[07:36:14] It's still pretty slow
[07:36:30] Cause like it does everything in series
[07:36:35] And some of it doesn't need to be
[07:36:35] @reception123 you used _branch_ which we don't have
[07:36:55] ohh, it should've been master yeah
[07:37:10] my bad then, it's easy to copy stuff over and forget they're not actually on REL
[07:43:33] hmm, even now it says it's invalid
[07:45:10] That’s strange that the wiki name is in the maintenance script for
[07:49:00] I think that was CA's patches
[07:50:46] https://github.com/miraheze/SubTranslate/pull/1
[07:51:30] thanks!
[07:51:47] if only mwdeploy/puppet would play nice...
[07:52:03] untested, but we have production mirabeta for that
[07:52:24] Oh, @cosmicalpha ^
[07:52:27] yeah, I'm planning on having some test pages there so that the community can see how it works when the RfC vote is up
[07:52:46] I did fix the issue though, it was just that they were root and had to be chown'd to www-data
[08:07:33] BlankEclair: I don't want to get ahead of ourselves here but do you think this error would occur even if we had the deepl API set up (https://meta.mirabeta.org/wiki/Community_noticeboard/es)?
[08:07:53] it doesn't seem related to that to me, it seems like an issue that would happen even if we had all the settings done
[08:07:54] probably
[08:08:09] could it be a compatibility issue?
[08:08:16] yeah the code is rather old
[08:08:53] well the extension was created in september 2023
[08:09:08] https://doc.wikimedia.org/mediawiki-core/REL1_40/php/classWikiPage.html#a4135fb90ccd4bf0a927a341a3c2f2654
[08:09:45] i mean, it aims for >= 1.35
[08:10:52] ah, so we just need newFromTitle
[08:10:57] I hope there aren't any other hidden errors
[08:11:28] this is also funky lol: https://test.mirabeta.org/wiki/Special:Log?logid=425
[08:11:31] anyway, imma go eat now
[08:24:23] oh yep, that seems to have fixed it
[08:24:40] I guess "There is currently no text in this page" is just because the API isn't functional yet
[09:00:38] https://issue-tracker.miraheze.org/T12653#253766
[09:01:06] so uh... are we going to be downgrading ExternalData? or are we going to update ManageWikiExtensions.php
[10:06:26] We can update ManageWikiExtensions I think.
[10:06:54] This log is probably my fault with my CreateWiki patch on beta.
[10:08:00] Nevermind it is 2 weeks old lol
[10:08:29] It looked similar to one of my issues I ran into (and thought I fixed) so assumed it was but guess not lol
[10:45:18] mysterious .php files are scary
[10:45:52] reminds me of when we had that incident at WT where pages would randomly be injected with raw Google Tag Manager code
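For reference on the compatibility fix discussed above ("ah, so we just need newFromTitle"): WikiPage::factory() was deprecated in MediaWiki 1.36 in favour of the WikiPageFactory service, and the usual migration looks like the sketch below. This assumes that deprecation is what the SubTranslate patch addressed; it is the general pattern, not the patch itself, and the page title used here is just an example.

```php
<?php
// General WikiPage::factory() migration pattern; not the actual SubTranslate patch.
use MediaWiki\MediaWikiServices;

// Example title, purely for illustration.
$title = Title::newFromText( 'Main Page' );

// Old style (deprecated since MediaWiki 1.36):
// $page = WikiPage::factory( $title );

// Current style, via the WikiPageFactory service:
$page = MediaWikiServices::getInstance()
	->getWikiPageFactory()
	->newFromTitle( $title );

// The page content can then be read as before:
$content = $page->getContent();
```

Since the extension targets MediaWiki >= 1.35, the service-based call keeps it working on current branches without relying on the removed static factory.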
[10:47:04] > [27/09/2024 20:06] This log is probably my fault with my CreateWiki patch on beta.
[10:47:18] nah, just me messing around and trying to set permissions i can't
[10:47:33] I see you're all having fun
[10:47:56] mhm, i have to write documentation now
[10:48:28] @cosmicalpha about https://github.com/miraheze/MirahezeMagic/pull/500#issuecomment-2370267682, I can't push to that branch anymore for obvious reasons, so I can't fix it
[10:50:54] wish we had a herald rule to auto-remove PR Ready when a task is resolved
[13:04:30] https://issue-tracker.miraheze.org/T12654; do i security review myself?
[13:08:43] If it's your own extension probably not a good idea
[13:08:55] hm okay
[13:09:28] I'd say it's kind of like checking grammar mistakes, sometimes it's hard to see your own mistakes and re-read your own stuff
[13:09:47] So probably better to have a second reviewer just to be extra sure
[13:10:17] yeah makes sense, just making sure
[14:33:21] BlankEclair: I've given it more thought and realized that actually we've pretty much allowed extensions created by reviewers to be installed so it's probably fine, but it would still be ideal to have someone else at least give it a quick look
[14:34:11] i feel like the creator doesn't matter for the choice on whether or not it should be installed
[14:34:29] it should be the security review, and that hinges on a reviewer who can see the code at their best state
[14:37:54] Well yeah, the reason why I didn't think of it at first is because it's never been seen as a review, I guess we've mostly thought of it as "since the creator is trusted to review, their extensions are probably safe"
[14:38:04] But indeed, it probably shouldn't
[18:52:27] [1/2] It's been over a month since my PR has been sitting here... I guess my changes for UPV2 collapsed with my branch lol
[18:52:27] [2/2] https://github.com/miraheze/mw-config/pull/5645
[19:47:20] sorry about that, I'll try to test UPV2 soon
[19:48:38] [1/2] np, but you can still merge this since I removed all UPV2 mentions in there
[19:48:38] [2/2] the UPV2 changes are from another PR
[21:21:31] @cosmicalpha , looks like recent createwiki changes may have reverted the 250 character limit hack yet again.
[21:22:06] I don't think so but I'll look. I think the core hack was accidentally reverted
[22:26:25] Security update on Monday
[22:26:51] [1/18] Hi all,
[22:26:52] [2/18] On Monday we will be issuing a security and maintenance release to all
[22:26:52] [3/18] supported branches of MediaWiki.
[22:26:52] [4/18] The new releases will be:
[22:26:52] [5/18] - 1.39.9
[22:26:53] [6/18] - 1.41.3
[22:26:53] [7/18] - 1.42.2
[22:26:53] [8/18] This will resolve one security issue in a bundled extension, along with bug
[22:26:54] [9/18] fixes included for maintenance reasons.
[22:26:54] [10/18] This security issue affects many unsupported versions of MediaWiki.
[22:26:54] [11/18] This release may or may not be made with a CVE number formally attached,
[22:26:55] [12/18] due to the recent delays in receiving them from MITRE.
[22:26:55] [13/18] We will make the fixes available in the respective release branches and
[22:26:56] [14/18] master in git. Tarballs will be available for the above mentioned point
[22:26:56] [15/18] releases as well.
[22:26:57] [16/18] A summary of some of the security fixes that have gone into non-bundled
[22:26:57] [17/18] MediaWiki extensions will also follow later.
[22:26:58] [18/18] […]
[22:31:38] Wtf is MITRE
[22:31:47] No idea
[22:31:51] Dumb ass acronym
[22:31:56] I’m assuming the blokes who assign CVEs
[22:32:09] [[w:MITRE]]
[22:32:10]
[22:32:28] Interesting that mediawiki 1.39 will be on .9
[22:32:39] Just in time for 1.43 what good timing
[22:55:01] MITRE is a government funded research lab
[22:55:27] Ah
[22:55:30] Technically private but gets all its funding from the government so the lines are blurred
[22:55:35] I was thinking they mean the cut you make in wood lol
[22:55:47] Which government
[22:55:50] Sounds very dubious
[22:55:56] United States federal government
[22:56:00] IRS must love them
[22:56:01] Of course