[05:18:23] https://issue-tracker.miraheze.org/T12587; this bug is truly elusive now wtf
[07:47:32] when does miraheze make security tasks public btw?
[08:04:02] When they are fixed
[08:04:21] Assuming nothing we can't release is there
[08:06:34] BlankEclair: but which task do you want to see
[08:07:18] RhinosF1: https://issue-tracker.miraheze.org/T12371
[08:08:55] BlankEclair: do you see it linked anywhere
[08:09:08] RhinosF1: wdym?
[08:09:39] BlankEclair: how did you discover its existence? I also can't see that task
[08:09:47] Is it linked on another task
[08:09:53] oh, i made that task and it's fixed
[08:10:04] Oh
[08:10:07] but like... i want it public? wikimedia task is blocked on miraheze making it public
[08:10:20] BlankEclair: where have wikimedia said that
[08:10:25] They aren't normally that nice
[08:10:38] https://phabricator.wikimedia.org/T370632#10003640
[08:12:24] BlankEclair: I assume we've updated
[08:12:39] yep
[08:12:55] The wikimedia ticket is waiting on my reply
[08:13:20] wait, what reply?
[08:13:40] BlankEclair: well I'm the only Miraheze tech member that can answer Scott's question
[08:13:50] BlankEclair: can you add me to the Miraheze ticket
[08:13:50] ah
[08:14:05] done ^^;
[08:16:48] BlankEclair: commented on both
[08:17:05] I love how well we do things some days
[08:17:27] tempted to try to make the wikimedia task public myself
[08:20:17] anything w/ localisation issues?
[08:20:57] i suspect poisoned parser cache
[08:21:39] BlankEclair: don't think you can
[08:21:46] I concur with BlankEclair
[08:21:49] RhinosF1: do you want me to try?
[08:21:59] BlankEclair: nah wait
[08:21:59] re localisation: as in, i believe that the root cause is fixed, but we're still getting bad cached pages
[08:22:05] oh okay
[08:22:34] BlankEclair: you are probably 100% right
[08:22:53] Especially if you're seeing the bug on old servers
[08:23:03] it was uniform on all the servers tbh
[08:23:07] (in my experience at least)
[08:23:25] phorge fun fact: you can set the view and edit policy, even if the fields are hidden
[08:23:25] Do you need me to do something? I can be in for another hour and a half or so, then will go to dinner
[08:23:51] cosmicalpha: why is https://issue-tracker.miraheze.org/T12329 perma private
[08:24:05] CosmicAlpha: not sure we want to purge PC tbh on all wikis?
[08:25:14] We do once a week anyway
[08:25:36] https://github.com/miraheze/puppet/blob/f8338b4206191a8e2ea0a448f6028495f0e95e09/modules/mediawiki/manifests/jobqueue/runner.pp#L121
[08:26:09] why loginwiki only? or am i misinterpreting that parameter
[08:26:13] CosmicAlpha: no we don't
[08:26:26] That doesn't purge all parser cache
[08:26:26] It is one parser cache.
[08:26:33] It should
[08:26:37] p sure the cache is set to expire in 7d in mw-config
[08:26:41] but like... that's seven days
[08:26:46] That's also daily not weekly
[08:26:56] CosmicAlpha: no it purges expired parser cache
[08:27:01] That's what --age is for
[08:27:25] It runs daily but purges every 604800 seconds of parser cache (which is 7 days to expiry, thus purged every 7 days)
[08:27:39] yes
[08:27:54] But we don't purge the entire farm's PC at once
[08:28:06] Also that just drops it from the database, it doesn't actually expire it
[08:28:11] Right
[08:28:13] It will have already expired at that point
[08:28:20] Oh right
[08:28:28] So what do you want me to do?
[08:28:44] nothing
[08:28:48] BlankEclair: I'll answer on the task in a bit, I may have put that tag on the wrong task actually.
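Editor's note: the --age semantics discussed above can be sketched as follows. This is an illustration in Python, not Miraheze's actual code (which invokes MediaWiki's purgeParserCache.php maintenance script from puppet): the job runs daily, but only drops entries whose timestamp is older than --age seconds; it never clears the whole cache at once.

```python
import time

# Illustrative age-based parser cache purge. 604800 mirrors the --age
# value (7 days in seconds) from the puppet cron linked above.
AGE = 604800

def purge_expired(cache: dict, now: float, age: int = AGE) -> dict:
    """Return the cache with entries older than `age` seconds removed."""
    cutoff = now - age
    return {key: (ts, html) for key, (ts, html) in cache.items() if ts >= cutoff}

# Usage: one entry 8 days old, one 1 day old; only the fresh one survives.
now = time.time()
cache = {
    "page:A": (now - 8 * 86400, "<p>stale</p>"),
    "page:B": (now - 1 * 86400, "<p>fresh</p>"),
}
cache = purge_expired(cache, now)
```

Because entries already carry a 7-day expiry in mw-config, the daily purge is garbage collection of rows that have stopped being served, which is why dropping them "doesn't actually expire it".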
[08:29:00] Cause I'm not convinced I want to drop all of our PC
[08:29:05] It would probably kill us
[08:29:10] i don't know whether or not to respond with "lmao" or "bruh", so i'll just say "lmao bruh"
[08:29:30] CosmicAlpha: it probably should go on the two other private cf migration tasks
[08:29:30] Oh so I am lost lol, weren't you just asking me to do something or telling me what we shouldn't do? So confused.
[08:29:36] Which are waiting on your reply
[08:29:57] CosmicAlpha: I said not sure we want to do anything to solve that bug
[08:30:05] It kind of was a question
[08:30:23] Cause the only way we fix it now, cause I believe resetWikiCaches has run twice and finished
[08:30:25] Oh lol I would say let's see what it does over the next few days.
[08:30:32] Is to purge PC
[08:30:37] Which I think is a bad idea
[08:30:52] You keep saying PC and I keep thinking purge my computer lol
[08:30:58] We could maybe solve the underlying issues
[08:31:07] sudo rm -r /var/cache/*
[08:31:08] Which is one we lacked documentation (surprise)
[08:31:20] underlying issues?
[08:31:31] BlankEclair: it's not in /var/cache, PC is an SQL database
[08:31:41] Please let me know if you find something that needs documentation. That is one of my priorities right now: getting things documented
[08:31:41] in response to this:
[08:31:43] > [18/09/2024 18:30] You keep saying PC and I keep thinking purge my computer lol
[08:31:43] BlankEclair: yes, the why the fuck we missed it and left it broken for so long
[08:31:49] CosmicAlpha: this
[08:32:16] CosmicAlpha: https://issue-tracker.miraheze.org/T12587 should be added to the create new mw* doc
[08:32:31] ./srv/mediawiki/cache has a crap name
[08:32:35] None of it is cache
[08:32:43] You either need to sync it from an existing server
[08:32:49] is it the same as the ~/.newsboat/cache.db misnomer?
[08:32:57] Or run resetWikiCaches.php twice on all wikis
[08:33:04] (in newsboat, cache.db is actually where all your data is stored)
[08:33:04] Okay I put it in my notes, I'll get to it soon. I updated a lot of docs today and yesterday already.
[08:33:08] Or you will create wonderfuck
[08:33:37] CosmicAlpha: also why do we have no smoke tests for new appservers
[08:33:54] pretty sure [[tech:server usage]] is missing mattermost
[08:34:09] We could have pretty trivially detected that bug with a few tests that check things work on appservers
[08:34:24] It sure is lol, Tech:Server usage is very hard to maintain sometimes
[08:34:49] Why da fuck is a human maintaining that
[08:34:51] should we try to use DPL3 or something?
[08:34:56] RhinosF1: exactly!
[08:34:58] I've proposed automating that countless times
[08:35:14] None of the data is even hard to get
[08:35:25] Special:PrefixIndex/Tech:mw
[08:35:40] Because it's us and we do things backwards often. I just realized today that we somehow decided to upgrade MediaWiki with 90% of the extensions not marked as tested on the extension test task.
[08:36:07] That was one sad checklist
[08:36:36] CosmicAlpha: yes tech decided (I assumed you) to no longer test every single extension cause too difficult for humans to do and just test the big stuff.
[08:36:42] You really should be making that call
[08:36:47] That is a huge risk to accept
[08:37:02] Okay maybe not every one, but I didn't mean none, like we could have done better than that.
[08:37:02] And yes we do things very backwards
[08:37:16] I'm looking forward to doing my cyber review
[08:37:24] Cause it's mostly about process and we suck at it
[08:38:10] * RhinosF1 hides the fact that his work has an entire team called developer services whose job is to make all this shit for developers and techies easier
[08:38:16] For web platforms
[08:42:45] I will accept the risk that not every extension can be tested to perfection. But there should be at least some active effort or attempt. Maybe upgrade experimental wikis first and open that up a bit more, do some live opt-in testing on key wikis before deciding a full upgrade. We should phase it in if we can't test absolutely everything. But just updating all of production with hardly any testing is something I cannot accept the risk for.
[08:43:09] I agree
[08:43:24] There's a very obvious process and human issue here
[08:43:29] @cosmicalpha
[08:44:44] Indeed there is. Before the 1.43 upgrade there will be a new process. At least I hope I finish it by then... should be able to...
[08:45:21] I propose I bang my head against a wall
[08:45:30] I knew we were backwards but ...
[08:45:42] The WMF don't do everything perfectly but I do like their method of upgrading in "groups".
[08:45:59] Their upgrades are much more incremental
[08:46:09] Upgrading in groups has caused issues before
[08:46:16] Hence why we split beta off
[08:46:37] Among a million other reasons
[08:46:47] I could upgrade beta to 1.43 now actually and then testing can also start there.
[08:47:05] Yes branch is cut
[08:47:19] Please do
[08:47:23] And create the upgrade task
[08:48:07] 1.43 is LTS also. Maybe after we can reevaluate whether we even keep upgrading to stable or keep LTS (although keeping LTS has its own problems also, i.e. much harder to upgrade when it is time)
[08:49:11] Branch isn't cut but I can upgrade it to master.
[08:52:50] lurking on phone
[08:54:42] Maybe it's out soon then
[08:54:48] I at least saw the date of the cut
[08:55:05] My sense of time is warped being off work
[08:57:51] Beta is officially on 1.43
[09:21:35] CosmicAlpha: i replied on irc btw
[09:21:52] what client btw?
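Editor's note: the smoke-test idea raised above ("a few tests that check things work on appservers") could be as small as a script that fetches a couple of known pages from a newly provisioned server and checks they render. A minimal sketch in Python; the URLs and expected strings are hypothetical placeholders, not an actual Miraheze check list.

```python
import urllib.request

# Hypothetical smoke checks for a new appserver: (url, substring that
# must appear in a 200 response).
CHECKS = [
    ("https://meta.miraheze.org/wiki/Miraheze", "Miraheze"),
    ("https://meta.miraheze.org/w/api.php?action=query&meta=siteinfo&format=json", "sitename"),
]

def smoke_test(checks, fetch=None):
    """Return (url, ok) pairs; ok means HTTP 200 and the expected
    substring appeared in the body. `fetch` is injectable for testing."""
    if fetch is None:
        def fetch(url):
            with urllib.request.urlopen(url, timeout=10) as resp:
                return resp.status, resp.read().decode("utf-8", "replace")
    results = []
    for url, expected in checks:
        try:
            status, body = fetch(url)
            results.append((url, status == 200 and expected in body))
        except Exception:
            results.append((url, False))
    return results
```

In practice such a script would be pointed at the new mw* host directly (e.g. via a Host header) before it is pooled, which is how the localisation bug above could have been caught automatically.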
[09:28:11] IRCCloud
[10:00:04] CosmicAlpha: can we have a debate about naming things
[10:00:11] my favorite topic
[10:00:13] https://issue-tracker.miraheze.org/T12601#252566
[10:01:25] I don't like wmg, which we use a lot, due to its meaning as "Wikimedia global"; I don't like the wg that MediaWiki uses either, as it means "Wikipedia global" lol
[10:02:17] In other words, yes, let's have a debate lol
[10:02:24] lmao i see wikis have codemirror defaults in wgDefaultUserOptions
[10:02:58] BlankEclair: https://bash.toolforge.org/search?p=0&q=name
[10:03:06] what about mhg for new variables, and keep the old ones for back compat
[10:03:20] I like mhg
[10:05:27] CosmicAlpha:
[10:05:44] I have always liked that as well
[10:06:06] It's just not super easy to update our current ones, which is why I never changed it lol
[10:06:42] * RhinosF1 thinks BlankEclair can take the rename-all-the-things task
[10:06:49] It should be fine
[10:06:51] We've got beta
[10:06:54] To blow up first
[10:07:04] what about former managewiki exports?
[10:07:10] actually, how do we even import managewiki backups?
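Editor's note: the wmg-to-mhg rename with back compat proposed above amounts to aliasing: new code reads mhg-prefixed names, old code keeps working through the wmg prefix during the migration. In MediaWiki these are PHP globals; the Python dict below is purely a stand-in to illustrate the aliasing idea, and all names in it are hypothetical.

```python
class SettingsWithAliases(dict):
    """Settings map where reads of an old wmg-prefixed name fall
    through to the new mhg-prefixed name (back compat during rename)."""

    def __missing__(self, key):
        if key.startswith("wmg"):
            # Old name requested: look up the renamed mhg variant.
            return self["mhg" + key[len("wmg"):]]
        raise KeyError(key)

settings = SettingsWithAliases({"mhgSiteName": "Miraheze"})
# Old code keeps working while the rename is rolled out:
settings["wmgSiteName"]
```

The same effect in mw-config would be a compatibility shim assigning each old global from its renamed counterpart until the last consumer is migrated.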
[10:07:17] I'm not sure we do
[10:07:22] You can export
[10:07:24] lmao
[10:07:27] But I don't think you can import
[10:07:36] Or at least not last time I looked at the code
[10:07:39] * RhinosF1 looks at CosmicAlpha
[10:07:56] don't forget about setting the variable for older wikis
[10:08:13] I created a script that can import them
[10:08:37] BlankEclair: you can update it with difficulty
[10:08:46] how much difficulty
[10:08:56] ManageWiki's schema is evil
[10:09:05] https://github.com/miraheze/MirahezeMagic/blob/master/maintenance/restoreManageWikiBackup.php
[10:10:03] i feel like i should make a managewiki setup that doesn't return 10 warnings every time you go to Special:ManageWiki/namespaces
[10:10:05] but i'm lazy =w=
[10:10:19] Oh good, we did fix that
[10:10:38] i don't know whether or not it's because of my jank setup
[10:10:47] BlankEclair: we should make ManageWiki not evil to install
[10:11:19] fun fact: my test mediawiki servers would freeze in place if you make too many requests at once
[10:11:49] BlankEclair: fun fact: ours have randomly frozen before tbh
[10:12:03] i assume our causes are different though
[10:12:11] It didn't used to be that uncommon for php-fpm to lock up
[10:12:11] i'm running the mediawiki docker image under an arm64 system
[10:12:14] the php binaries are x86
[10:12:31] And to have to go and reboot it hard from Proxmox
[10:51:12] BlankEclair: if we set msleep high enough it might be possible
[10:51:57] And temp set parsercache to like a day
[10:52:11] No that wouldn't work
[10:52:15] Bugger
[10:52:18] Ye not sure
[11:06:27] msleep?
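Editor's note: the unanswered "msleep?" above refers to the --msleep option of MediaWiki's purgeParserCache.php, which sleeps that many milliseconds between deletion batches so a purge does not hammer the database. A rough illustration of the throttling idea in Python; the batch size and callback are hypothetical, not the actual script's internals.

```python
import time

def throttled_purge(keys, delete_batch, batch_size=100, msleep=500):
    """Delete keys in batches, sleeping `msleep` milliseconds between
    batches to throttle load on the backing store. Returns batch count."""
    batches = 0
    for i in range(0, len(keys), batch_size):
        delete_batch(keys[i:i + batch_size])
        batches += 1
        if i + batch_size < len(keys):  # no sleep needed after the last batch
            time.sleep(msleep / 1000.0)
    return batches
```

Raising msleep stretches the purge over a longer window, which is why setting it high enough was floated above as a way to soften a farm-wide parser cache purge.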
[12:48:17] We already had a massive discussion about this
[12:50:10] is testing open and how long is it open
[12:50:12] There is no scope to test every single extension, not when Miraheze insists on updating to the newest MediaWiki version as soon as it's released
[12:50:22] Yes I know
[12:50:35] Apparently @cosmicalpha wasn't fully aware of how much wasn't tested
[12:50:37] and still give my +1 to the LTS model
[12:50:38] We proposed a list of extensions we will commit to testing but that again was largely ignored by the board
[12:51:00] Well it was discussed for a few days extensively before the 1.42 update so
[12:51:29] If the board aren't aware of the risks they are owning, that's a problem
[12:51:38] We know we don't do things properly
[12:52:38] Oop
[12:53:33] This is an organisational problem
[12:53:38] Not any individual's fault
[12:53:54] And one that's going to mean we miserably fail the upcoming cyber assurance work
[12:53:56] I concur
[12:53:58] Burn it down!
[12:54:15] Let's rebrand to Wikia!
[12:54:20] is LTS a serious idea then, to give Miraheze a fighting chance to test more comprehensively from jump to jump
[12:54:25] Wikia (powered by Miraheze)
[12:54:35] +1
[12:55:01] Yes it is
[12:55:10] I don't really see any problems arising with that, since MediaWiki supports updating from two LTS releases ago; yeah, maybe the SQL patches will be a bit more difficult, but trade-offs must be made
[12:55:12] I like it too from a simple stability sell; I don't see a lot of people fussed about latest and greatest, and I know by a long shot stability is a huge issue for us, both in practice and just in the image of the platform
[12:55:49] and the extensions that don't like LTS I'm pretty sure lean strongly towards the ones that are developed in-house anyway
[12:55:50] Most of our upcoming cyber assurance is a bit ISO 9001-like. We can be shit as long as the policies and procedures are there, we understand and accept risk, and we regularly review it.
[12:55:55] We really don't.
[12:56:05] There is a baseline level of shittiness
[12:58:26] Yeah I think Miraheze established that baseline
[12:58:45] what's the cyber assurance work?
[12:59:25] I agree with Raidarr
[12:59:30] Upcoming work to try and improve our assurance of cyber security
[12:59:39] And ensure we are doing things right
[13:21:40] The board doesn't decide when we upgrade, that's not our function. This is in DTech's remit to set policy, and they now have bandwidth to develop. If you have a proposal, please work with them to implement it.
[13:25:34] Maybe not, but at the time the DTech was unavailable, which means that really the board should've had an opinion or we would have gotten nowhere.
[13:25:36] I also agree that rushing to be 'always latest' for latest's sake isn't wise; hoping we can set a healthier middle ground. I like some of the proposals around how to bolster our testing practice I've seen so far.
[13:25:53] Let's go back to 1.35
[13:25:59] We can ask ShoutWiki for help!
[13:26:01] 1.21 or nothing
[13:26:08] wait which version is editthis on
[13:26:14] 1.15 i think lol
[13:26:18] yeah that one
[13:26:36] It amazes me that they're still on that. That's even worse than Fandom was
[13:26:38] but yeah, seems like a whole thing to be resolved in tech
[13:27:33] there are other things the board can more directly apply its fingers to, and the version thing might be something to look at, but not something they should necessarily do the full push on
[13:27:45] ideally it all runs through the inside tech channel and an internal consensus be found imo
[13:27:58] I thought one of those exists at least
[13:28:10] It does, yes.
[13:28:35] so it seems to me it should be formally raised in there and talked out, probably as a clear transition and not something done right on the spot
[13:28:44] say, continue with the current upgrade pattern, but switch gears on the next LTS
[13:28:50] Too modern
[13:29:01] no idea when that is, so if that's this version or next I defer
[13:29:03] Go back to the ways of our founders
[13:29:07] MediaWiki 1.1
[13:29:16] As Magnus intended
[13:29:17] .1? that's not a founding version
[13:29:27] I think they skipped 1.0
[13:29:32] oh I see
[13:29:46] MW pre-alpha when?
[13:29:47] maybe a 0.x beta would be better then
[13:29:58] Yeah that
[13:30:00] Waits for MediaWiki 2.0
[13:30:16] coming with Miraheze 2.0
[13:30:32] Fuck MediaWiki, go back to [[w:UseModWiki]]
[13:31:14] Remember the Lucy experiment at Fandom
[13:31:16] We should go to that
[13:33:48] ?
[13:34:44] Was some WordPress-looking-ass platform that they tried to bring in for a brief time, and you had the option of a MediaWiki wiki or a "Lucy" wiki when you created a wiki
[13:34:58] Went down horribly and Fandom was accused of trying to move away from MediaWiki
[13:35:30] Lmao
[13:35:32] This was very much pre-Gamepedia acquisition though
[13:38:55] You know what MediaWiki is missing
[13:39:01] A Wiki Farm consortium
[13:39:01] games
[13:39:16] A group of farms that come together and advocate for improvements to wiki farm support
[13:39:19] But also games yes
[13:39:34] okay but like why are we joking around
[13:39:57] Hub
[13:40:03] Huh*
[13:40:06] community stars or whatever when
[13:40:10] Realised the edit doesn't go through to irc
[13:40:23] Miraheze Pro+?
[13:44:46] Definitely
[13:44:56] Also getting indie wikis would be awesome
[13:44:59] Like NIWA
[13:45:04] Or MCW
[13:45:10] I came up with a name for it
[13:45:18] Wikiverse
[13:45:27] Trademark it
[13:45:28] Or Wikiversal if we won't get sued
[13:45:31] RIGHT THIS SECOND
[13:45:50] Can you imagine trying to advocate to the WMF though
[13:45:56] I don't know how
[13:46:00] Would be like watching paint dry
[13:46:03] Hey siri
[13:46:06] How do trademark
[13:47:48] Sounds fun /no
[13:48:18] Maybe more effective if we get prominent Wikimedians or even some WMF employees into it
[13:52:03] not many specific wikis get near to Wikipedia's level of standards and rules ...
[13:53:15] idk, there's a feeling wikimedians look at us fan wiki makers as plebs lol
[13:55:08] I may ask the Wikipedia admins I know what their thoughts on fan wikis like Soft Cell, PTW etc. are
[13:56:25] don't bring soft cell, it's still in 0.* phase lmao
[13:56:56] although someone has attached a link to my stub article in the Soft Cell Wikidata entry LOL
[14:45:11] <_changezi_> Hello! Does anyone know why my wiki isn’t generating a sitemap? I mean I checked `https://indicaonline.miraheze.org/robots.txt`, and it shows the sitemap line: `https://indicaonline.miraheze.org/sitemap.xml`, which, as far as I know, means the sitemap should be available? But when I visit the URL, it returns a 404 error. Can anyone help with this? I’d really appreciate it!
[14:45:39] <_changezi_> :pupCoffeeMH:
[14:46:35] _changezi_: it might be because the wiki was created fairly recently
[14:51:29] <_changezi_, replying to BlankEclair> I mean, that makes sense. Is there a specific time after which the sitemap for a new wiki usually generates?
[14:52:51] _changezi_: every saturday i believe
[14:55:20] <_changezi_, replying to BlankEclair> Hmm. Isn't it also possible that maybe the sitemap feature isn't enabled for my wiki?
[14:55:32] it's enabled for all wikis tho
[14:55:42] all public wikis i mean :p
[14:58:03] <_changezi_> Aight, thanks. I'll wait for a few days and see if the sitemap generates.
[15:02:49] that's next year's April Fools joke
[15:02:53] Miraheze+
[15:02:57] ad-free experience
[15:04:15] sponsored by democratically dictated president soby sobea
[15:08:21] Apex*
[15:08:55] that's our new tech director
[15:09:10] Waiting on this to be actioned
[15:13:12] I stand by my vote https://cdn.discordapp.com/attachments/1006789349498699827/1285981961180942356/image.png?ex=66ec3f87&is=66eaee07&hm=7839813c7fff15c0d52624bf3e3e407603a8552639557aeae20463b884119acd&
[16:32:49] I like the idea. A community with other wiki farms could be good. But I think in the current situation there isn't any group or alliance between wiki farms
[16:33:28] if miraheze wanted to pursue a chat with IWF, telepedia, shoutwiki, whatever, it wouldn't be impossible
[16:33:37] but what exactly would be the goal/commonality of it
[16:34:26] I do like the idea of a shared backchannel to raise CVT alarms, discuss issues of common interest and share technical novelties
[16:36:35] I think the goal is utopian mutual cooperation. It's pretty idealistic, if a happy thought. Otherwise, if you see the other wiki farms as competition, then it is not a good idea
[16:37:07] what is utopian cooperation
[16:37:26] and are we thinking of it by idealist standards, or by what we could realistically try and get together for
[16:37:45] utopian is an adjective that defines the state in which that cooperation is done
[16:38:07] I think as far as trying to find common points of interest along lines of technical interest, vandalism and that sort of thing, there's common ground to build a channel for
[16:38:24] no farm likes vandals so there may be an interest to leave a 'hey btw watch out for...' sort of message
[16:38:37] Obviously the thought "a consortium of people that help each other" is not what we could realistically try
[16:38:40] and that offer could well extend to independent communities or collectives
[16:38:50] my problem with that is that it's too vague
[16:39:10] 'help each other' could summarize practical things that the commonality does, but isn't anything on its own
[16:39:28] the sell will come from specifying what the collective can do for each other
[16:40:04] a common standard for certain issues might arise but it would fail the moment any of its members tries to force a standard/policy so to speak
[16:41:11] it would also fail if its membership devolves into bickering over the differences/problems of each other's projects; it would have to focus on securing the common ground
[16:42:09] My impression is that Miraheze did a lot of good for the services of wiki hosting, and did a lot of code on helping people host their wiki. And because some of the content we've developed is open source, or applied to extensions or processes that other users also use, we've benefited others, even if we don't have any dealings with them. So even without that group of allied wiki admins, we've already helped the bigger picture.
[16:42:55] for sure
[16:43:09] For details on the idea that @ Original Authority suggested you'll have to ask him. But I think he said it as a joke, without giving it much thought.
[16:44:23] I wouldn't mind working with parties from other projects for the record, to collaborate/pass universally good thoughts between projects
[16:44:49] a League of Wikis, minus the unfortunate historic record of the LoN
[16:47:03] I think, if that group of wikis is ever created, it would be appropriate for it to be independent wikis, and not include the WMF, Fandom/Wikia or Wikidot, as these companies have their own interests, not those of keeping wikis up to date and at the service of local admins and users.
[16:47:33] Does anyone (cc @agentisai or maybe BlankEclair) know what happened to beta? I've edited LocalSettings and even made some intentional syntax errors but the wiki is fine, as if it doesn't recognise test151 anymore
[16:47:36] did I miss something?
[16:50:10] I can't even log in to beta at the moment
[16:50:56] (constantly getting "No active login attempt is in progress for your session.")
[16:52:10] isn't OA head of indie wikis federation?
[16:52:47] that would be m3w
[16:53:52] Did you edit /srv/mediawiki/config/LocalSettings.php?
[16:54:15] yeah, and also some extension files
[16:54:23] and nothing happens
[16:54:47] not sure then 🤔
[16:54:51] maybe it's the cache :P
[16:54:54] CA also said he upgraded it to 1.43 but I see it as 1.42
[16:55:08] I think it would know if I removed a bracket in LocalSettings heh
[16:55:42] Wait
[16:55:44] it seems like all wikis are still running 1.42
[16:55:49] Paladox was fiddling with stuff
[16:55:50] due to multiversion
[16:55:54] Oh wait I have an idea
[16:56:05] @agentisai heh I just renamed LocalSettings to NoSettings and even now it's not taking notice
[16:56:13] lol
[16:56:21] I'm trying on https://meta.mirabeta.org/ by the way
[16:56:30] try updating metawikibeta to 1.43 in ManageWiki
[16:56:44] (I did also try to edit a 1.43 extension page and that did nothing too)
[16:56:48] but I'll try
[16:57:11] See #tech-internal on mattermost
[16:57:21] it's not an option
[16:57:29] there you go
[16:57:40] it should be added to MirahezeFunctions
[16:58:21] @agentisai see mattermost
[16:58:24] I think there's a bigger issue
[16:58:55] ohhh
[16:58:59] that makes sense
[16:59:16] yeah, I guess you were right about cache!
[16:59:31] I thought I was going crazy when nothing was happening
[16:59:35] haha yeah, wrong one though
[16:59:42] @paladox see mattermost if you have time please
[17:01:20] We really need a break for this
[17:03:04] To die() if this happens
[17:08:29] and we're back
[17:08:39] I never thought I'd be so glad to see a successful 500 error
[17:09:33] that's why 1.43 wasn't working either, I was wondering how it was possible that CA said he upgraded it but it was still 1.42
[17:10:57] Wai
[17:11:55] to confirm that test151 is actually listening to LocalSettings I removed a random }
[17:12:09] This has nothing to do with the board. This is something that is on me that happened. And there will be things in place before the next upgrade. Also the "always latest" commitment has been taken far too seriously for far too long. That has never meant we have to upgrade immediately regardless of the security or stability of the project caused by that. It means upgrade to the latest when we can and when things actually work, not at the cost of 90% of stuff untested. The whole latest model will also probably be reevaluated after 1.43 (which is LTS)
[17:13:21] I upgraded all beta wikis to it, and MW_VERSION was showing 1.43-alpha
[17:13:36] 1.43 being LTS is an excellent opportunity
[17:14:19] I agree with not upgrading immediately if we don't have enough people on tech to upgrade.
[17:15:15] yep, it was because it was listening to mw* before
[17:15:34] Wait what?
[17:15:49] I didn't really put much thought into it, but my initial idea was similar to what you have surmised: passing along information that may be valuable re: CVT etc. But also stuff like "this is a really neat extension we want to share with you that might help in wiki farm activities" etc. etc.
[17:16:17] @cosmicalpha https://issue-tracker.miraheze.org/T12614
[17:17:31] T12122 is working on that, actually
[17:17:53] the goal is to completely section off beta
[17:18:18] @agentisai that's more urgent
[17:18:32] What we've noticed now is the existing separation doesn't work
[17:19:12] Making beta ready for access without an NDA is a lot wider; our current separation has a million holes in it
[17:19:12] There was never any separation to begin with
[17:19:34] Beta having access to prod was on purpose, previously
[17:19:45] No it wasn't
[17:19:50] I wrote the original plan for beta
[17:19:56] It wasn't?
[17:20:03] Beta was always designed to be isolated
[17:20:17] because we'd use test1xx to test how wikis would render on a new version of MediaWiki
[17:20:31] originally we had test3wiki
[17:20:41] Which was connected to prod
[17:20:49] But that caused outages
[17:20:53] I remember always proxying through test servers to see how new MW would render on some wikis
[17:21:01] And we wanted to test more and more
[17:21:12] So we decided to launch a separate beta cluster
[17:21:22] And separated as much as we could, resources allowing
[17:21:32] Interesting
[17:21:41] not sure if there was ever any separation though
[17:21:48] It was set up to have a different db suffix and global db so it could use a different db grant
[17:21:57] It's always been inconsistent and crap
[17:21:57] In any case, the plan is to have test only be able to access its own separate db server, db172
[17:22:05] When I upgraded beta to 1.43, prod listened to beta and it broke by upgrading, apparently that is the issue. Or do I misunderstand?
[17:22:17] You misunderstand
[17:22:24] Paladox wanted to test something
[17:22:25] Confused lol
[17:22:32] I think Paladox did some work on Varnish and routed test to mw*
[17:22:51] OHHH I understand...
[17:22:59] Paladox wanted to see if having a Varnish instance as a middleware would be faster
[17:23:08] so cf -> varnish -> mediawiki
[17:23:19] Oh makes sense.
[17:23:20] But beta wasn't set back up behind varnish
[17:23:34] Which means even the existing separation is non-existent
[17:23:49] The prod mw* being able to show beta is a bug
[17:24:47] Then technically this was expected with the current setup. You can proxy to test151 from prod wikis, which is technically expected.
[17:25:18] no it's not
[17:25:38] That should have been firewalled off a long, long time ago
[17:25:56] seems no one knew that was unexpected behavior
[17:26:05] I wonder if I still have the original planning document
[17:26:16] It probably states that should have been done about 2 years ago
[17:31:48] I mean that doc also states we'd have full isolation prior to multiversion
[17:33:14] https://issue-tracker.miraheze.org/T12122 isn't even possible the way it's described
[17:33:18] Cc @reception123
[18:02:49] Lmao