[01:10:34] uh I got it to work myself
[01:11:15] I thought about trying CreateWiki again (in 1.43 this time), however it's saying that the db my wiki uses doesn't exist even though it clearly does
[01:11:42]
```
root@vm117709:/var/www/html# php maintenance/update.php
[bec4a0facd412dfc17a4638f] [no req] Miraheze\CreateWiki\Exceptions\MissingWikiError: The wiki 'test_wiki' does not exist.
Backtrace:
from /var/www/html/extensions/CreateWiki/includes/Helpers/RemoteWiki.php(86)
#0 /var/www/html/extensions/CreateWiki/includes/Services/RemoteWikiFactory.php(22): Miraheze\CreateWiki\Helpers\RemoteWiki->__construct()
#1 /var/www/html/extensions/CreateWiki/includes/Hooks/Handlers/Main.php(87): Miraheze\CreateWiki\Services\RemoteWikiFactory->newInstance()
#2 /var/www/html/includes/HookContainer/HookContainer.php(159): Miraheze\CreateWiki\Hooks\Handlers\Main->onSetupAfterCache()
#3 /var/www/html/includes/HookContainer/HookRunner.php(3487): MediaWiki\HookContainer\HookContainer->run()
#4 /var/www/html/includes/Setup.php(456): MediaWiki\HookContainer\HookRunner->onSetupAfterCache()
#5 /var/www/html/maintenance/doMaintenance.php(83): require_once(string)
#6 /var/www/html/maintenance/update.php(308): require_once(string)
#7 {main}
root@vm117709:/var/www/html#
```
[01:35:06] also I could imagine the extension not being able to locate the wiki's database, which could explain why it's throwing error after error
[01:35:45] to be fair I don't have an idea either what the path to the db is LOL
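A speculative sketch of what the trace above points at: RemoteWiki is resolving 'test_wiki' against CreateWiki's central registry, not against the wiki's own database. The config names and the `mhglobal` database name below are assumptions, not verified against the 1.43-era extension:
```php
// Speculative LocalSettings.php sketch (assumed names): if CreateWiki is
// pointed at the wrong central database, every wiki lookup throws
// MissingWikiError even though the wiki's own database exists.

// Older releases used a dedicated setting for the central database:
$wgCreateWikiDatabase = 'mhglobal'; // assumed: whatever DB holds cw_wikis

// Newer releases can resolve it through a core virtual domain instead:
$wgVirtualDomainsMapping['virtual-createwiki'] = [
	'db' => 'mhglobal', // same assumption: wherever the registry lives
];
```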
[01:58:20] CreateWiki is hard to get set up. Hence, we don't provide much in-depth support outside of Miraheze use cases
[02:15:18] I seeeee
[02:15:35] is there a way to use ManageWiki without CreateWiki?
[02:15:48] or is there any similar extension perhaps
[02:16:09] not quite sure if CA finished up with making MW independent from CW
[02:32:40] Supposedly, they are now independent
[02:32:51] but again, probably hard to set up in a non-Miraheze environment
[02:37:15] at least figuring it out does not violate the DMCA
[03:02:07] Not forever. It still has a lot of config dependency, but nothing required from CreateWiki at all. It uses absolutely nothing from CreateWiki, and CI for ManageWiki even runs without CreateWiki now.
[09:19:43] it does? Where can I find the latest version of ManageWiki?
[09:30:55] the git repo? ^^;
[10:07:05] Are there any notable tech updates this month besides @pskyechology's DiscordNotifications change and automated SSL? The ManageWiki rewrite counts as well, but I'm not sure how to phrase it in non-tech terms for non-tech readers. It would also be nice to note who automated SSL so that tech's work gets some recognition outside of the tech bubble. Judging from the commits, I believe it's Agent and/or CA?
[10:07:05] https://meta.miraheze.org/wiki/User:Raidarr/Miraheze_Monthly/August_2025
[10:07:06] https://cdn.discordapp.com/attachments/1006789349498699827/1411291125792243733/image.png?ex=68b41ec8&is=68b2cd48&hm=53d3fc2ea106e3d8aa818485449c0999640e8c2556f73befbdd9ef49843c5b9d&
[10:14:07] https://www.xataka.com/empresas-y-economia/laliga-va-paso-alla-amenaza-legalmente-a-clientes-cloudfare-que-comparten-ip-webs-para-ver-futbol
[10:15:05] Spanish major league soccer seems to be actively blocking Cloudflare whenever there's a match
[10:15:49] Even if the customers aren't hosting anything related to piracy of matches
[10:21:09] At least they have an email for appeals: afectadoscloudflare@laliga.es
[10:46:17] reminds me of Chaos Monkey for some reason
[11:03:18] that's ironic since our own attempts to appeal to CF itself over a mistaken abuse designation took months to resolve
[12:35:59] It was over a year, and they gaslit us and claimed it never existed
[12:37:30] wasn't it also some AI response and never even a human?
[12:37:47] Don't think so
[12:38:17] Maybe I'm thinking of something else
[13:01:38] They're still doing this? Insane
[13:01:59] I remember hearing about this back before summer
[14:11:22] Custom domain automation was mostly the work of Agent, with some polishing by CA
[14:12:04] I can't really think of other ext changes
[15:14:27] not exactly for appeals
[15:15:00] it's for LaLiga to collect emails from webadmins that could be affected, to use as ammunition against Cloudflare
[15:15:57] they'll block your site anyway if your IP also happens to be shared with a pirate site
[15:16:10] Oh right, this is your country isn't it
[15:16:22] yes 🇪🇸
[15:18:26] I should look at that court case LaLiga claims to have on their side, should make for a fun read
[15:29:22] Yeah, I don't think there have been any others besides that
[15:34:45] also, with the DN update came the ability to toggle all the available hooks
[15:37:36] so basically they got some judge in Barcelona to order some ISPs in Spain to block IPs and domains involved in supposed "piracy" (as well as to monitor attempts to use different IPs and update blocklists weekly), because it was too hard to get the info on who was behind them. frankly, I care not for LaLiga pissing and crying about their IP being breached, but I also cannot cheer on Cloudflare, since I really, really do not like them for both technical and personal reasons
[15:38:22] may they both eat each other 🙏
[17:49:46] Agent did the automation, I just renamed the extension
[17:51:13] But the primary automation work was Agent's.
[17:56:11] Also, are we almost ready to do the MediaWiki 1.44 upgrade globally? I was planning it for next week. But if we aren't ready we can push it back one more week.
[17:57:51] I totally forgot about it
[18:00:40] I haven't noticed anything too broken on PTW, which is running it; I know someone said the task had been updated with testing results from beta
[18:02:07] I'll check the issues later today and see if anything totally blocking is left.
[18:02:30] I also plan to remove the YouTube extension and replace it with EmbedVideo later today or tomorrow.
[18:02:52] I think that may be one of the only true blockers
[18:03:35] Cause DPL was "fixed"/replaced
[18:05:48] Yeah, DPL4 was written as the fix for 1.44, but for 1.45 I haven't a clue how I'll fix it, since a lot of the links tables have been moved to virtual domains, which means entirely separate databases. Plus the categorylinks migration is a total nightmare.
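For illustration, a minimal sketch of how such a virtual domain is mapped in LocalSettings.php; the `virtual-pagelinks` ID and the database names here are assumed examples, not necessarily core's real ones:
```php
// Minimal sketch of MediaWiki's virtual domain mapping. A domain with no
// mapping falls back to the local wiki database, which is why plain
// installs keep working unchanged.
$wgVirtualDomainsMapping['virtual-pagelinks'] = [
	'db' => 'linksdb',         // database that actually holds the table
	'cluster' => 'extension1', // optional: an entirely separate cluster
];

// Callers then ask the connection provider for that domain instead of
// assuming everything lives in $wgDBname:
$dbr = \MediaWiki\MediaWikiServices::getInstance()
	->getConnectionProvider()
	->getReplicaDatabase( 'virtual-pagelinks' );
```
Once two links tables can live behind two different connections like this, a cross-table JOIN is no longer guaranteed to work, which is the DPL problem discussed below.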
[18:06:28] From what I can see, everything else extension-wise is either globally disabled stuff, waiting on upstream, or needs specific config setup, plus a couple of non-working or fatal-exception issues
[18:06:52] I can fix it, but it'll be impossible to get actual counts, so all the count stuff won't be accurate or fully work in some cases.
[19:17:22] They did what to the links tables?
[19:28:08] Made them virtual domains.
[19:34:33] Is there a Phabricator task I can read on this? I'm not finding anything
[19:56:46] I think I did see one. It was meant to move Commons link tables to separate clusters or something. I noticed the changes in the code though.
[19:58:25] So this one
[19:59:22] Oh I see
[20:02:40] But like, if the tables are still in the same database, this changes nothing?
[20:03:46] It's only a concern if you're actually using the virtual domain
[20:04:29] Great naming scheme you got there Wikimedia
[20:04:30] https://cdn.discordapp.com/attachments/1006789349498699827/1411441467989430345/image.png?ex=68b4aacd&is=68b3594d&hm=db8916708e69a1e31a4cc77946ac01a588bb3dd57ff7e79abc55bc0c6d6731cf&
[20:04:41] Nobody will be confused by this ever
[20:24:02] @posix_memalign ooh, a good thing to mention would be the whole issue with TemplateStylesExtender getting pissy about using vars in rgb()
[21:35:43] Well yeah, in theory, but if it is being used it will break. And a lot of core methods seem to rely on the separate database connection to the virtual domains.
[21:44:17] Thanks for the info @pskyechology @cosmicalpha. I updated the MH Monthly section.
[21:53:06] If the virtual domain isn't set it just falls back to the same database, so I don't see the issue?
[21:56:17] The issue is supporting it when it is set. Or at least gracefully exiting; otherwise that's just functionality from core we can't support, and if it were set it would cause DPL to fatal due to connection issues.
[22:00:14] I don't understand why — why can't you just connect to the database where the link table is?
[22:03:26] Yeah, I'm not sure I understand either; they didn't change the table structure or anything really
[22:05:24] because then it requires changing a lot more in DPL, since it uses SQL_CALC_FOUND_ROWS to get things like %COUNT%, which means it requires one connection, and connecting to a separate database, well, doesn't work properly.
[22:05:54] so count, limit, etc. will become inaccurate
[22:06:02] But you don't need to connect to a separate database if virtual domains are not used
[22:06:33] Right, that's not my concern; my concern is somehow supporting when they are, but it might simply not work.
[22:06:38] But how will it? You're connecting to a separate database to count the number of rows in it? How is that going to break?
[22:07:02] Well, if they are, I guess that's a bummer, but it seems like a pretty niche use case
[22:07:35] Wikimedia's use case is like, hey, we have this database that is 2 TB in size, maybe we should split it
[22:07:54] And if your database is 2 TB in size, maybe you shouldn't be using DPL
[22:07:57] Because it's not just one in DPL. It's a mixture of parameters; if you use multiple, that's more than one database. And you can't really calc the total results across all the connections easily without losing a lot of performance by doing separate calcs and then adding them all up. It's not ideal and isn't worth the performance cost IMO.
[22:08:24] So the issue is a performance one, not that it's "broken" like you said.
[22:08:48] not really; it's broken, it's just fixable, but the only fix would come at a performance cost.
[22:09:00] it would also require significant changes to achieve the fix.
[22:09:29] Not sure I agree that getting an additional database connection would be noticeably more performance-heavy — getting a connection is relatively cheap in MediaWiki.
[22:09:30] Anyway, I do think it's not worth even trying to fix at this point.
[22:09:38] it's not.
[22:09:46] sql_calc_found_rows can be.
[22:11:06] to have many database connections and still support things like count and limit in DPL, you'd have to have a separate sql_calc_found_rows for each database you connect to, store it in a variable, and add those all up in the count at the end, while also manually calculating limit, since ->limit only works for one database at a time.
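A rough sketch of the shape that workaround would take; this is not actual DPL code, and the domain IDs and table names are placeholders for illustration:
```php
// Placeholder mapping; in reality each links table would belong to
// whichever virtual domain core assigns it.
$tablesByDomain = [
	'virtual-pagelinks'     => 'pagelinks',
	'virtual-categorylinks' => 'categorylinks',
];

$limit = 500; // the user-supplied DPL limit
$total = 0;   // stand-in for what %COUNT% would report
$rows  = [];

foreach ( $tablesByDomain as $domain => $table ) {
	$dbr = \MediaWiki\MediaWikiServices::getInstance()
		->getConnectionProvider()
		->getReplicaDatabase( $domain );

	// One count query per database, since SQL_CALC_FOUND_ROWS (or a JOIN)
	// cannot span two connections:
	$total += (int)$dbr->newSelectQueryBuilder()
		->select( 'COUNT(*)' )
		->from( $table )
		->caller( __METHOD__ )
		->fetchField();

	// Apply the global limit manually, since ->limit() only ever sees
	// one database at a time:
	$remaining = $limit - count( $rows );
	if ( $remaining > 0 ) {
		$res = $dbr->newSelectQueryBuilder()
			->select( '*' )
			->from( $table )
			->limit( $remaining )
			->caller( __METHOD__ )
			->fetchResultSet();
		foreach ( $res as $row ) {
			$rows[] = $row;
		}
	}
}
// $total is now the summed count, bought with an extra COUNT(*) round
// trip per database.
```
Each extra database costs one more round trip, which is the performance price being debated below.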
[22:12:32] I think I should focus more on fixing categorylinks for 1.45 than on the virtual domains anyway, as I'm not even sure I'll ever really make those work properly.
[22:12:32] But DPL already uses that function? I'm just not sure I agree. It's probably 1 or 2 ms to get an additional database connection, probably less if it's coming from the connection pool/load balancer. Even if you have to do 5 queries on that new connection at different points, the connection isn't going to be destroyed immediately. It will simply be returned to the LoadBalancer until the entire request is finished.
[22:12:55] no, but you can't join on multiple databases
[22:13:08] so you'd have to reconstruct the whole thing
[22:13:17] and that's where it gets tricky
[22:16:59] But yes, I agree with this.
[22:17:36] I'm worried that they could degrade performance for those of us not using virtual domains for link tables, but I haven't properly looked into these changes
[22:18:38] Yes, I was wondering that too, but theoretically it shouldn't, since no virtual domain = current database. Theoretically maybe one extra check, a few ms cost, but nothing too much.
[22:19:31] I imagine all of the tables are going to be that way soon. The next one will probably be the user table, with the removal of $wgSharedDB
[22:19:42] yep.
[22:20:15] well, not all, but many that have a use case. I highly doubt the likes of text and revision will
[22:20:18] but not sure.
[22:22:40] Text/revision is a likely candidate imo. That would allow them to deprecate wgExternalStore
[22:24:35] God, I hope the introduction of virtual-user is when we finally get proper shared user table support without CentralAuth
[22:25:40] I guess that's possible, yeah.
[22:58:24] Soooo. What are virtual domains again
[23:13:02] Basically a way to tell MediaWiki that a particular table lives on a different database than the one defined in wgDBname
[23:34:50] Do we use that in our stack?
[23:35:22] Yea, i.e. for telling CreateWiki where the `cw_tables` et al. are
[23:35:26] BotPasswords also uses it
[23:38:56] A lot do.
[23:43:54] That'll be great. I spent some more time on this today and finally got xdebug working on my host's MediaWiki installation. It seemingly gets stuck on `$xdebug->check();`, but that's at least some progress.
https://cdn.discordapp.com/attachments/1006789349498699827/1411496683363369080/image.png?ex=68b4de39&is=68b38cb9&hm=3674b9e19836bfe41a79dba3bf820507ff02e563b99e664bdf786c6e99255b7c&
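To make the virtual-domain definition above concrete, a minimal sketch using core's `virtual-botpasswords` domain as the example; the query itself is just an assumed illustration:
```php
// Sketch of the fallback behaviour described above: if LocalSettings maps
// 'virtual-botpasswords' to another database, the connection goes there;
// if not, it silently resolves to the wiki's own $wgDBname database.
$dbr = \MediaWiki\MediaWikiServices::getInstance()
	->getConnectionProvider()
	->getReplicaDatabase( 'virtual-botpasswords' );

// An assumed example query against the table that lives in that domain:
$appIds = $dbr->newSelectQueryBuilder()
	->select( 'bp_app_id' )
	->from( 'bot_passwords' )
	->caller( __METHOD__ )
	->fetchFieldValues();
```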