[02:01:08] the permissions strike back https://discord.com/channels/407504499280707585/407537962553966603/1288678608566616117
[02:03:45] It looks like some weird JSON was run in the database
[02:03:54] And replaced too much of it
[02:04:18] Causing things to escape quotations and merge permission entries.
[02:24:21] A permission error is still displayed if I try to visit https://holidays.miraheze.org/wiki/Special:Watchlist or https://holidays.miraheze.org/wiki/Special:DeletedContributions/GTrang. How could this be fixed?
[02:24:55] seems my fix broke some formatting
[02:32:55] Unfortunately, the permission issues still have not been fixed on my wiki yet.
[11:55:28] @reception123 your home dir on mwtask181 is over 100GB
[11:56:14] @cosmicalpha your home dir on the same server is over 50GB
[11:56:41] 90% of space has been used.
[12:10:06] Hmm yeah, I need to upload wikibackups to IA but the command didn't work for some reason
[12:10:07] I'll try again
[12:10:45] oh it's also because the actual dir wasn't deleted, so there's both the full XMLs and the compressed ones; for now I'll just delete the actual dir
[12:10:53] down to 31 now
[12:11:36] thanks!
[12:12:14] 45 now, it seems there were a lot of wiki images left too
[12:12:24] hopefully I can get the backups uploaded too
[12:22:57] yeah that still doesn't seem to work
```
requests.exceptions.ConnectTimeout: (MaxRetryError("HTTPSConnectionPool(host='archive.org', port=443): Max retries exceeded with url: /metadata/miraheze_wikibackups14072024 (Caused by ConnectTimeoutError(, 'Connection to archive.org timed out. (connect timeout=12)'))"), 'https://archive.org/metadata/miraheze_wikibackups14072024')
```
[12:23:17] and I did do `export HTTP_PROXY='http://bast.wikitide.net:8080/'`
[12:26:24] @reception123 i'm not sure why we have https://github.com/miraheze/mediawiki-repos/blob/master/mediawiki-repos.yaml#L1442 and https://github.com/miraheze/mediawiki-repos/blob/master/mediawiki-repos.yaml#L1094
[12:26:28] which conflict with each other
[12:26:34] https://github.com/snap-blocks/mw-snapblocks/pull/6#issuecomment-2374670071
[12:26:52] hmm
[12:27:06] what's the command you use?
[12:27:24] maybe HTTPS_PROXY needs setting as well?
[12:33:39] hmm, probably a mistake
[12:33:49] oh, I'll try that too
[12:33:52] I used `iaupload --title=miraheze_wikibackups14072024 --description="miraheze_wikibackups14072024" --file=/home/reception/wikibackups14072024.tar.gz`
[12:34:56] @reception123 seems the script does https://github.com/miraheze/puppet/blob/52eed3bd1427f4d18b3a0d8b61c1a01f5ddb3f40/modules/mediawiki/files/bin/iaupload.py#L56
[12:34:59] so that is strange
[12:36:29] hmm
[12:40:32] @reception123 Does HTTP_PROXY && HTTPS_PROXY work for you?
[12:44:08] @reception123 alternatively, does https://github.com/miraheze/puppet/pull/3971 work?
[12:47:27] doesn't seem to
[12:48:39] hmm :/
[12:48:43] i'm not sure then
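For context on the timeout above: the traceback comes from the `requests` library, which `iaupload` sits on top of, and `requests` only routes https:// URLs through a proxy named in HTTPS_PROXY/https_proxy, not HTTP_PROXY. A minimal sketch of that check, assuming the bastion proxy mentioned in the chat also accepts HTTPS CONNECT traffic:

```python
# Sketch only, not the deployed fix: requests picks the proxy for an https://
# URL from HTTPS_PROXY/https_proxy, so exporting HTTP_PROXY alone leaves the
# archive.org metadata call going out directly and timing out.
import os

import requests

# bast.wikitide.net:8080 is the bastion proxy mentioned above; assuming it
# also tunnels HTTPS traffic.
os.environ["HTTP_PROXY"] = "http://bast.wikitide.net:8080/"
os.environ["HTTPS_PROXY"] = "http://bast.wikitide.net:8080/"

resp = requests.get(
    "https://archive.org/metadata/miraheze_wikibackups14072024",
    timeout=12,  # same connect timeout as in the traceback
)
print(resp.status_code)
```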
[16:15:39] @reception123 would you be able to grab something from mr data and mrs base for me 👉 👈
[16:16:27] what would that be? 🙂
[16:17:07] A list of `cw_timestamp` from every single row in the `cw_requests` table
[16:17:47] so you just want a `SELECT cw_id, cw_timestamp FROM cw_requests;`?
[16:18:10] I'm curious, why do you need that much raw data?
[16:18:53] 😭😭😭😭😭
[16:19:27] That query is bound to kill whatever database it's run on lol
[16:19:49] eh, our DBs aren't _that_ bad
[16:19:54] Basically. Probably a WHERE clause for not deleted or oversighted
[16:20:24] why do you need it though? And that actually would make it much more complex because cw_requests doesn't know about deleted or anything
[16:20:38] Yeah you'd need a left join
[16:20:40] cw_requests is distinct from cw_wikis so we'd need a join and all that
[16:20:43] Or a right join (do those exist?)
[16:20:46] yeah
[16:24:54] I wanna chart
[16:24:54] Requests over time, look at the slope and spikes, connect them to events like Fandom closure waves or those anti-Fandom videos
[16:25:13] Wow Discord sent the message
[16:25:17] Only 7 minutes late
[16:25:37] Think so
[16:28:29] urgh, the file is so big that I'd need to SCP it (I hate SCP!)
[16:29:00] I'm not too good with JOIN queries so I can look later unless you want to figure it out in the meantime
[16:29:24] ah that is interesting, especially to see how few requests we had back in 2015 and how many we have now
[16:29:56] but actually it wouldn't make much sense to have a join for deleted, because deleted only includes wikis marked as deleted anyway and there are few of those, since I just permanently deleted like 5k+ wikis a few weeks ago
[16:32:36] What do you normally use?
[16:32:52] @pixldev well here you go https://issue-tracker.miraheze.org/P520
[16:32:53] I meant the requests, we still want deleted wikis
[16:33:29] well requests can't be 'deleted' per se (or at least shouldn't)
[16:33:36] 🔥
[16:33:38] and I don't think there's actually been an OS'd request
[16:33:43] I'll work on parsing when I get home
[16:33:46] lmao
[16:34:05] oh wow, from August to December 2015 we had 278 requests
[16:34:09] not bad for a first year
[16:34:37] then in 2016 already 2284
[16:35:28] oh you might also want to keep in mind:
```
MariaDB [metawiki]> SELECT COUNT(*) FROM cw_requests WHERE cw_status = 'declined';
+----------+
| COUNT(*) |
+----------+
|    14904 |
+----------+
1 row in set (0.029 sec)
```
[16:35:35] and we also had one or two instances of major request vandalism/spam
[16:35:50] so a spike somewhere might just be the result of that
[16:42:06] I remember when the majority of requests would be declined
[16:42:26] Oh, before "needs more details" you mean?
[16:42:45] probably
[16:44:48] <_arawynn> oh for a change I understand some of the tech-speak 🫣
[17:03:54] Maybe I should have included the state
[17:04:02] Does Phorge let you update a paste?
[17:07:16] I can put a red, black and green line
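A rough sketch of the kind of chart being described here, assuming the export has already been parsed into (timestamp, status) pairs; the 14-digit MediaWiki-style timestamp format, the status names, and the red/black/green mapping are guesses rather than anything confirmed in the conversation:

```python
# Rough sketch: bucket wiki requests per month and plot one line per series.
# The sample rows stand in for the parsed P520 export; status values are guesses.
from collections import Counter
from datetime import datetime

import matplotlib.pyplot as plt

rows = [
    ("20150801000000", "approved"),
    ("20150915120000", "declined"),
    ("20160102080000", "approved"),
]

per_month = {"all": Counter(), "approved": Counter(), "declined": Counter()}
for ts, status in rows:
    month = datetime.strptime(ts, "%Y%m%d%H%M%S").strftime("%Y-%m")
    per_month["all"][month] += 1
    if status in per_month:
        per_month[status][month] += 1

months = sorted(per_month["all"])
for series, colour in (("all", "black"), ("approved", "green"), ("declined", "red")):
    plt.plot(months, [per_month[series][m] for m in months], color=colour, label=series)
plt.xlabel("month")
plt.ylabel("wiki requests")
plt.legend()
plt.show()
```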
[17:50:07] a growing number of reports indicate there are aftershocks of the permissions shenanigans of yesterday, both in the support section and on the community noticeboard
[17:50:38] tried clearing cache in one place, haven't seen if that did it yet
[19:34:22] <.labster> new variable just dropped: `$wgAllowRawHtmlCopyrightMessages`
[19:35:33] <.labster> I'm pretty happy, it isn't great how login pages can be XSSed by interface admins
[19:39:49] We need more arbitrary code insertion points in MediaWiki
[19:41:28] https://youtu.be/POB3Dr0uonc?si=L_M53S3WDA_02YQu "More asbestos! More asbestos!"
[19:54:41] I love asbestos!
[19:56:33] Is this an allegory for dumb MediaWiki design choices
[20:34:07] Why
[20:36:23] Why not!
[20:38:09] cancer
[20:47:25] What are you even doing
[20:47:43] Hush you /j
[20:47:48] It adds flavor
[20:48:01] Rhinos when am I ever not like this
[20:48:11] Very true
[20:48:15] Do I want to know
[20:49:00] Know what
[20:49:16] What you are doing with asbestos
[20:50:42] Don't blame me, it's MediaWiki
[20:51:44] Ok now I really want to know
[20:52:00] ^
[20:54:35] It's MediaWiki
[20:58:39] How
[21:01:20] <.labster> just set `$wgAsbestosInsulation = false;` in LocalSettings
[21:03:04] Peanut butter and asbestos 😋
[23:05:22] TIL: PHP has two different "or" operators: https://www.php.net/manual/en/language.operators.logical.php
[23:06:22] claire1
[23:06:24] !!!!
[23:10:19] ?
[23:10:28] i have a question
[23:10:29] i just woke up and decided to do a security review
[23:10:32] ok
[23:15:10] why
[23:15:12] anyways
[23:15:18] where is the nearest church of python
[23:15:21] for no reason
[23:15:32] i have confessions to make
[23:16:36] i... don't know
[23:16:40] you can confess your sins to me though
[23:17:35] are you expanding your occupation into clergywomen?
[23:17:54] would fit considering the amount of exorcisms you do to Cargo
[23:18:45] heh true
[23:18:53] nah, i just like bad decisions
[23:18:57] see: SpeciallyCursed
[23:24:04] I've never seen xor in use ever
[23:24:29] same for "or" until today
[23:24:44] Cargo needs to be completely rewritten by someone who actually cares about security
[23:24:59] I wish there were more structured data extensions
[23:25:14] You've never seen someone use `or` in PHP?
[23:25:19] nope
[23:25:22] always `||`
[23:25:44] the data trinity rules unopposed
[23:25:53] Oh I didn't read it properly
[23:26:02] I didn't realise that you can literally write `or` lol
[23:26:11] Yeah I've never seen that either, that's bizarre
[23:28:12] imagine being the person who made "&&"
[23:28:27] like, "hmm, what should we use for bitwise and? idk, & seems logical"
[23:28:40] "...oh fuck what about logical and... eh, just do it twice lol"
[23:28:57] And is good
[23:29:02] But it can be ander
[23:31:47] Anyways I managed to figure out how to parse a simple txt into a dict
[23:31:54] Now we can get to the fun part
[23:31:57] \o/
[23:32:07] Have you done data visualization?
[23:32:53] uhh i made two graphs i guess
[23:33:02] those two were with Excel though :p
[23:33:10] Lmao
[23:33:29] I'm basically remaking what I did in a spreadsheet for my class in Python
[23:33:38] But I used Google Sheets, not Excel
[23:34:17] May need to ask Reception to update the export though
[23:35:36] Gotta say
[23:35:53] Am surprised how fast Python parses 50k lines with regex
[23:36:42] Although I should probably export the dict to JSON or pickle it to save those few seconds each time I rerun
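A minimal sketch of the parse-then-cache idea mentioned above, not the actual script; the input file name, the 14-digit MediaWiki-style timestamp format, and the per-month bucketing are assumptions:

```python
# Hypothetical illustration: pull 14-digit timestamps (YYYYMMDDHHMMSS) out of
# the raw export with a regex, bucket them per month into a dict, and cache
# the result as JSON so reruns skip the 50k-line parse. File names are made up.
import json
import re
from pathlib import Path

CACHE = Path("requests_by_month.json")
TS_RE = re.compile(r"\b(\d{4})(\d{2})\d{8}\b")  # capture year and month


def parse(raw_path: str = "cw_requests_export.txt") -> dict[str, int]:
    if CACHE.exists():
        return json.loads(CACHE.read_text())
    counts: dict[str, int] = {}
    for line in Path(raw_path).read_text().splitlines():
        match = TS_RE.search(line)
        if match:
            month = f"{match.group(1)}-{match.group(2)}"
            counts[month] = counts.get(month, 0) + 1
    CACHE.write_text(json.dumps(counts, indent=2))
    return counts


if __name__ == "__main__":
    print(parse())
```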