[05:35:47] where can I find checksums for the archives listed on https://www.mediawiki.org/wiki/Download
[05:41:09] It seems that the server terminates the connection in a way that the download appears to succeed but in fact the file is truncated. reproducible with "trickle -d 50 https://releases.wikimedia.org/mediawiki/1.29/mediawiki-1.29.2.tar.gz"
[05:42:07] * traumschule tried it with composer and ansible's uri module with the same result described above. Only ~25MB of 40MB are downloaded each time.
[07:16:36] traumschule: hello
[07:16:55] traumschule: there are GPG signatures that you can use to verify the download
[07:18:45] traumschule: here's what I get: https://paste.fedoraproject.org/paste/JNE9f9lMndGakTqAkpUNNw/raw
[07:36:12] pity, that's one step more. this module does not accept GPG signatures for verification: https://docs.ansible.com/ansible/latest/get_url_module.html
[07:37:03] for some reason the server returns 200 even if the download was interrupted/the downloaded file is incomplete. how can this be?
[07:37:48] calculate the checksum yourself?
[07:38:08] e.g. https://github.com/wikimedia/mediawiki-docker/blob/master/stable/Dockerfile#L39
[09:42:43] hi
[15:29:49] looks like the redis cache layer does not support clustered redis instances. I need to patch this
[15:30:27] I'm wondering if there is already work going on to support clustered redis?
[15:49:40] eren: I know of no plans to use clustered redis at Wikimedia
[15:49:48] So I'm guessing it's not likely anyone is actually working on it
[15:49:58] I don't know if it's even considered at all
[15:50:05] I suspect there's a potential that it's something we could use
[17:04:14] what's the easiest way to get a plaintext list of page names or Template: page names from a wiki, ideally one template per line?
[17:04:46] Special:Allpages is three per row, which is inconvenient
[17:05:43] What are your programming skills like?
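(Editorial aside on the download question above: [07:37:48] suggests calculating the checksum yourself, as the linked Dockerfile does. A minimal sketch of that approach in Python; the function names are illustrative, and the expected digest must come from a trusted source such as the release announcement:)

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Hash the file in chunks so a large tarball never sits in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_download(path, expected_sha256):
    """Raise if the file on disk does not match the published checksum.
    A silently truncated download (the symptom described at [05:41:09])
    fails this check loudly instead of looking like a success."""
    actual = sha256_of(path)
    if actual != expected_sha256:
        raise ValueError("checksum mismatch: got %s, expected %s"
                         % (actual, expected_sha256))
```

(Ansible's get_url module can do the same check natively via its checksum parameter, e.g. checksum: "sha256:<digest>", which makes the truncated-but-HTTP-200 case fail the task.)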
[17:06:00] Slim to none
[17:06:09] Maybe I can whip together an API url
[17:08:24] Yeah, the API is probably easier, even for basic "scraping"
[17:08:30] Depends if you need to paginate though
[17:09:52] http://starter.wikia.com/api.php?action=query&list=allpages&namespace=10 Almost there... Actually trying to list all templates
[17:12:04] use &aplimit=max
[17:12:24] http://starter.wikia.com/api.php?action=query&list=allpages&apnamespace=10&aplimit=max
[17:12:38] looks like there's not many
[17:14:03] Yes, it is the default wiki for the Wikia network
[17:14:24] It has only one article-space page and a few templates. Trying to list templates
[17:14:58] you needed the ap prefix on the namespace parameter
[17:14:58] I think I will just do the tedious work
[17:15:25] Look at my link
[17:15:26] Oh shit, you found it
[17:15:32] I didn't even see you had a link. Very good!
[17:15:35] lol
[17:15:46] I've been doing this 10 years... hopefully I know what I'm doing
[17:15:49] ;)
[17:15:51] With a reasonable text editor...
[17:15:57] And a simple regex, you can probably get the list
[17:18:06] * legoktm points to names at the bottom of http://starter.wikia.com/api.php?action=help
[17:18:31] I try not to spend much time on action=help
[17:18:37] Wikia makes me cry
[17:32:15] It makes me cry too.
[17:32:54] I am trying to make their Templates wiki sensible. Right now there are so many redlinks if you copy to your wiki
[17:34:06] Anyone know of any vanilla MediaWiki sandbox wikis?
[17:35:07] what are you trying to do?
[17:36:32] I would love to make Templates Wikia useful for MediaWiki users as a whole, by stating whether a template works in vanilla MediaWiki or not - or if you need Wikia's nonsense
[17:37:05] Maybe I just make my own. Sudo apt get...
[17:37:47] http://templates.wikia.com/wiki/Template:Is_SPW Sure smarter than editing every template to say...
[17:38:33] I run a test wiki which you may be able to use.
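(Editorial aside on the allpages query above: aplimit=max caps a single request, so a wiki with more templates needs API continuation, as [17:08:30] hints. A rough sketch; list_templates is an illustrative name, not an API term, and the fetch hook exists only so the function can be exercised without a network:)

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def list_templates(api_url, fetch=None):
    """Yield every Template: page title, following API continuation
    until list=allpages is exhausted."""
    if fetch is None:
        # default: perform a real HTTP request and parse the JSON body
        fetch = lambda url: json.load(urlopen(url))
    params = {
        "action": "query",
        "list": "allpages",
        "apnamespace": 10,   # the ap prefix matters, as noted at [17:14:58]
        "aplimit": "max",
        "format": "json",
    }
    while True:
        data = fetch(api_url + "?" + urlencode(params))
        for page in data["query"]["allpages"]:
            yield page["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])  # carry the apcontinue token forward
```

(print("\n".join(list_templates("http://starter.wikia.com/api.php"))) would then give the one-template-per-line list asked for at [17:04:14], assuming the API is reachable.)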
It isn't entirely vanilla, but probably close enough
[17:39:03] If you say so, I will be respectful - and where I can, I will simply preview and not even save
[17:39:21] https://www.thetestwiki.org
[17:39:29] it's literally for testing random stuff, feel free to save
[17:40:14] also hand out +sysop like candy, if you need/want admin
[18:11:06] hi
[18:12:09] If you guys accepted bitcoin you guys would make a lot of money. -Edward Kang have a nice day!
[18:14:55] Edward__: See https://wikimediafoundation.org/wiki/Ways_to_Give#Bitcoin if you wish to donate Bitcoin to the Wikimedia Foundation.
[18:14:56] Edward__: Wikimedia Foundation accepts Bitcoin donations. https://wikimediafoundation.org/wiki/Ways_to_Give#Bitcoin
[18:15:05] * James_F grins at MatmaRex.
[18:15:08] aw, i'm late.
[18:18:40] Lol
[20:33:39] Hi everyone :)
[20:37:29] oh noes
[20:47:24] can somebody point me to a public wiki that uses SubPageList, so i can see what it looks like?
[20:52:07] Reedy, why noes? :D
[20:55:14] hop: Which SubPageList?
[20:59:57] Reedy: i am only aware of https://github.com/JeroenDeDauw/SubPageList
[21:00:23] There's also https://github.com/wikimedia/mediawiki-extensions-SubPageList3
[21:00:54] i might be wrong, but there's a 3 there that i didn't mention ;)
[21:01:13] but sure, i mean specifically the one from the link above
[21:02:08] i faintly remember SubPageList3 being deprecated or replaced by the other one?
[21:02:33] we use 3 on wikimedia wikis
[21:02:47] which ones are those?
[21:03:11] 'testwiki' => true,
[21:03:12] 'wikiversity' => true,
[21:03:13] 'cswiktionary' => true, // T40903
[21:03:13] T40903: Enable SubPageList3 on cswiktionary - https://phabricator.wikimedia.org/T40903
[21:03:15] disclaimer: i have _no_ clue, i just want to avoid composer if at all possible
[21:03:28] https://wikiapiary.com/wiki/Extension:SubPageList
[21:03:37] That gives you a list of wikis that have the non-3 version
[21:04:01] i have to get back to you in a bit.
tunnel coming up
[21:22:46] Which Vagrant version should I install on Ubuntu, the .deb package version or the one from the apt repo?
[21:25:52] What's the difference in versions?
[21:29:50] The one from apt is 1.8.1+dfsg-1, while the provided deb package on their website is 2.0.1, so I guess the newer is required, right?
[21:30:02] Not necessarily required
[21:30:59] Reedy: that wikiapiary link seems really great, but how do i get from https://wikiapiary.com/wiki/GATE# to the actual wiki? sorry if i'm being daft
[21:31:16] URL:
[21:31:16] http://gate.unigre.it/mediawiki/index.php/Main_page
[21:32:30] technically not an answer, but i take it (:
[21:32:56] Well, it lists the url on the page...
[21:33:24] where? i cmd-f'ed for url and there is no mention
[21:33:51] neither for unigre or http
[21:34:04] Um
[21:34:05] Name:
[21:34:05] GATE
[21:34:05] URL:
[21:34:06] http://gate.unigre.it/mediawiki/index.php/Main_page
[21:34:10] Reedy, anyway I found this neat unofficial apt repo, works fine for me :)
[21:34:12] https://vagrant-deb.linestarve.com/
[21:34:25] divadsn: Be careful using random apt repos
[21:34:28] On the overview tab...
[21:34:52] Reedy: i swear, that isn't there
[21:34:58] Reedy, I know, I already had some trouble when doing a dist-upgrade on an old Debian OpenVZ VPS ^^
[21:35:02] Reedy: do i have to be logged in?
[21:35:20] I'm not logged in
[21:35:32] But the packages have the same checksum as the ones from the official download page (:
[21:36:18] hop: https://phabricator.wikimedia.org/F11055630
[21:38:10] Reedy: what's there? that presents me with an option to register
[21:38:34] Try again
[21:41:52] Reedy: http://30hopsmax.at/~hop/wikiapiary.png
[21:42:10] Looks like a browser issue
[21:42:15] Look at the settings
[21:43:51] Reedy: ha, no.
it's a server problem: mixed http/https content
[21:44:04] Weird that it works fine in Chrome though
[21:44:04] together with forced https on the page itself
[21:44:13] unless HTTPSE fixes it
[21:44:31] Except it's not enabled
[21:46:20] for a second i thought it might be httpseverywhere, but it's inactive
[21:47:03] anyway, that's yak shaving
[21:49:55] Might want to report it to them :)
[21:52:11] i want to go to sleep, mainly
[21:57:44] hop, good night then
[21:59:45] divadsn: the implication was that i can't yet
[22:00:51] hop: oh, sorry, I sometimes don't see the irony
[22:02:24] divadsn: you and me both
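(Editorial aside on the mixed-content diagnosis at [21:43:51]: plain-http references on an HTTPS page can be spot-checked outside the browser with a rough scan like the one below. This is a naive illustration, not a replacement for the browser console; real pages deserve an HTML parser, and only active content such as scripts is actually blocked:)

```python
import re

def find_plain_http_refs(html):
    """Return http:// URLs referenced via src/href attributes; on a
    forced-HTTPS page these are mixed-content candidates the browser
    may refuse to load."""
    return re.findall(r'(?:src|href)=["\'](http://[^"\']+)["\']', html)
```

(Run against the page source in question, this would surface references like the plain-http gate.unigre.it URL embedded in the HTTPS wikiapiary page.)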