[08:01:19] Hi, have an API question
[08:02:05] At https://www.mediawiki.org/wiki/Extension:CentralAuth/API, how can I do list=globalallusers with only a user ID? All options seem to point to usernames
[11:20:53] Hello folks. I was wondering - why is the code for mediawiki & co not hosted on github, but rather gerrit?
[11:20:53] For context - it took me a while to set up everything to be able to contribute to the code, whereas github was already set up and is used by a lot of developers.
[11:29:46] armin_r: because that would force our developers to use proprietary software and not give us full control of our data, while hosting it on our own servers does not have those downsides
[11:34:37] Majavah - could you explain that a bit more? Why is it bad to use proprietary software?
[11:34:38] From my perspective, the goal is to write good software that empowers Wikipedia. Sticking to OSS would be a secondary goal. Also, IMO, the best OSS is privately supported (e.g. Linux via Red Hat / MS / Canonical etc.)
[11:39:28] https://meta.wikimedia.org/wiki/Mission
[11:39:52] Majavah: GitHub exists as an on-premises solution for those who need it
[11:40:22] and https://meta.wikimedia.org/wiki/Right_to_fork
[11:44:21] Nemo_bis: for clarification, you referenced the "Right to fork" as an argument _for_ GitHub, right?
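Regarding the opening question about list=globalallusers: that module only pages by username (agufrom/aguto), but a single global account can be looked up by its numeric ID through meta=globaluserinfo instead. A minimal sketch, assuming the CentralAuth guiid parameter behaves as documented (verify against the live API help):

```python
# Sketch: look up a global account by numeric ID via meta=globaluserinfo,
# since list=globalallusers only paginates by username (agufrom/aguto).
# Assumption: the CentralAuth API's guiid parameter accepts a global user ID.
from urllib.parse import urlencode

API = "https://meta.wikimedia.org/w/api.php"

def globaluserinfo_url(user_id: int) -> str:
    """Build a query URL for a single global user by numeric ID."""
    params = {
        "action": "query",
        "meta": "globaluserinfo",
        "guiid": user_id,               # numeric global user ID
        "guiprop": "groups|merged",     # extra properties to fetch
        "format": "json",
    }
    return API + "?" + urlencode(params)

print(globaluserinfo_url(12345))
```

Fetching the URL with any HTTP client then returns the account's name, groups, and attached local accounts; there is no equivalent ID-ordered enumeration in globalallusers itself.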
That's what I understand from the document, specifically the _right to fork towards [mediawiki]_ part
[11:49:48] Leaderboard: which still is non-free (both as in price and freedom), which would require us to make our tools for software that we could lose the right to use at any moment and not give us ability to modify it if needed
[11:53:28] armin_r: https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Guiding_Principles#Freedom_and_open_source explains my thoughts about it fairly well
[11:54:17] Majavah: "we could lose the right to use at any moment and not give us ability to modify it if needed" - that is a common anti-pattern of software development: over-engineering something for _possible_ future needs, and not-invented-here syndrome.
[11:54:18] Github uses git for versioning, so getting the code in/out is trivial.
[11:54:18] Import / export / sync of issues / discussions and projects is also available.
[11:54:19] Any external CI tooling can be used, if you don't want to use Github Actions (which are fairly new anyway).
[11:54:19] You could at any moment switch to an OSS alternative, given the right setup.
[11:56:04] https://mako.cc/writing/hill-free_tools.html
[11:57:29] armin_r: it's not a hypothetical situation, github has removed projects before, and even entire organizations, when one of their members tried to use github from a country under US trade sanctions
[11:58:39] armin_r: updating all of our tooling, and all contributors migrating the URLs of their clones and switching to different APIs, is non-trivial
[12:00:10] Majavah:
[12:00:11] re guiding principles - that's a good read, thanks. "Freedom and open source" seems to contradict "Serving every human being". E.g. "We endeavour to create the structural support and the necessary preconditions for bottom-up innovation by others." But I assume "Freedom and open source" supersedes the latter.
[12:01:09] Majavah trade sanctions is a fair point, didn't know about that! Thanks!
How about gitlab then? That is open source, and the major competitor to github, with a large community behind it.
[12:02:19] armin_r: we're already in progress of moving to a self-hosted gitlab instance :P
[12:02:43] Majavah I don't dispute that migrating away from Phabricator / Gerrit would be a non-trivial effort. I do believe that the benefits would outweigh the initial cost.
[12:02:44] ATM, the barrier to entry for code contributions is high and very different from what _I_ would consider industry standard.
[12:03:30] Majavah "we're already in progress of moving to a self-hosted gitlab instance :P" well, that's awesome! Do you have an ETA on that? No pressure, just curious about it.
[12:03:54] (btw, gitlab ci has some awesome features, I really like it)
[12:05:15] armin_r: still in very early stages for now, https://www.mediawiki.org/wiki/GitLab/Roadmap is the best I can give you
[12:05:43] phabricator is out of scope for this migration, at least for now
[12:06:40] one step at a time :)
[12:07:17] Oh, and squashing is on the horizon too, that's awesome! 😍
[12:09:54] Majavah is there an official channel to listen in / pitch in on these technical discussions and decisions?
[12:09:55] e.g. I would have liked to pitch in on the "modern javascript library" discussion - the assessment that only react.js and vue.js are valid candidates is correct, but I believe the arguments for Vue.js are faulty.
[12:17:32] armin_r: wikitech-l mailing list, and the newly reformed decision-making process https://phabricator.wikimedia.org/tag/tech-decision-forum/
[12:18:09] Majavah thanks, and thanks for the nice discussion!
[12:18:52] Majavah which mailing list do you suggest?
[12:24:02] armin_r: wikitech-l, https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[12:25:16] Thanks!
[12:25:16] I got to https://www.mediawiki.org/wiki/Developer_Advocacy/Developer_Portal and the first 2 quotes immediately resonated with me:
[12:25:17] "Upon arriving on the MediaWiki and Wikitech home pages, I was instantly lost."
[12:25:17] "MediaWiki documentation is not only infamously incomplete, but also terribly scattered."
[19:18:59] RhinosF1: what's $wgMainCacheType
[19:19:35] Nemo_bis: https://github.com/miraheze/mw-config/search?q=MainCacheType&type=
[19:19:37] (on your wikis affected by https://phabricator.wikimedia.org/T277417#6912139 )
[19:20:08] I'm confused, Miraheze survived thus far without caching? :)
[19:20:51] Nemo_bis: I'm confused but instant commons ain't cached
[19:21:09] memcached https://github.com/miraheze/mw-config/blob/master/GlobalCache.php
[19:21:57] uh but not for $wgParserCacheType
[19:21:58] That's where it is
[19:22:11] Memcache is new
[19:22:16] It used to be redis
[19:22:55] Nowadays it should be easy to use CACHE_ACCEL for small wikis, but miraheze got quite large
[19:23:13] I'm not sure what's the eviction for DB $wgParserCacheType
[19:23:29] Neither do I
[19:23:33] paladox: ^
[19:23:49] ?
[19:23:49] See -operations for first half of convo
[19:24:11] paladox: talking about instant commons not seeming to be cached and bringing us down
[19:24:27] And probably other third party wikis
[19:25:00] oh
[19:25:08] If you have default InstantCommons (hotlinking the images rather than thumbnailing them locally), the parser cache should make sure you're not calling the Commons API all the time
[19:26:49] https://github.com/wikimedia/mediawiki/blob/bbfc91be430d60e4f7f5271836bc183087c1350b/includes/Setup.php#L393 according to that the api cache is disabled
[19:26:54] unless i'm reading it wrong
[19:27:05] https://github.com/wikimedia/mediawiki/commit/a75973d8833277eb0683749a37bb0d517cdd3116