[02:56:40] Hi, I'm trying to get the umask for MediaWiki or Apache to be 027. I've tried adding a umask to the systemd unit and then the LocalSettings.php file, with unexpected results. It seems to work for directories, but files that are created are still -rw-r--r--. Any ideas what might cause this?
[02:57:14] Files that are uploaded to the images directory.
[02:57:56] I did them individually but cannot seem to affect file creation perms.
[09:38:22] addshore: should this be closed or is there work remaining? (Babel DB) https://phabricator.wikimedia.org/T243726
[09:39:36] Krinkle: resolved
[09:41:54] Thx :)
[10:00:15] Hi, I'm having issues setting up Extension:Math with Mathoid on my server... When I try rendering a <math> tag it generates a fatal exception of type "Wikimedia\Rdbms\DBQueryError". I've tried setting $wgShowExceptionDetails = true; but it gives me a white page when crashing instead.
[10:01:10] This is my LocalSettings.php: https://paste.ubuntu.com/p/WmTB6qz5d4/
[10:19:03] Never mind, entirely my fault... I forgot to run update.php
[11:40:37] Trying to get LDAP login working after an upgrade (which requires me to use the new LDAP stack) - "[PluggableAuth] ERROR: Please choose a valid domain" - any tips/examples/pointers?
[12:48:15] Ah, one can set a default domain ... now things work ... onto the next plugin ...
[13:12:59] Hey all, I'm following up on a question from yesterday that may have gotten lost in the shuffle. We're seeing 50+ JS loads on our main page alone, compared to Wikipedia's under 10. Is it possible we have something misconfigured with our Varnish or MediaWiki configs? https://i.imgur.com/h3CjL4z.png
[14:46:40] justinl: looks like ResourceLoader isn't bundling your stuff like it should
[14:58:17] Betacommand: what would cause it to do that? I'm not very familiar with the details of ResourceLoader.
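[Editor's note on the umask question at the top of the log: a likely cause, offered as an assumption rather than a confirmed diagnosis, is that MediaWiki's filesystem backend chmods uploaded files itself after writing them, so a systemd UMask= or PHP umask() never shows up in the result, while directories honour $wgDirectoryMode, which is why only directories appeared to respond. A hedged LocalSettings.php sketch:]

```php
// Hedged sketch for the umask question above; verify the knobs against
// your MediaWiki version before relying on them.

// Directories: MediaWiki applies $wgDirectoryMode (default 0777) when it
// creates upload directories, so this is the setting that made dirs work:
$wgDirectoryMode = 0750;

// Files: the filesystem backend (FSFileBackend) chmods uploads to its own
// 'fileMode' (default 0644) after writing, which overrides the process
// umask. If your release supports it (assumption - check FSFileBackend /
// FileBackendGroup in your tree), the mode can be passed via repo info:
// $wgLocalFileRepo['fileMode'] = 0640;
```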
[14:58:32] FWIW I just upgraded from MW 1.30 to 1.34 and SMW 2.5 to 3.1
[14:59:38] !resourceloader
[14:59:38] ResourceLoader is the delivery system for JavaScript/CSS in MediaWiki. First released in MediaWiki 1.17. See also https://www.mediawiki.org/wiki/ResourceLoader , https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_%28users%29 and https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_for_extension_developers
[15:00:56] I've been managing these wikis since around MW 1.16 and never really had to dig into this before. I don't think this problem has existed in our wikis until now, but I could be wrong. I'll take a look at the links you provided. Thank you.
[15:06:57] Not even debug mode does that, AFAIK
[15:07:13] 'Cause the JS seems minified
[15:13:17] One of our editors mentioned that it seemed like debug mode was on, but I didn't see anything enabled that would do so, to the best of my current knowledge.
[15:15:43] $wgResourceLoaderDebug would be the related variable
[15:16:02] But like I say, if $wgResourceLoaderDebug was true, it wouldn't be shipping you minified JS (unless the source file was minified)
[15:16:40] is $wgResourceLoaderMaxQueryLength set to some very low value?
[15:17:09] Default is 2000...
[15:17:13] So like, <100? :P
[15:17:58] i can't think of anything else that would cause this behavior
[15:19:10] justinl: ^
[15:31:13] None of the ResourceLoader variables have been changed from their default values.
[15:31:59] Timo would probably be the person you want to ask, but he's marked /away atm
[15:32:01] actually I take that back
[15:32:04] # Query string length limit for ResourceLoader. You should only set this if
[15:32:04] # your web server has a query string length limit (then set it to that limit),
[15:32:04] # or if you have suhosin.get.max_value_length set in php.ini (then set it to
[15:32:04] # that value)
[15:32:04] $wgResourceLoaderMaxQueryLength = -1;
[15:32:14] That'll be it
[15:32:22] eh syntax
[15:32:24] I suspect -1 doesn't make it unlimited
[15:32:43] Hmm, maybe the behavior changed. Anyway, I'll dig into that on my dev wikis. Thanks for the pointer!
[15:33:34] // If the url would become too long, create a new one, but don't create empty requests
[15:33:34] if ( currReqModules.length && l + bytesAdded > mw.loader.maxQueryLength ) {
[15:33:34] // Dispatch what we've got...
[15:33:35] doRequest();
[15:34:51] wtf Reedy - so many spaces
[15:35:00] DSquirrelGM: Copy-pasted from the source
[15:35:05] The only recent change is https://github.com/wikimedia/mediawiki/commit/194bdc09a8cc36741b9595eefc7b1aab26187b8b
[15:35:11] But that's in 1.33 too
[15:39:04] Interesting, https://www.mediawiki.org/wiki/Manual:$wgResourceLoaderMaxQueryLength says the default value is -1, but it's 2000 in DefaultSettings.php as of 1.34 at least.
[15:39:15] Commenting out my setting fixed my dev wikis.
[15:40:21] let's see
[15:40:53] It's false in 1.33
[15:41:34] And nothing in RELEASE-NOTES
[15:42:12] And false in 1.31
[15:42:28] # grep wgResourceLoaderMaxQueryLength DefaultSettings.php
[15:42:28] $wgResourceLoaderMaxQueryLength = 2000;
[15:44:05] $ git blame includes/DefaultSettings.php | grep ResourceLoaderMaxQueryLength
[15:44:05] 3ac385a0c39 (Timo Tijhof 2019-08-31 23:30:22 +0100 4072) $wgResourceLoaderMaxQueryLength = 2000;
[15:44:20] https://github.com/wikimedia/mediawiki/commit/3ac385a0c39
[15:45:27] justinl: For clarification, was it explicitly set in your LocalSettings.php?
[15:46:35] Yes, though I don't recall why. I don't have that change history any more since I'd ported our Salt repo from our Perforce server to GitHub.
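[Editor's note: the diagnosis above reduces to a one-line config fix. The -1 override was valid advice when the core default was false, but under 1.34 it makes ResourceLoader issue a separate request per module. A sketch of the before/after in LocalSettings.php:]

```php
// Before: legacy override, set years ago per the old Manual page advice.
// Under MW 1.34 this disables request batching entirely:
// $wgResourceLoaderMaxQueryLength = -1;

// After: delete the override and inherit the 1.34 default of 2000, or set
// it explicitly only if your web server really limits query string length:
$wgResourceLoaderMaxQueryLength = 2000;
```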
[15:47:51] I've updated https://www.mediawiki.org/wiki/Manual:$wgResourceLoaderMaxQueryLength
[15:49:36] I wonder if https://gerrit.wikimedia.org/r/#/c/mediawiki/core/+/533722/3/includes/Setup.php is what bit you specifically
[15:49:41] I'd done a pretty thorough review of all of my LocalSettings.php stuff for this upgrade, especially since I was taking the opportunity to merge a whole lot of dev and live static LocalSettings.php files into a single SaltStack Jinja template.
[15:54:01] It looks like it was set to false in the MW 1.30 DefaultSettings.php, so that block of code would have indeed done the right thing. Its removal was causing the value to stay at our -1 setting, so yeah, sounds right.
[15:59:11] > If set to a negative number, ResourceLoader will assume there is no query string length limit.
[15:59:15] That's wrong
[15:59:40] So I guess you set it per the docs
[16:00:27] Yeah, I must've set it years ago.
[16:00:57] Well, I'm glad my issue helped find this documentation issue so it'll help people going forward.
[16:11:49] I've made some updates and I've poked the person who made the change to review and make further changes
[16:20:26] On a totally unrelated subject, how well does MW work under Nginx as compared to Apache? I'm looking at significant architectural changes to my MW environments this year, now that we're upgraded to 1.34 in AWS, and as part of possibly splitting off Varnish and PHP-FPM to their own sets of servers, I was considering the general performance differences of Nginx vs. Apache. Is there any strong reason NOT to do that?
[16:27:02] Probably splitting the frontend cache to separate boxes will be beneficial
[16:28:42] Currently I have Varnish + Apache + PHP-FPM stacks on a few EC2 instances, which is basically what I've had since the beginning, but I'd recently read that multiple Varnishes could be made to look like a single instance (not sure how to do that yet), so doing so could allow for better app and OS tuning as well as instance types better aligned with app needs.
[16:29:10] The main issue with Varnish in the past, esp. with respect to how I configure my systems with SaltStack, has been $wgSquidServers.
[16:29:31] If that could be made a single unchanging value, that would be so much simpler.
[16:31:05] And then splitting off PHP would help improve actual web server tuning and performance, and, given how huge and complex our wikis are with SMW and layers of templates, even small changes can create tens of thousands of jobs, so a PHP farm could also be better designed to handle the job queue more efficiently.
[16:34:57] But going back to my original question, are there any significant benefits, negatives, or tradeoffs in moving from Apache to Nginx as far as MediaWiki is concerned?
[16:35:57] well, Apache works for Wikipedia ;p
[16:38:05] Yeah, I know it's typically the standard server to use for MW, but I hadn't looked into it in a while and didn't know if anything had changed over the last few years to warrant serious consideration of Nginx instead. I'm fine staying, considering our current Apache config works well; I just wanted to make sure I did my due diligence researching all components of our wikis when designing our new environments.
[16:38:47] The other big part I'm trying to do is eliminate redundancy. I have 7 wikis, so 7 copies of MediaWiki per web server.
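[Editor's note on the $wgSquidServers pain point above: MW 1.34 renamed the Squid-era settings to CDN names, and the "single unchanging value" wish is partly achievable on the trust side, where CIDR ranges are accepted. A hedged sketch; the IPs and subnet are assumptions:]

```php
// Hedged sketch; MW 1.34 renamed the Squid-named variables
// ($wgUseSquid -> $wgUseCdn, $wgSquidServers -> $wgCdnServers, etc.),
// keeping the old names as deprecated aliases for a while.
$wgUseCdn = true;

// PURGE requests are sent to each listed cache, so these generally stay
// per-instance addresses (hypothetical IPs; a Salt/Jinja template can
// render the current Varnish fleet here):
$wgCdnServers = [ '10.0.0.10', '10.0.0.11' ];

// The trust side CAN be one unchanging value: a CIDR range covering the
// cache subnet decides which X-Forwarded-For chains to believe.
$wgCdnServersNoPurge = [ '10.0.0.0/24' ];
```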
[16:40:00] It's gotten much easier to manage over the years, though, both using SaltStack and with a couple of build-and-deploy shell scripts I wrote that create an entire /var/www/sites directory containing all 7 wiki subdirs, each a copy of MW with its specific config, and then rsync them to the appropriate set of web servers.
[16:41:33] One thing I may test, though, is moving to a single extensions directory that /var/www/sites/$wikiname/extensions links to, rather than needing 7 whole copies of it. I'd considered having a single MW directory using the "wiki family" approach, but that would affect my Apache vhost configs, so simplifying stuff underneath is the next best thing.
[16:42:05] Symlinks should work fine
[16:42:19] You can do git linked clones too.. depending on how you're setting things up
[16:42:31] So you're not carrying N git repo clones etc
[16:42:36] I do that with the images dir now; /var/www/sites/$wiki/images is a link to /var/www/images/$wiki, where /var/www/images is an AWS EFS NFS mount.
[16:45:09] Well, the problem with the git part is that the configurations like LocalSettings.php, robots.txt, and a couple other things in each wiki's top-level dir are managed by Salt on the "wiki manager" server, where I build the sites directory and deploy it to the web servers. So if I want to upgrade, say to 1.34.1 or 1.35, I build a new directory on the build server, run Salt there, then do the rsync script.
[16:59:23] Actually, now that I'm thinking about it more carefully, the git linked clone idea is starting to make a lot of sense, but there are details to figure out. If I understand, upgrading base MediaWiki would be basically "git pull" and "git checkout ", right?
[17:00:09] I'd still need to figure out how to handle skins and extensions, esp. extensions that must be installed via composer.
[17:06:03] Installing extensions via composer?
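[Editor's note: the shared-extensions symlink idea discussed above can be sketched in a few lines. Paths and wiki names are hypothetical; a scratch root stands in for /var/www so the sketch is runnable anywhere:]

```shell
# Hypothetical layout: one shared extensions checkout, symlinked per wiki.
ROOT=$(mktemp -d)                    # stand-in for /var/www
mkdir -p "$ROOT/shared/extensions"   # single checkout of all extensions

for wiki in wiki1 wiki2 wiki3; do
  mkdir -p "$ROOT/sites/$wiki"
  # -n replaces an existing link rather than descending into it,
  # so reruns of the deploy script stay idempotent
  ln -sfn "$ROOT/shared/extensions" "$ROOT/sites/$wiki/extensions"
done

ls -l "$ROOT/sites/wiki1/extensions"
```

The same pattern is what the log already uses for images ($wiki/images pointing at an EFS mount), so it extends the existing convention rather than introducing a new one.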
[17:06:05] * Reedy shudders
[17:12:43] Unless there's a better way, I've needed it for SemanticMediaWiki, SemanticResultFormats, Maps, PageForms, Validator, smarty, and most recently, due to a bug, AntiSpoof has some dependencies in its own composer.json
[17:13:02] All others I just git clone out of gerrit.
[17:13:52] Yeah, extension dependencies are fine
[17:14:12] It was decided a long time ago not to support installing extensions via composer... But some groups carry on the practice...
[17:14:45] Semantic MediaWiki has been the source of much complexity.
[17:18:48] Honestly, our wiki configs and procedures could probably stand some expert review. :P
[17:19:41] If you don't mind a slightly bloated vendor checkout on some wikis.. you could use/abuse composer-merge-plugin to grab all the dependencies from the different extensions on different wikis and combine them
[17:19:44] Even though I've been managing them for years now, and built them up almost entirely from scratch, it's been quite the learning process and I've generally been the only one doing it. They've iteratively gotten better, but I'm sure there are still lots of places with room for improvement.
[17:20:05] There's numerous things that don't really have "best practices"
[17:20:13] I mean, have you seen how we configure Wikipedia et al? :P
[17:20:36] Yeah, but Wikipedia is its own beast, with scale requirements that frighten me.
[17:21:55] Our biggest wiki is "only" around 220k pages and about 82k uploaded images. :P
[17:23:38] As for the composer thing you mentioned, I've only worked with it enough to get it working for our wikis. I'm not a PHP dev and don't really understand the intricacies of composer.json, etc. I've gotten to where I'm creating a composer.local.json for each wiki, generated by the build script from templates, and then running composer against each directory, but that's about as deep as I've gotten.
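[Editor's note on the composer-merge-plugin suggestion above: MediaWiki core's composer.json already wires in wikimedia/composer-merge-plugin and reads composer.local.json, so per-wiki extension dependencies can be combined by listing the extensions' composer.json files there. Core even ships a composer.local.json-sample with essentially this content. A sketch, run from a wiki's MediaWiki root (a temp dir here so it runs anywhere):]

```shell
# Sketch: generate a composer.local.json whose merge-plugin glob pulls in
# every extension's dependency list, so one "composer update" resolves
# SMW, Maps, PageForms, etc. together. cd into a scratch dir for the demo:
cd "$(mktemp -d)"

cat > composer.local.json <<'EOF'
{
  "extra": {
    "merge-plugin": {
      "include": [
        "extensions/*/composer.json"
      ]
    }
  }
}
EOF

# Then, in a real wiki root (not run here):
#   composer update --no-dev
```

In the Salt setup described above, this file is exactly the kind of thing the build script can template per wiki.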
[21:24:19] Hi, I'm having trouble getting Mathoid and VisualEditor to play nice together... <math> tags don't render until I "Save changes". Is this because I'm using Mathoid's "cli.js" instead of running a server?
[21:29:26] divx: Yes.
[21:29:47] I'm not sure if it's possible to do it otherwise.
[21:30:18] Alright, thank you. I'll try setting that up now
[21:32:24] James_F: Do I also need RESTBase or is that unrelated?
[21:32:50] divx: I think using RESTBase is easier than trying direct linkage.
[21:33:11] But all of this stuff is currently subject to flux as we replace technologies.
[22:01:35] Hi guys, I'm trying to use Replace Text, but it's only working for 1-2 pages. If I try to run it for 2k+ pages, the command-line replaceAll.php simply freezes without an error. And the special page only replaces 20 pages, even though it shows that it finds a lot more. Does anyone have an idea what the problem could be?
[22:04:58] Should I worry about the following? [core:error] [pid 19560:tid 1576] (20024)The given path is misformatted or contained invalid characters: AH00127: Cannot map GET /mediawiki/index.php/Special:RecentChanges HTTP/1.1 to file, referer: http://localhost/mediawiki/index.php/Main_Page
[22:19:40] Anyone?
[22:39:08] James_F: Well, I managed to get it running without setting up a RESTBase instance. I just didn't specify $wgMathFullRestbaseURL in LocalSettings.php and it seems to work
[23:35:09] ..?
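[Editor's note for anyone landing on the Mathoid thread above: the working no-RESTBase setup divx describes corresponds roughly to pointing the Math extension straight at a running Mathoid service. A hedged LocalSettings.php sketch; the URL and port are assumptions (10044 is Mathoid's conventional default), so check your Mathoid config:]

```php
// Hedged sketch of a direct Math-to-Mathoid setup, no RESTBase.
wfLoadExtension( 'Math' );

// Render via MathML/SVG through Mathoid rather than cli.js:
$wgMathValidModes = [ 'mathml' ];
$wgDefaultUserOptions['math'] = 'mathml';

// Point the extension at the local Mathoid server (assumed port):
$wgMathMathMLUrl = 'http://localhost:10044/';

// Deliberately leave $wgMathFullRestbaseURL unset, as in the log above.
```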