[00:01:32] opcache issues?
[00:04:21] Probably. By all rights, this shouldn't be busted. And I don't know how, but it looks like it's just calling /wiki/ve instead of /wiki/vendor/etcetcetc
[00:04:46] it's right in the ones above
[00:04:53] so it seems odd that one is suddenly truncated
[00:05:09] Yeah, that's why I'm really confused
[00:05:20] None of this should be stored in the database, right?
[00:05:53] $ php maintenance/eval.php
[00:05:53] > echo class_exists( 'Liuggio\StatsdClient\Factory\StatsdDataFactory' );
[00:05:53] 1
[00:06:09] Nope...
[00:06:31] Like I say, possibly opcache (what was apc historically)
[00:07:49] When you disabled extensions... did you restart your webserver before trying again?
[00:09:00] And similarly... when they are enabled, does it fail immediately after a reboot?
[00:09:06] No, I'll give that a shot
[00:10:04] For reference... WMF prod... https://github.com/wikimedia/puppet/blob/abb4f946b8d1784e2df56fcb6a0a3688edb9168b/hieradata/role/common/mediawiki/appserver.yaml#L50
[00:10:16] https://www.php.net/manual/en/opcache.configuration.php
[00:10:25] We bump 128 to 1024
[00:10:38] You probably don't need it quite that high... But we often run two versions of MW in parallel etc
[00:12:25] But if you're running a lot of extensions
[00:14:12] definitely isn't opcache
[00:14:43] Now granted, I took a fresh 1.35 and pointed it to the database I had ported over, since I frankensteined the files together before and quickly realized that was a bad idea
[00:15:12] But that shouldn't affect anything, since the path is the same
[00:17:07] Maybe there's a point I can dump a variable to see if the vendor folder is being overwritten or transformed?
[00:17:59] I tried to do that at the include function of the ClassLoader, but I couldn't return any data. I suspect that's intentional.
[00:24:08] I don't see how the ve bit would change
[00:24:28] I don't think we make the vendor dir configurable...
[00:24:29] // Load composer's autoloader if present
[00:24:29] if ( is_readable( "$IP/vendor/autoload.php" ) ) {
[00:24:29] require_once "$IP/vendor/autoload.php";
[00:24:29] } elseif ( file_exists( "$IP/vendor/autoload.php" ) ) {
[00:24:30] die( "$IP/vendor/autoload.php exists but is not readable" );
[00:24:32] }
[00:24:34] etc
[00:27:39] Oh
[00:27:51] jfolv_: Do I remember you saying you were extracting locally, and FTP-ing?
[00:28:31] I might have. I could try throwing the tarball up and unzipping on server.
[00:28:38] Possibly, yeah
[00:28:39] Warning: include(/(wiki path)/vendor/composer/../liuggio/statsd-php-client/src/Liuggio/StatsdClient/Factory/StatsdDataFactory.php): failed to open stream: No such file or directory in /(wiki path)/vendor/composer/ClassLoader.php on line 444
[00:28:40] ffs
[00:28:43] https://phabricator.wikimedia.org/T166969#3334000
[00:28:54] There are definitely people who have had trouble with this file in a similar workflow
[00:29:41] Oh my god. It's a 7zip bug isn't it?
[00:29:49] Or that, yes
[00:29:54] One of the two
[00:32:05] Well, I'll check it out.
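(For reference on the opcache discussion above: these settings live in php.ini or an opcache conf.d snippet. The sketch below uses illustrative values, not the exact WMF configuration; the only number quoted above is the bump of opcache.memory_consumption from the default 128 to 1024.)

    ; illustrative values only
    opcache.enable=1
    opcache.memory_consumption=256        ; default is 128 MB; WMF prod bumps this to 1024
    opcache.interned_strings_buffer=16
    opcache.max_accelerated_files=24000
    opcache.validate_timestamps=1

Restarting php-fpm or Apache after changing these (or after disabling extensions, as suggested above) clears the opcode cache.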
I'll try one I 7zip beforehand and one I unzip on server
[00:33:16] TBH, I took "checked the referencing file" as you'd properly checked the file :D
[00:38:00] Oh wow
[00:38:19] I looked right at it in my terminal and didn't pick up on the fact that it has no file extension
[00:38:25] heh
[00:38:31] that'll do it :)
[00:38:40] Yep, definitely archive bug
[00:38:53] I got that from unpacking on server though
[00:39:07] standard gunzip then tar -xvf
[00:39:19] you don't need to separately gunzip fwiw
[00:39:23] tar will do it
[00:39:39] Really? Didn't know that. I've been doing them separately. Could've saved some time haha
[00:39:49] yeah, been like that for years
[00:39:57] many years
[00:39:57] don't even need any extra arguments
[00:40:04] but that is curious
[00:40:10] What version of tar?
[00:40:11] legoktm: ^^
[00:40:47] 1.29
[00:40:53] GNU
[00:41:00] whaaaaat
[00:41:09] what OS?
[00:41:18] Ubuntu... 18.04?
[00:41:22] Pretty sure 18.04
[00:41:29] cat /etc/os-release
[00:41:39] Yeah, 18.04.5
[00:41:53] Created through AWS
[00:42:14] And you definitely untarred on the server? either originally, or just now to test?
[00:42:28] Just now to test
[00:42:39] Luckily I have some 18.04 vms around
[00:42:41] * Reedy also tries
[00:43:53] $ ls -al mediawiki-1.35.0/vendor/liuggio/statsd-php-client/statsd-php-client/src/Liuggio/StatsdClient/Factory/StatsdDataFactory.php
[00:43:54] ls: cannot access 'mediawiki-1.35.0/vendor/liuggio/statsd-php-client/statsd-php-client/src/Liuggio/StatsdClient/Factory/StatsdDataFactory.php': No such file or directory
[00:43:54] Hang on, I think I looked in the wrong directory. The server unpack has the correct files. I copied them over but they didn't successfully copy everything
[00:44:03] Wait, you just got the same thing?
[00:44:27] That path I used doesn't look right
[00:44:44] Reedy: you have two statsd-php-client directories
[00:44:44] Yeah, you doubled statsd-php-client
[00:44:56] $ ls -al mediawiki-1.35.0/vendor/liuggio/statsd-php-client/src/Liuggio/StatsdClient/Factory/StatsdDataFactory.php
[00:44:56] -rw-rw-r-- 1 reedy reedy 2718 Jun 23 01:27 mediawiki-1.35.0/vendor/liuggio/statsd-php-client/src/Liuggio/StatsdClient/Factory/StatsdDataFactory.php
[00:45:03] Yeah, definitely there
[00:45:07] Let me test this with 7zip now
[00:45:09] $ tar --version
[00:45:09] tar (GNU tar) 1.29
[00:45:42] Like I said, I probably looked in the wrong directory. I have a bunch of mediawiki 1.35-* installs from testing
[00:45:49] Should probably prune some of the old ones now
[00:46:22] ok, we've had actual bugs with the tarballs on Windows and macOS so that's why :p
[00:46:35] legoktm: 7zip on *nix does break too, I think though
[00:46:59] oh
[00:47:04] well I'll look into that later then
[00:47:30] Someone definitely reported 7z on either macos or *nix broke
[00:48:06] Good to know
[00:48:37] https://www.mediawiki.org/w/index.php?title=Template%3ADownloadMediaWiki&type=revision&diff=4154269&oldid=4134437
[00:49:08] Wow, I might need to take a break. When I copied over the server-unzipped data, I forgot to do /* after so I just copied the fresh unzip into a subdirectory of my wiki.
[00:49:17] That's why it still failed >.>
[00:50:46] Yep, just properly copied and everything works again
[00:50:50] Thank god
[00:51:06] Might be worth adding a caveat to the download/install page?
[00:51:43] Because I'm pretty sure I unpacked that first install locally and uploaded it
[00:52:01] In fact, just for science, I'll do it again
[00:53:42] * Reedy stands back
[00:53:57] Yep.
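(As noted above, GNU tar detects the gzip compression itself, so the tarball can be unpacked on the server in one step; the filename below is illustrative.)

    # one step; tar auto-detects the compression
    tar -xvf mediawiki-1.35.0.tar.gz
    # equivalent explicit form
    tar -xzvf mediawiki-1.35.0.tar.gz

Either form replaces the separate gunzip-then-tar sequence mentioned earlier.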
Unpacking with 7zip is bugged
[00:54:00] Same result
[00:54:30] That file has no extension and the Interface one is straight up missing
[00:54:50] I'm sure if I dove into the other directories, I'd find corrupted or missing pieces too.
[00:55:08] Aye, there'll likely be some
[00:55:17] especially deep nested/long resultant filenames
[00:56:33] Oh man, it's already right there on the download page, too. "It is not advised to use 7zip to decompress the tarball due to a bug with the PAX format."
[00:56:43] Well, this has been an exercise in me failing to read
[00:56:49] I blame stress
[00:56:53] you're not the first
[00:56:58] you won't be the last :)
[00:57:27] On the bright side, lesson learned
[00:57:46] see also, https://en.wikipedia.org/wiki/Rubber_duck_debugging
[00:57:51] !botbrain
[00:57:51] Hello, I'm wm-bot. The database for this channel is published at https://wm-bot.wmflabs.org/dump/%23mediawiki.htm More about WM-Bot: https://meta.wikimedia.org/wiki/wm-bot
[00:58:03] !rubberduck is https://en.wikipedia.org/wiki/Rubber_duck_debugging
[00:58:03] Key was added
[00:58:12] !dump
[00:58:12] For information on how to get dumps from Wikimedia Wikis, see http://meta.wikimedia.org/wiki/Data_dumps . For a how-to on importing dumps, see https://www.mediawiki.org/wiki/Manual:Importing_XML_dumps .
[00:58:50] Oh I love rubber ducking.
[00:58:54] It's really helpful
[00:59:56] Well, now back to re-enabling and updating extensions
[01:00:22] Thanks again, Reedy; you're a lifesaver as usual
[01:00:35] (And you too, legoktm)
[01:00:49] I did nothing :) it was all Reedy this time
[02:15:40] Is it possible in any way that a var_dump of the extension messages files could end up getting cached and spat out every page load?
[02:16:07] Because I think that's what I'm encountering here. All my custom extensions' message files are being outputted at the top of the pages in full JSON format
[02:16:14] But ONLY those
[02:17:05] I did some debugging earlier, but I'm 100% undoubtedly sure that the var_dump has been removed. I've grepped, grepped, and grepped some more to be sure.
[02:18:02] jfolv_: do you have filecaching enabled?
[02:18:16] $wgUseFileCache, because that could get cached
[02:18:21] Let me check
[02:20:36] Nope, even tried php maintenance/rebuildFileCache.php to be sure, and it tells me nothing to do
[02:21:30] Actually, looks like it can't possibly be a cache. I disable one extension and the JSON string disappears
[02:21:54] So now why are my custom extension message files being dumped raw?
[02:24:39] Here's the even weirder part. Sometimes it shows up, sometimes it doesn't.
[02:30:50] Alright, I've got a better idea. Let's tackle a different issue first and I'll come back to that later. For now, I've got an extension that adds parser functions but I get an error telling me "invalid magic word: sc" for this code https://pastebin.com/AZ2Ssc9P
[02:31:32] This extension was written by someone else and I'm trying to update it to the new format. I thought I followed the instructions
[02:32:52] I just realized there's no closing brace for the class in that pastebin, but it is there in the file. I just truncated the rest because it's redundant
[02:50:23] If it makes a difference, this is the error I get: https://pastebin.com/fJPu2hBK
[04:07:46] Took a break and came back, found that the parser hooks weren't working because I didn't have the $magicWords array set in a different file. This presents a question, though. How do you define both a messages file and a magic words file in the extension.json?
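(For anyone hitting the same question: a minimal extension.json sketch, with hypothetical extension and file names, that registers both an i18n messages directory and a magic words file, plus the parser hook used in the follow-up below.)

    {
        "name": "MyExtension",
        "MessagesDirs": {
            "MyExtension": [ "i18n" ]
        },
        "ExtensionMessagesFiles": {
            "MyExtensionMagic": "MyExtension.i18n.magic.php"
        },
        "AutoloadClasses": {
            "MyExtensionHooks": "includes/MyExtensionHooks.php"
        },
        "Hooks": {
            "ParserFirstCallInit": "MyExtensionHooks::onParserFirstCallInit"
        },
        "manifest_version": 2
    }

The MyExtension.i18n.magic.php file is where the $magicWords array mentioned above goes.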
[04:17:44] Nvm, figured that out too
[04:31:01] Is there some flag I have to set in the magic file to declare that I'm setting a tag? I swear I found it in the documentation but I can't find it again
[04:33:30] I checked the https://www.mediawiki.org/wiki/Manual:Tag_extensions manual, but it warns that it's outdated and following that format just makes MediaWiki break saying that my "magic words" aren't defined
[04:41:06] Figured that out too; was using setFunctionHook instead of setHook.
[05:07:26] is there any way to use templates from an interwiki? so like {{:wikipedia:foo}}
[05:08:37] aha, there's a setting for it
[05:09:45] Last I remember, that was really buggy
[05:13:07] it may be
[05:13:11] it's ok, this is only temporary
[05:13:19] I need it for testing only
[05:14:08] Ah, okay. If I remember correctly, our wiki network had an idea to try using that to use each others' templates, but we scrapped the idea because it was just a nightmare
[05:14:23] (This was years ago, though; it could be much better now)
[05:15:19] indeed that seems like the right choice :)
[05:15:45] the issue here is I am waiting for a patch to mediawiki core so I want to use a temporary patched mediawiki install until I can get the patch in for wikisource
[05:15:56] was hoping I could get it to work but might be hard
[05:16:35] there are other workarounds I can try
[05:20:57] basically I'm trying to call a parser function with an argument that's the transcluded content from a page. but the page doesn't exist on the temporary install, just wikisource
[05:21:10] I could just fetch the wikitext and insert it myself, seems easiest workaround
[06:59:41] I have an image with size 500x20 pixels
[07:00:48] when I put it in [[File:Image.png||Caption...]]
[07:01:00] and MediaWiki upscales it to be really big
[07:01:16] how do I avoid that?
[07:01:29] I have problems with the server, so MediaWiki cannot make a thumbnail
[07:06:20] basically, how do I make an image fit on the page if it is too large?
[13:16:16] Hi, I want to show as part of a Cargo query whether _pageName is part of a category (a predefined one). Does anyone have a simple idea how to do that?
[13:17:39] I tried to do that by creating a list of the pages in the category using DPL (userformat), format=,%PAGE% .., and using CONCAT(#arraysearch:_pageName), but something seems to not be working
[13:18:12] Is there anything similar to: {{page_in_category:_pageName, categoryName}}?
[17:58:13] bd808: Re: your Vagrant email, I get a nice Restricted Application message at that link
[17:59:07] Lcawte: ack. You are the second person to report that. I'm trying to figure out why. It looks like some folks can see it and others can't but I'm not sure why yet.
[18:04:38] Lcawte: can you see the poll at https://phabricator.wikimedia.org/T265164 ?
[18:05:11] Nope
[18:05:21] hmmm. mysterious!
[18:05:29] Has someone changed the slowvote app permissions to staff/admins only or something?
[18:05:53] I was just wondering if you need to be in one of the acl* groups
[18:06:03] * bd808 looks for config settings
[18:06:08] https://phabricator.wikimedia.org/applications/view/PhabricatorSlowvoteApplication/
[18:06:37] ah ha! You need to be in Trusted-Contributors to use it
[18:06:54] ... am I not trusted? *storms out*
[18:07:23] * bd808 pushed buttons
[18:07:30] Lcawte: you are now trusted :)
[18:10:06] ty ;) Did my URL hack actually work? I assumed it hasn't changed from whatever it is in the ShoutWiki Phabricator.
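(Picking up the setHook/setFunctionHook distinction from earlier: a rough PHP sketch, with hypothetical class and hook names matching the extension.json sketch above. Tag extensions registered with setHook() do not need a magic word entry; parser functions registered with setFunctionHook() need one in the i18n.magic.php file.)

    <?php

    class MyExtensionHooks {
        public static function onParserFirstCallInit( Parser $parser ) {
            // <sc>...</sc> style tag: registered with setHook(), no magic word required
            $parser->setHook( 'sc', [ self::class, 'renderScTag' ] );
            // {{#myfunc:...}} style parser function: 'myfunc' must be declared in
            // $magicWords inside MyExtension.i18n.magic.php
            $parser->setFunctionHook( 'myfunc', [ self::class, 'renderMyFunc' ] );
        }

        public static function renderScTag( $input, array $args, Parser $parser, PPFrame $frame ) {
            // $input is the raw text between the opening and closing tags
            return htmlspecialchars( $input );
        }

        public static function renderMyFunc( Parser $parser, $arg = '' ) {
            // parameters after $parser are the pipe-separated arguments
            return htmlspecialchars( $arg );
        }
    }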
[18:10:21] Lcawte: it went right to the place :)
[18:23:41] I'm running a poll on MediaWiki-Vagrant adoption at T265164. Your participation would be appreciated!
[18:23:42] T265164: Evaluate usage of MediaWiki-Vagrant by technical contributors - https://phabricator.wikimedia.org/T265164
[21:26:15] If you are a "Trusted-Contributor" [1], can you please add me? (LDAP User: Freephile MediaWiki User: GregRundlett)
[21:26:15] [1] https://phabricator.wikimedia.org/project/profile/3104/
[21:27:31] I want to respond to Bryan Davis' poll about MediaWiki-Vagrant
[21:27:39] Added
[21:27:52] Thanks #Reedy
[23:36:27] So, I'm using a transclusion template on page A which includes page B, which uses the same template to include page C -- but that's as far as it goes; it's not really a loop... but I'm still getting the "Template loop detected" error.
[23:36:29] Any ideas?
[23:41:11] Oh, I think I see what's going on -- the template in page B is operating relative to page A, instead of relative to page B. Meh.
[23:41:56] That's... a problem. :-/
[23:43:59] woozalia: you can think of template inclusion as just copying the internal wikitext, not the content
[23:46:37] Special:ExpandTemplates can help seeing how the page gets rendered