[00:47:00] Jdlrobson: trying to isolate the mw.util.$content issue. I guess something is (wrongly) using it at load time. Passes in CI on mediawiki-core plainly and for me locally with various extensions. I guess one of the gated extensions is at fault. Which ones do you have installed locally? [00:47:39] I just installed Minerva and noticed its tests fail to load due to "TypeError: M is undefined" - I guess a MF dependency may have snuck in? I'll install MF now, but fyi in case that isn't intended. [01:12:44] k, I found it. some non-test code is loaded calling mw.notify() [01:12:46] oh well, easy fix [14:32:01] Hi there, with [[Extension:External Data]], how do I specify HTTP headers like "Accept" and "Accept-Language"? [14:41:32] Hm… Guess, MediaWiki [[Extension:External Data]] doesn't support specifying headers, nor does it set the "Accept" header according to format :S [14:43:32] adfeno - I am looking now. Hold on and let me see if I find anything [14:48:11] adfeno - Look through the conversations here https://www.mediawiki.org/wiki/Extension_talk:External_Data and if you do not see an answer, you can ask it on that page. [15:00:08] Galendalia: Thanks, I'll look that up [15:00:20] I must test it somewhere [15:01:35] Yeah you can talk to the people in there (including the creator) s/he may have a place to do so. [15:17:23] Galendalia: Is there a wiki with a recent version of [[Extension:External Data]] enabled, and that allows unregistered users to make a Sandbox page with any query? [15:17:55] I must test something but the wikis I know of don't have an updated version :S [15:19:34] Or can anyone confirm that that extension now has a |options= argument which accepts declaring HTTP headers through arrays? [15:20:05] (In which case I don't need to test it myself) [15:22:25] adfeno - again that is your best place to ask those questions [15:30:48] I have a wiki under a domain, say, example.com (full url will be example.com/wiki). 
I made a subdomain for my wiki, wiki.example.com, that has a documentroot in /var/html/www/wiki. Now, if I access wiki.example.com, it will just redirect me to example.com/wiki. How can I make it so the wiki is accessible both from example.com/wiki and wiki.example.com, without any redirecting? [15:33:31] Hi Jedenastka - Wikipedia is not a webhost solution. We can only support Wikipedia and help out with anything that is within our namespace. [15:34:22] Galendalia: this is the MediaWiki support channel :/ [15:34:46] which is still the same thing and part of wikipedia [15:35:38] wait what, mediawiki is not wikipedia [15:35:53] No it isn't? MediaWiki is the software used by Wikipedia, other WMF projects and many other third-party sites [15:36:06] mediawiki is used by wikipedia and partially made by wikipedia [15:36:20] but it isn't wikipedia [15:36:54] It is all under WMF :) [15:36:56] We even have a page explaining the differences at https://www.mediawiki.org/wiki/Differences_between_Wikipedia,_Wikimedia,_MediaWiki,_and_wiki [15:39:10] Jedenastka - https://www.mediawiki.org/wiki/Project:About#What_MediaWiki.org_is_not [15:40:00] you are only pointing out your own mistakes... [15:41:29] for anybody having a small irc client window, there is a question up there. just sayin. [15:43:28] jedenastka - I have asked an admin to intervene. They will be with you shortly [15:43:48] jedenastka: Technically you could do some hacking in LocalSettings.php that changes $wgArticlePath based on $_SERVER['HTTP_HOST'], but not sure if that would break something [15:44:25] Galendalia: if a question about configuring MediaWiki isn't suitable here, where would it be? [15:46:38] jedenastka: I guess that redirect is made by the webserver itself (apache, nginx...), not MediaWiki [15:46:49] Have you checked your configuration? [15:48:03] Vulpix: it isn't - i checked: the virtualhost is basically the same for every subdomain, and every other subdomain doesn't redirect me. 
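[Editor's note: the LocalSettings.php hack Majavah alludes to above could be sketched roughly as below. This is a minimal, untested sketch assuming both vhosts point PHP at the same MediaWiki install; as noted in the discussion, it may break canonical URLs, caching, or cookies, and the redirect seen here may come from the webserver rather than MediaWiki.]

```php
// LocalSettings.php fragment (hypothetical): serve the same wiki from
// both wiki.example.com and example.com/wiki without a redirect, by
// switching the URL-related config on the requested host.
if ( isset( $_SERVER['HTTP_HOST'] ) && $_SERVER['HTTP_HOST'] === 'wiki.example.com' ) {
	$wgServer = 'https://wiki.example.com';
	$wgScriptPath = '';          // MediaWiki at the subdomain root
	$wgArticlePath = '/$1';
} else {
	$wgServer = 'https://example.com';
	$wgScriptPath = '/wiki';
	$wgArticlePath = '/wiki/$1';
}
```

As brennen points out later in the log, MediaWiki still generates absolute URIs from $wgServer, so one host remains authoritative per request.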
[15:49:06] Majavah: so it isn't just an option and I must do it myself? ok so, I will try. [15:49:59] jedenastka: afaik that isn't supported by MW itself [16:03:17] majavah - when I come back can I ping you with some questions I have please? [16:04:05] Galendalia: sure, given it's not too late here in Europe then [16:04:41] okies. I just need to take my counter vandalism final - shouldn't be too long [16:05:09] sure, ping me when you need [16:05:25] Thanks [16:20:01] jedenastka: i think you're going to have to accept there will be one authoritative URI for your wiki. mediawiki will generate URIs based on the $wgServer setting [16:22:28] that is, as soon as someone navigates to the 'other' URI, they'll get pushed over to the $wgServer either immediately, or after one click [18:10:47] duesen: reminder on https://phabricator.wikimedia.org/T250261 - would be good if we can close up that incident by next week. [18:19:06] Krinkle: so, HTMLCacheUpdateJob::invalidateTitles should never purge history pages? It's never used to purge the page that was edited, only pages that were affected indirectly? [18:19:51] In other words, this method MUST not be used to do the purging for the edited page? Is this documented? It wasn't obvious to me during review [18:21:06] duesen: I know that on direct edits we handle this directly, there is no job involved. I also know this code is already assuming it is for recursive updates. But as for whether it is ever used for non-linksupdate purposes, I'm not sure. Let me check. [18:23:54] Krinkle: I'm worried that perhaps we may in the future change the code to rely entirely on this class to do all purging, for simplicity and consistency or other reasons. It should be stated clearly what it does and does not purge. [18:25:30] duesen: No, that wouldn't be in the job, that would be in the service [18:26:01] I've confirmed all uses in Codesearch including third parties. 
It's only ever used with newFromBacklinks or otherwise for something that is unambiguously a re-render for a cascading update. [18:27:08] duesen: I was about to update the doc but: [18:27:10] " * Job to purge the HTML/file cache for all pages that link to or use another page or file" [18:27:14] so, LGTM :) [18:27:52] Ok then, let's do it :) [18:28:24] duesen: agreed on naming being confusing. assuming we go ahead with the idea of moving the purges to be post-reparse from LinksUpdate, then this job will disappear by being folded into refreshlinksjob [18:28:48] yay [18:29:40] both patches have a +2 now [18:29:45] thx [20:43:57] cscott: does something stand out here in your commits as a probable cause? https://phabricator.wikimedia.org/T250568 [20:52:48] Krinkle: not obviously -- seems to be something in Language.php not Parser.php? Looking into it. [20:53:45] you should be able to push throwaway branches with the intermediate commits to get travis to tell you exactly which is at fault, though, no? [20:54:24] cscott: Yes, good point. [20:55:26] Krinkle: none of those patches should have had any effect; they were all deprecations or affected unused code [20:55:32] difference between theory and practice, of course [20:55:47] but there were lots of changes to extensions around the time of landing the firstCallInit() deprecation [20:56:15] it's *possible* that travis is using some older/unsynced version of mediawiki-vendor, and so is picking up some breakage? [20:56:30] s/mediawiki-vendor/mediawiki-extensions/ i guess. [20:57:35] does anyone in here know a lot about writing extensions? [20:57:58] i know a bit about rewriting extensions ;) [20:58:09] * Krinkle pushes [21:00:30] cscott, Krinkle: Do you know how to make an extension set a mediawiki config setting when the site loads? (ex. 
set $wgSitename) [21:01:01] Krinkle: the valid namespaces seem to come from the localization cache in Language::getNamespaces(), so it's possible that setup differs between travis and jenkins CI? [21:01:47] cscott: tests should not be influenced by local settings, but yes, they differ slightly. [21:02:06] That's in part its added value, to catch flukes that accidentally code something against that. [21:02:15] Often stuff failing in travis is stuff failing for local devs [21:03:05] Travis runs Ubuntu with upstream packages for php and LocalSettings is just from the installer + DevelopmentSettings.php, nothing else. [21:03:33] https://github.com/wikimedia/mediawiki/blob/master/.travis.yml#L54-L65 [21:04:11] Krinkle: yeah, wasn't saying that to be dismissing, maybe we've accidentally broken some localization cache setup which travis uses and jenkins doesn't [21:04:26] although i can't see exactly how in the limited number of patches you've fingered [21:04:34] They're both 'en' and ArrayLCStore [21:04:36] *dismissive [21:04:50] Krinkle: oh, too bad then. so much for that theory. [21:05:08] however CI runs the tests in two chunks [21:05:09] Examknow: Krinkle and I had a longish debate over that recently but [21:05:13] all non-db'ed and all db'ed [21:05:25] Examknow: the short answer is to add a hook and set the global variable [21:05:30] Examknow: let me see which hook it is that you want. [21:05:49] Examknow: if the extension needs to set or change configs, that's best done from an extension.json "callback" function. [21:05:52] Krinkle: oh, maybe one test run is loading (or corrupting) the cache for the other? [21:06:09] cscott: yeah, it's possible that in WMF CI the poison only runs in the batch without this one. [21:06:23] the end goal of the extension is to set config values based on database entries [21:06:40] so it should run as soon as the site starts loading [21:06:46] Examknow, Krinkle: ah, yes. 
Look at the code for the Translate extension [21:06:53] "callback": "TranslateHooks::setupTranslate", [21:06:57] in Translate's extension.json [21:07:04] ok [21:07:16] in your callback/setup function, re/set the global variables you need to. [21:07:55] Examknow: but in general I think that's the sort of thing which the BootstrapConfig / ConfigRegistry is supposed to solve [21:08:03] ok [21:08:08] Examknow: which is exactly the thing Krinkle and I were debating last week-ish [21:08:15] ok [21:08:23] https://gerrit.wikimedia.org/r/587550 if you're curious [21:08:36] cscott: that's probably not possible because that creates a circular dependency. To use the DB layer, you need the config. [21:08:40] Examknow: ^ [21:08:49] but the end result was that setting things up via the ConfigRegistry is currently just a half-completed project, so you're almost certainly better w/ the approach we just described [21:08:58] register an extension callback, and set your variables there [21:09:03] ok [21:09:19] oh, except Krinkle is smarter than I am ^ [21:09:36] but yeah, if you accept the risk that 1) it only works for variables not used during setup, and 2) that if something fails, it will render an error page without those variables applied, then that's fine. [21:09:49] i guess you'd have to set up your own connection to the DB so you can bootstrap your configuration before proper DB setup has been done? [21:10:04] so if you used a variable from your database to set security settings like who can do what, then that would be a problem, because when anything fails, it will fail before that point and then initialise the skin/output/api without that callback considered. [21:11:12] To manage this in a database today, I would recommend using the DB only as management and not as live source. E.g. post-save, persist it to something else, like etcd, or a .json file, and have the real config read from that. 
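[Editor's note: the extension.json "callback" approach described above, following the pattern of Translate's TranslateHooks::setupTranslate, might look like the sketch below. The names MyExtensionHooks, setup, and managed-config.json are hypothetical, and the file-based config source follows Krinkle's suggestion of persisting DB-managed settings to a file rather than reading the DB at setup time.]

```php
// extension.json (fragment, hypothetical):
//   { "name": "MyExtension", "callback": "MyExtensionHooks::setup" }
//
// MediaWiki invokes the callback during setup, before most services
// exist — so per the caveats above, only touch plain config globals
// here, and don't rely on it for security-sensitive settings.
class MyExtensionHooks {
	public static function setup() {
		global $wgSitename;
		// Reading the DB here would be circular (the DB layer needs
		// config), so read from a file written post-save instead.
		$file = __DIR__ . '/managed-config.json';
		if ( is_readable( $file ) ) {
			$settings = json_decode( file_get_contents( $file ), true );
			if ( isset( $settings['sitename'] ) ) {
				$wgSitename = $settings['sitename'];
			}
		}
	}
}
```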
[21:11:24] don't wikifarms generally do this sort of thing by adding a hook in LocalSettings.php? [21:11:56] LocalSettings.php is itself that hook, yes. [21:12:38] cscott: ok so, I have good news and bad news. [21:12:55] The good news is, it probably isn't any of your commits. [21:13:04] the bad news is, it is now failing even on the one that was last passing [21:13:08] https://travis-ci.com/github/wikimedia/mediawiki/jobs/326010454 [21:13:19] https://travis-ci.com/github/wikimedia/mediawiki/builds [21:13:23] (i guess you could consider mediawiki-config an elaborate version of this, where we set various things in our production wikis based on a wide variety of different data sources) [21:13:28] cscott: y'know, you should really ask on #miraheze what wikifarms are doing instead of LocalSettings.php. You could ask me, but I'm kinda clueless. [21:13:50] Krinkle: this was my suspicion in the first place, since i didn't see how the fingered commits could have had the given effect, so it's not terribly surprising. [21:14:35] robla: they use LocalSettings.php [21:14:41] https://travis-ci.com/github/wikimedia/mediawiki/jobs/320535419 [21:14:42] robla: how long have you been lurking here? i don't think i've talked to you on IRC for months... or years? have you been here all this time? [21:14:43] https://travis-ci.com/github/wikimedia/mediawiki/jobs/320581599 [21:14:44] their setup is very hacked together [21:14:48] - PHP 7.3.16 (cli) (built: Mar 18 2020 13:08:58) ( ZTS ) [21:14:53] + PHP 7.3.17 (cli) (built: Apr 16 2020 21:04:54) ( ZTS ) [21:15:02] cscott: yay [21:15:28] cscott: it's been a while. We should yammer over on #robla about what I've been doing. Is #cscott a channel? [21:15:51] there were terrible gc issues in PHP 7 that Parsoid stumbled across, but I'm pretty sure those were 7.2.x [21:15:58] php broke the build :) [21:16:24] robla: I used #wikimedia-badattitude ;-p but only when i'm in that particular mood. 
[21:16:29] s/used/use/ [21:22:41] https://www.php.net/ChangeLog-7.php#7.3.17 [21:22:44] There's not many changes [21:23:01] ~16 bugs fixed [21:25:34] Yeah, [21:25:36] SimpleXML changed [21:25:43] we use that in the xml importer right? [21:26:39] only ExporterTest apparently... [21:26:40] https://codesearch.wmflabs.org/core/?q=simplexml&i=nope&files=&repos= [21:28:51] hm.. we use it to test but not for realz? [21:29:24] ah hang on [21:29:38] djvu uses it [21:30:14] https://codesearch.wmflabs.org/core/?q=simplexml&i=fosho&files=&repos= [21:30:17] Not much more [21:34:03] But it is broken in the right place [21:38:45] Reedy: ExportTest fails for mw on PHP 7.4.5 in the same way [21:38:49] me* [21:38:51] locally [21:39:40] That's not a surprise, the same bugfix is there too :) [21:40:18] Krinkle: There's already upstream bugs for it I think [21:40:23] https://bugs.php.net/bug.php?id=79528 [21:40:30] https://bugs.php.net/bug.php?id=79485 [21:40:41] I'm dumping $sink to repro on 3v4l.org [21:41:17] Looking at https://bugs.php.net/bug.php?id=79528, it looks like they've confirmed it [21:41:31] >To be clear, this should have changed only debug output, not anything else. [21:41:32] But.. [21:41:32] "wontfix" [21:41:35] -Status: Open +Status: Not a bug [21:41:37] yeah, I think we can retry [21:43:43] https://3v4l.org/HtN9R [21:43:47] here's a nice diff [21:44:22] php's xml and dom handling is really terrible [21:44:27] just sayin' [21:44:32] from personal experience [21:44:38] lol [21:44:54] is someone working on a patch? [21:45:03] I've already had to learn and deal with one arcane system today (Debian bug tracker). [21:45:05] i can probably try to hack one based on the php bug reports [21:45:10] Someone want to bug that upstream for this one? 
[21:45:46] i'd be more inclined to waste the effort on upstream if wmf used simplexml more broadly [21:46:06] but if it's only this code and maybe djvu, i think we can work around the new behavior [21:46:20] but think of the children cscott [21:46:37] i'm half persuaded that the new behavior is 'better' although I think they should revert it in 7.3 and only bring in the new behavior in 7.4 (or 7.5) [21:46:45] bug fix releases shouldn't break working code [21:47:17] i've been thinking of the (dom) children. i think parsoid should stop using php's XML support entirely and switch to a pure PHP implementation [21:47:17] I'm looking at https://3v4l.org/HtN9R only [21:47:29] I don't really understand what changed from this code's perspective for it to break [21:47:41] it's too hard to be wedded to a handful of baroque C parsing libraries that haven't been updated in a decade [21:48:41] Krinkle: $xmlNamespaces = str_replace( ' ', '_', $xmlNamespaces ) [21:48:47] Krinkle: is expecting an array of strings [21:49:28] if i understand the PHP bug report correctly, PHP is now giving it an array of objects [21:49:35] (i'm trying to figure out exactly what type) [21:49:46] and php is coercing the objects to strings and getting '' as a result [21:50:53] That's a totally different API. [21:50:58] that would be PHP 8.0 territory [21:51:09] yeah. bad maintainership. [21:51:48] i especially like "we usually do not document bug fixes" in the response. [21:57:21] "We must not treat a node as string if it has attributes, unless it is [21:57:21] an entity declaration which is always treated as string by simplexml." 
[22:15:36] Krinkle: ok, https://gerrit.wikimedia.org/r/#/c/mediawiki/core/+/593624 ought to work on PHP 7.3.17 and PHP 7.4 [22:15:53] Krinkle: pls test, I've only tested on PHP 7.3.14 locally [22:16:18] but it turns out that PHP < 7.3.14 already returned *some* elements of the array as SimpleXMLElement, just not all of them [22:25:07] cscott: upstream re-opened [22:25:09] https://bugs.php.net/bug.php?id=79528 [22:25:12] there is hope after all [22:25:34] but yeah, if there's some way of calling the code that works non-hackily both ways, I suppose we can do that now [22:25:39] Krinkle: my patch is proof against whatever they do, i'm using a different code path that's not affected by the behavior change [22:26:12] if you iterate over SimpleXMLElement::children() you always get a SimpleXMLElement [22:27:16] cscott: LGTM https://3v4l.org/ooS6G [22:28:44] see https://3v4l.org/ooS6G/perf#output for which versions it tested but TLDR: all of them [22:29:20] Krinkle: just uploaded a new version, because I realized I didn't have to do the hacky is_string() test [22:29:28] since i'm always getting SimpleXMLElements now [22:29:54] that makes it a little cleaner, and we don't end up with cruft in our codebase [22:30:26] cool, still LGTM https://3v4l.org/NuYoW [22:30:57] cscott: so this casts string to string in old PHP, and SimpleXMLElement to string in newer? [22:31:12] only question left is whether the djvu extension has any affected code [22:31:47] it should be straightforward to convert it to iterating over ::children() if so, but it might be hard to find where the implicit stringification might be lurking [22:50:51] Krinkle: no, in old PHP ::children() actually always returns a SimpleXMLElement [22:51:05] so it behaves exactly the same before and after [22:51:28] it's the "casting an element to an array" hack in the original which triggered the weird "sometimes convert an element to a string" behavior [22:52:04] so we don't do that anymore. 
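[Editor's note: the shape of the fix cscott describes — iterating SimpleXMLElement::children() and casting each child explicitly, instead of casting the element to an array — can be sketched as below. The element names are hypothetical; the actual patch is the Gerrit change linked above.]

```php
// Before (fragile): the (array) cast on a SimpleXMLElement yields a
// mix of strings and SimpleXMLElement objects, and PHP 7.3.17 changed
// which you get for nodes that carry attributes — implicit
// stringification of those objects then produced '' values.
$xml = new SimpleXMLElement(
	'<namespaces><namespace key="0">Main</namespace></namespaces>' );
$asArray = (array)$xml; // element-to-array hack — behavior varies by PHP version

// After (stable): children() always yields SimpleXMLElement objects on
// every PHP version, so an explicit (string) cast is unambiguous.
$names = [];
foreach ( $xml->children() as $child ) {
	$names[] = str_replace( ' ', '_', (string)$child );
}
// $names is now an array of plain strings, e.g. [ 'Main' ]
```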
[22:55:45] cscott: unable to find the djvu use of simplexml [22:56:02] https://codesearch.wmflabs.org/deployed/?q=simplexml_&i=nope&files=php%24&repos= [23:03:02] Krinkle: No _ [23:03:34] https://codesearch.wmflabs.org/deployed/?q=simplexml&i=nope&files=php%24&repos= [23:03:38] still no djvu? [23:04:00] https://github.com/wikimedia/mediawiki/blob/master/includes/media/DjVuHandler.php#L333 [23:04:09] SimpleXMLElement [23:04:49] i see [23:04:50] okay [23:05:01] so that's fine then, right? Already uses children() explicitly [23:05:02] thx [23:39:37] a user in #wikipedia-pl reports he wrote an 8-kB-long article in VE and got „Error loading data from server: 404: parsoidserver-http: HTTP 404” [23:39:40] what to do? [23:40:22] (or similar, he has problems with providing the full error description) [23:40:38] he cannot save nor switch to the source editor