[00:00:47] Well, that's not an MW question, that's basic PHP
[00:01:00] Make the function static, call it via self::redirectTo()
[00:02:52] But does redirectTo even exist?
[00:04:11] hm, what's the recommended way to handle a redirect to any external site?
[00:04:39] i have been using header( location
[00:04:50] which works, but not sure if that is the best way to redirect
[00:05:55] Probably using OutputPage and calling redirect on that
[01:57:24] So I asked this a while ago, but I wanted to confirm: is using composer for extensions not supported, unless it's an SMW extension?
[01:59:31] It's still mostly in the same state, yes
[01:59:44] There are discussions underway about relaxing that and formalising it to some extent
[01:59:57] But nothing settled or completely decided
[02:01:21] relaxing it how? making non-smw extensions more composer friendly?
[02:01:58] because I thought the RFC was declined as of https://phabricator.wikimedia.org/T467
[02:02:07] More formalising what is actually supported
[02:02:24] Because some of the current implementations do some very quirky things
[02:02:48] https://phabricator.wikimedia.org/T249573
[02:03:13] and...
[02:03:23] https://phabricator.wikimedia.org/T250406
[02:04:16] so like right now I have a painstakingly set up composer.local.json for every MW extension I have, even non-SMW ones
[02:04:35] i just want to know if I should nuke that and just go back to using git submodules or something
[02:04:45] I don't know if that's necessarily going to change
[02:06:30] alright, i won't rock the boat there
[02:07:11] though it would be nice to not have a thousand lines of repositories in there and actually have extensions in packagist, but i can't really expect that much
[02:40:34] I get the feeling that it's moving towards being more supportive of extensions in packagist, but yeah it's not resolved yet.
[02:57:13] that would make my day
[02:58:12] i mean in a perfect world i could do something like composer create-project mediawiki/mediawiki to bootstrap a mediawiki project, but baby steps
[03:04:51] Sazpaimon: there've been some attempts at that, e.g. https://gerrit.wikimedia.org/r/c/mediawiki/composer/project/+/434764
[03:15:26] nice
[05:51:02] samwilson: Yo.
[05:51:18] Who's maintaining https://meta.wikimedia.org/wiki/User:Community_Tech_bot ?
[05:52:21] Savannah: the whole CommTech team is. Are you seeing an issue with it?
[05:55:35] samwilson: The talk page has a couple of posts, yeah.
[05:55:46] I went to look up how many Category pages we have and the report is busted, heh.
[05:55:59] Someone was bothering me about a separate broken report.
[05:56:36] Savannah: oh thanks, I'll have a look.
[07:29:32] Sweet, thanks!!
[09:54:04] Hi! I got an email from someone external using MediaWiki on their website. They have a problem with page sorting in categories
[09:54:22] the case is that when a page is added to a category, it sorts under the correct letter, but not alphabetically under that letter
[09:54:54] so the page "Ramen" comes after "Ritz" and "Ruth" under "R" in the same category
[09:55:19] Does anyone know why this could be? Is it a maintenance script or something that they aren't running perhaps?
[10:00:56] Jhs: was the wiki upgraded recently? When I upgraded from 1.32 to 1.34 this happened as well
[10:01:14] Vulpix, could have been. How did you solve it?
[10:01:37] They are on 1.29.1
[10:01:54] I had to run maintenance/updateCollation.php
[10:02:18] with the --force parameter
[10:03:06] Vulpix, thanks! and after that was done, newly created pages sorted correctly afterwards?
[10:04:16] Yes. Apparently during some upgrades, category collations change and cause those issues on existing pages
[10:04:38] I see. Thank you very much Vulpix!
[10:04:40] new pages compute a different sort key than old ones
[10:05:33] Apparently, when this happens, it isn't mentioned in RELEASE-NOTES :(
[10:07:50] I guess it's language-dependent. still ought to be mentioned though
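For reference, the fix described above ([10:01:54]–[10:02:18]) is a one-off maintenance run from the wiki's installation directory; --force tells updateCollation.php to recompute sort keys for all rows, not just the ones it thinks are out of date:

    php maintenance/updateCollation.php --force

Once it has finished, existing category members should sort consistently with newly added pages.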
[13:09:46] hi, is there a way to update a big mirrored wiki that was replicated using importDump, like wikipedia? if I use import again, will pages be updated or duplicated, or what will happen? It takes months anyway, so it's not a good way.
[13:12:09] vinvin_: I think there's no good way. Wikipedia has lots of edits per minute. You probably don't have a server capable of replicating that edit rate while also keeping the wiki usable for readers :)
[13:12:19] indeed
[13:14:09] I guess i could manually select a few pages to update, based on theme or section for example, and update them individually, if I find a way to automate that
[13:16:43] vinvin_: If you don't mind losing the history (I don't know if that's OK unless it's properly attributed by other means) you can directly access Special:Export for a specific page https://en.wikipedia.org/wiki/Special:Export/IRC
[13:18:07] right, thanks
[13:41:28] So I'm setting up a test restbase instance so I can try out Mathoid. I can get the spec/help page working, but none of my routes seem to work?
[13:55:24] My restbase config: https://pastebin.com/AaBRH4Ks and Nginx config: https://pastebin.com/mXZztwXK for RESTbase stuff, if anyone is able to help me out with that.
[14:12:51] Lcawte: in the restbase config I don't see a | - path: projects/example.yaml | (or a renamed one) under | x-modules: |, with all the options of the actual APIs enabled behind it
[14:13:09] look at config.example.yaml
[14:14:12] Ok.. maybe that config is not in the example.yaml anymore :)
[14:15:34] looks like someone made the sample configuration a lot less obvious and removed everything that made it useful https://github.com/wikimedia/restbase/commit/5cdd0051cf316a3109ab5d4d79786fa2e0cafe3b
[14:16:21] Oh "fun"
[14:26:00] Vulpix: I'm looking at the example project from like... 10 months ago or something, and I can't see anything that isn't in the new config.example.yaml, or anything that immediately screams page routes?
[14:26:37] Obviously at this stage, I'm just trying to /page/html/Main_Page to make sure the thing works...
[14:30:34] It seems those paths are defined in the yaml: see line 57 https://github.com/wikimedia/restbase/blob/master/config.example.yaml#L57
[14:32:52] Yep, but I'm getting route not found issues with them, even after getting the spec page displaying.
[14:35:01] Maybe you need to also adapt line 184 to be your actual RESTbase URL https://github.com/wikimedia/restbase/blob/master/config.example.yaml#L104
[16:04:43] Hi folks. Is there any way to query for the last contribution of each user in a list? Querying usercontribs with uclimit set to 1 just returns 1 contribution and has you continue like that through the entire list, whereas what I want is one contribution per user.
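The question above didn't get an answer in-channel. Below is a minimal sketch of the loop-per-user approach the asker describes: one usercontribs request per user with uclimit=1, which returns that user's most recent edit since results come back newest-first. The API endpoint and user names are placeholder assumptions.

    <?php
    // One request per user: list=usercontribs with uclimit=1 yields the latest
    // contribution for that user. file_get_contents() is used here only as the
    // simplest HTTP client for illustration.
    $api = 'https://en.wikipedia.org/w/api.php'; // placeholder wiki
    $users = [ 'ExampleUserA', 'ExampleUserB' ]; // placeholder user list

    foreach ( $users as $user ) {
        $url = $api . '?' . http_build_query( [
            'action' => 'query',
            'list' => 'usercontribs',
            'ucuser' => $user,
            'uclimit' => 1,
            'ucprop' => 'title|timestamp',
            'format' => 'json',
        ] );
        $data = json_decode( file_get_contents( $url ), true );
        $latest = $data['query']['usercontribs'][0] ?? null;
        if ( $latest ) {
            echo "$user last edited {$latest['title']} at {$latest['timestamp']}\n";
        }
    }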
[18:29:57] legoktm: is the corruption here from a processing step limited to that area, or might it have affected other parts as well? https://phabricator.wikimedia.org/T205361#6150835 Hoping it won't require a re-run of capturing..
[18:41:56] cscott: The instanceof check that https://gerrit.wikimedia.org/r/#/c/mediawiki/core/+/598960/ adds is what we have in prod on wmf.32, correct?
[18:42:21] If so, I'd rather keep that as-is and let your new fix roll out in the next branch after testing in master and confirmation in beta. There's no rush on that, right?
[18:59:09] Krinkle, cscott: So let's land https://gerrit.wikimedia.org/r/c/mediawiki/core/+/598960 and backport it to wmf.34, then land https://gerrit.wikimedia.org/r/c/mediawiki/core/+/599071 and revert 598960 for wmf.36 if wmf.35 rolls out OK?
[19:00:28] Right, that's effectively what I'm proposing, yes. The former could also be cherry-picked to wmf.34 only rather than merge-and-revert in master. Fine either way by me.
[19:02:15] I'll proceed with the master merge to keep master "working", hopefully.
[19:02:32] Also, this should happen in -operations. :-)
[19:06:10] James_F: works for me. i just don't want a FIXME lingering forever in the codebase
[19:06:31] James_F: it's actually quite a tricky bug -- i did some code archeology and i couldn't exactly pinpoint where it stopped working
[19:06:33] Oh, totally
[19:06:58] it seems to have "always" been broken, in that Parser::fetchCurrentRevisionOfTitle could always historically return false
[19:07:08] which would have triggered the same error in coreparserfunctions
[19:07:11] git blame doesn't get you all the way there either; the SVN transition helps hide the cause
[19:09:19] well, we didn't see crashes in the logs before (AFAIK) so it is related to the Revision->RevisionRecord transition
[19:10:43] * James_F nods.
[19:10:55] Yeah, that's been pretty bumpy in various places.
[19:11:04] but it seems to be a more or less subtle thing with parsing pages which haven't been saved yet, which have RevisionRecords but not Revisions
[19:12:11] Oh, right. Joy.
[19:23:43] James_F: do you have a patch to revert c45ccd7ca8e6f2f643c3c89cf8a2bd1e936e6dea ?
[19:26:47] cscott: … yes?
[19:27:05] Are you not getting e-mails from gerrit?
[19:27:21] i was just looking at my git log, i probably just missed it
[19:27:28] it didn't get merged in the order i expected, that's all
[19:27:40] (i get *so many* mails from gerrit. *so many*.)
[19:31:47] Also, this is *still* always the wrong channel.
[19:39:22] Krinkle: uhhh, I will need to look closer.
[19:39:26] that's not good though
[20:56:11] hello! I'd like to take a mw action api parameter in as an assoc array object
[20:56:18] the best I got right now is
[20:56:25] use ApiBase::PARAM_ISMULTI => true
[20:56:41] and then request the query param like
[20:57:03] param_obj=key1=val1|key2=val2
[20:57:14] then I do custom parsing of param_obj to convert it into an assoc array
[20:57:24] [key1 => val1, key2 => val2]
[20:57:30] is there a better way?
[20:57:43] i'm trying to see if there is some kind of ApiBase setting to let me do this
[21:00:26] bd808: got any hints maybe? ^
[21:00:40] I don't think the api is setup for that
[21:00:58] You could just take a json blob or similar...
[21:01:22] i know php by default will let you do things like ?param_obj[key1]=val1&param_obj[key2]=val2
[21:01:27] json blob in the GET url?
[21:01:35] ottomata: no idea. I know who I would have pointed you to in the past, but ... yeah.
[21:01:40] :/
[21:01:49] Well, you can POST too
[21:01:53] yeahh>.....
[21:02:13] i'd have to change a bunch of other stuff to make code to POST, which would be ok i guess, but isn't great
[21:02:22] i don't mind parsing this out of the multi val array i guess
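A minimal sketch of the workaround described above, assuming an Action API module extending ApiBase: the parameter is declared multi-valued, so param_obj=key1=val1|key2=val2 arrives as [ 'key1=val1', 'key2=val2' ], and a small helper turns that into an associative array. Only param_obj and ApiBase::PARAM_ISMULTI come from the discussion; the class and helper names are made up for illustration.

    class ApiExampleAssoc extends ApiBase { // hypothetical module name

        public function execute() {
            $params = $this->extractRequestParams();
            // e.g. [ 'key1=val1', 'key2=val2' ] => [ 'key1' => 'val1', 'key2' => 'val2' ]
            $assoc = $this->parseAssocParam( $params['param_obj'] ?? [] );
            $this->getResult()->addValue( null, $this->getModuleName(), $assoc );
        }

        protected function getAllowedParams() {
            return [
                'param_obj' => [
                    ApiBase::PARAM_ISMULTI => true,
                ],
            ];
        }

        // Custom parsing of the multi-value array into an assoc array.
        private function parseAssocParam( array $pairs ) {
            $assoc = [];
            foreach ( $pairs as $pair ) {
                // Split on the first '=' only, so values may themselves contain '='.
                $parts = explode( '=', $pair, 2 );
                if ( count( $parts ) === 2 ) {
                    $assoc[ $parts[0] ] = $parts[1];
                }
            }
            return $assoc;
        }
    }

POSTing a JSON blob, as suggested above, would avoid the delimiter parsing entirely, at the cost of changing the calling code.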
[21:45:26] I like how someone found a clever way to syntaxhighlight these bits of wikitext documentation
[21:45:26] https://www.mediawiki.org/wiki/Help:Extension:ParserFunctions##time
[21:45:32] using xD
[23:34:24] upgrading 1.31.7-->1.32.6 ERROR 1072 (42000) at line 1: Key column 'rc_user_text' doesn't exist in table
[23:35:36] should I add that manually? Why didn't update.php do it for me?
[23:35:59] Probably not
[23:36:44] Something seems odd there
[23:37:00] The patch to drop rc_user_text was in 1.34
[23:37:31] hmmm... maybe I have confused things and need to go back...
[23:37:47] But it's existed since 1.1
[23:37:50] I originally tried to go from 1.30 to 1.34
[23:38:02] Yeah, that'd probably explain this
[23:38:17] We don't support downgrading
[23:38:23] heh
[23:38:34] Who would!
[23:43:05] That (starting from the 1.30 database) worked!
[23:43:13] thanks!