[03:37:54] tgr|away: re: OATHAuth, is core still going to be calling the old hooks?
[17:14:39] legoktm: some of them, yes
[17:51:51] greg-g: IMO we should merge https://gerrit.wikimedia.org/r/#/c/264974/ before wmf11 is cut, to avoid deploying an API bug.
[17:52:52] marxarelli: thcipriani ^^^
[17:53:01] * greg-g is in a meeting
[17:53:58] +2'd
[17:54:15] anomie, legoktm: was just about to start cutting. i'll wait for jenkins
[17:54:38] marxarelli: see also my other note in -releng about Randomrootpage
[17:55:45] saw that. let's move the deployment discussion there
[17:56:07] too many channels, too many channels
[19:03:18] tgr: Meeting?
[19:08:45] anomie: forgot about it being on a different floor
[19:12:59] bd808: Any chance I could get you to merge https://gerrit.wikimedia.org/r/#/c/261768/ and https://gerrit.wikimedia.org/r/#/c/261769/ if I rebase the former? please
[19:13:39] Reedy: will it break Cirrus?
[19:13:50] * bd808 doesn't have that setup for testing
[19:13:54] DCausse +1'd it :P
[19:14:29] It's tested against the version of ES we're running
[19:14:54] I presume it's pretty stable, as it's had no commits since august
[19:15:12] well, that's not quite true. It's a tag, not a branch
[19:15:14] No 2.2 branch
[19:15:16] * Reedy squints
[19:17:18] 83 + return (bool) ($info['http_code'] == 200);
[19:17:23] People write some interesting code
[19:17:45] that looks like the sort of thing a java dev would write in PHP
[19:18:04] Reedy: Are you gonna keep rebasing it until someone merges?
[19:18:21] Reedy: I can merge if you get it rebased, yeah. Branch is already cut, right?
[19:18:23] ostriches: Why do I feel like if I rebase it, you're going to merge something else? :P
[19:19:59] wmf.11 doesn't look to have been done yet
[19:20:32] Guess we should wait for that
[19:20:45] translatewiki runs ES too, so that's some testing
[20:14:18] anomie: the cross-wiki API thing is in the "Is Gather only for en wiki?" thread if you want to take that
[20:14:57] tgr: Where's the thread?
[20:15:12] IMO there will be trouble with cross-project-cross-language pages, but in the end it just seems like an attempt to force two use cases with minimal overlap into the same API
[20:15:23] reading-wmf
[20:17:44] anomie: also https://phabricator.wikimedia.org/T123377 although the apps people apparently prefer mail
[20:41:26] ostriches: Is there a reason https://github.com/wikimedia/mediawiki isn't replicating?
[20:47:26] bleh, looking
[20:47:42] Oh, yeah, cuz I tried to get Phab to take over replication. Is that not working?
[20:48:45] I wonder if that doesn't trigger when it's not in hosted mode.
[21:07:57] Reedy: I unleashed your elastica version bump on beta.
[21:08:07] (or at least told Jenkins to get to work)
[21:08:42] heh
[21:08:43] thanks
[21:10:13] I warned the folks in #-discovery too
[21:13:51] Woo
[21:13:51] https://github.com/hhvm/oss-performance/commit/e41b960461241ca955a160bb03bdbfca9782dcbe
[21:13:59] 1.26.2 in oss-performance now
[21:15:02] tgr: I'm finding it impossible to comment on that Gather junk without saying that the real thing to do is to make a new version that fixes all the problems in the current prototype. Also, it seems to me their cross-wiki bookmarks list could as well be served by serializing them to a string somehow, then using action=options to shove it in a preference named "userjs-android-app-offline-list" (at least until the serialized string exceeds 65535 bytes), instead of piling more cruft on the vaguely-defined pile that is Gather as it exists now.
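(For illustration: the serialized-preference idea tgr floats above could be sketched roughly like this. The preference name "userjs-android-app-offline-list" and the 65535-byte limit come from the discussion; the function names and the JSON serialization choice are hypothetical, not anything actually implemented.)

```python
import json

# 65535 comes from the size limit mentioned in the chat above.
PREF_LIMIT = 65535
PREF_NAME = "userjs-android-app-offline-list"

def serialize_offline_list(pages):
    """Serialize a list of page titles into one string that fits in a
    single user preference, enforcing the byte limit noted above.
    (Hypothetical helper; JSON is just one possible serialization.)"""
    blob = json.dumps(pages, separators=(",", ":"))
    if len(blob.encode("utf-8")) > PREF_LIMIT:
        raise ValueError("serialized list exceeds the preference size limit")
    return blob

def options_request_params(blob, csrf_token):
    """Build the POST parameters for a MediaWiki action=options request
    that stores the blob in the preference named in the chat."""
    return {
        "action": "options",
        "format": "json",
        "optionname": PREF_NAME,
        "optionvalue": blob,
        "token": csrf_token,
    }
```

The blob would then be POSTed to the wiki's api.php with a CSRF token; the point of the sketch is only that a flat serialized string plus action=options covers the apps' use case without touching Gather.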
[21:15:20] * anomie read [[mw:Extension:Gather]] and still doesn't know what it's actually supposed to do
[21:17:42] anomie: yeah, that's probably not a helpful comment :)
[21:18:41] I agree that what the apps folks want is really just a way to store a list of data items
[21:19:25] that could be a user option, or a new API, but there is no reason to graft it onto Gather
[21:20:10] which, yeah, already suffers from a lack of focus, and adding an almost orthogonal functionality would only make that worse
[21:58:43] anomie: I read through https://gerrit.wikimedia.org/r/#/c/264411/, but I don't get how it's going to be used
[21:59:14] mainly why a module would have undocumented parameters
[22:00:07] legoktm: I'm intending to have the patch needing it up soon.
[22:00:27] ok
[22:01:32] legoktm: But the basic idea is that with AuthManager, we don't know if it might be able to accept a username+password, a username+password+domain, an email address, "authWithFacebook=1", or what.
[22:02:45] so... more like dynamic parameters rather than undocumentable?
[22:04:13] legoktm: Sure.
[22:21:03] Is wikitech sending notifications to restbase?
[22:21:19] /page edit notices
[22:21:46] wmgUseRestbaseVRS
[22:52:19] csteipp: this might be vagrant-specific, but I see it a lot that in the final OAuth response the issue time is now + 1 sec
[22:52:46] "a lot" = about 3 in 4 OAuth requests fail because of this
[22:53:16] does it hurt security to relax that check a bit?
[22:53:29] tgr: That's set by the host's timestamp, so if vagrant's time is drifting, then yeah, it will be off
[22:53:55] yes, it's some kind of sub-second time drift
[22:54:27] but I have already configured the box to sync frequently and it's not helping much
[22:54:28] On the consumer side, yeah, might relax that to checking if the iat is +/- 30 seconds
[22:54:46] cool, I'll do that then
[22:55:07] even +/- 1 would catch all the cases I have seen so far
[22:57:49] Yeah, the only major threat would be if you realized someone could reuse your JWTs in some malicious way, and you upgrade the signature on them, so you don't want to accept ones that were issued under the old system.
[23:02:14] bd808: Oh, that's a reason we can't easily upgrade the semantic dependency tree
[23:02:19] even more dependencies
[23:03:43] Reedy: couldn't we make something like https://github.com/wikimedia/mediawiki-extensions-Wikidata ?
[23:03:53] Maybe, yeah
[23:03:58] I've not actually looked into it
[23:04:18] validator has "param-processor/param-processor": "~1.1"
[23:04:44] as long as we don't put Wikidata and SMW on the same wiki I think we could figure out how to manage things
[23:04:49] (I think)
[23:05:53] anomie: is https://www.mediawiki.org/w/index.php?title=Manual:Pywikibot/OAuth&diff=0&oldid=2024705 normal?
[23:06:07] maybe PWB alternates between the two sessions somehow?
[23:06:56] (for context: https://lists.wikimedia.org/pipermail/pywikibot/2016-January/009381.html)
[23:07:01] anomie was talking about something related to that today I think. That if you send a session id from an old cookie and the oauth handshake in the same request, things go boom
[23:07:43] 3 versions of localisation cache is sloooow
[23:08:13] oh gawd, yes
[23:08:32] 2 has been taking 50m
[23:08:53] l10n cache hates us
[23:10:13] I should see how bad the wfMsg*() situation is in the branches we are deploying
[23:48:50] https://phabricator.wikimedia.org/T123599#1946276
[23:48:53] Not that bad to fix them up
[23:51:42] patching them up would probably be quicker than trying to find a way to upgrade SMW cleanly
[23:52:06] https://gerrit.wikimedia.org/r/#/c/264023/
[23:52:21] That would be quicker still, but I really don't want to merge that one
[23:52:40] heh
[23:53:26] My mum has stolen my HDMI cable again. ffs
[23:53:36] are you grounded?
[23:53:41] no video games for you
[23:53:42] No, "tidying"
[23:53:54] They moved my little sister's PS4 around a couple of times
[23:54:10] And unhelpfully unplugged the Apple TV I've got plugged in
[23:54:33] She "found" the cable once, but I think I left it downstairs, so she's taken it again
[23:54:39] No idea why I didn't put it back into the TV at the time
[23:54:46] * bd808 looks around his pigsty of an office and wishes for someone to tidy up
[23:55:04] bd808, cleanse with fire!
[23:55:33] I think I'm looking for a slightly more selective purge
[23:55:40] "Add wgMessagesDirs for WMF usage in LocalisationUpdate (noop for older MW)"
[23:56:29] hacky pos
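(For illustration: the relaxed iat check csteipp suggests above, accepting a JWT issued-at time within +/- 30 seconds of the local clock instead of rejecting anything in the future, could be sketched like this. The tolerance value comes from the chat; the function itself is a hypothetical sketch, not the actual OAuth extension code.)

```python
import time

# +/- 30 seconds of allowed clock skew, per the suggestion in the chat above.
IAT_TOLERANCE = 30

def iat_is_valid(iat, now=None, tolerance=IAT_TOLERANCE):
    """Accept a JWT whose issued-at timestamp is within +/- tolerance
    seconds of the local clock. This absorbs the sub-second drift tgr
    sees on the vagrant box (iat = now + 1), while still rejecting
    tokens whose issue time is far outside the window."""
    if now is None:
        now = time.time()
    return abs(now - iat) <= tolerance
```

Even a tolerance of 1 second would cover the "now + 1 sec" failures described above; 30 seconds just leaves headroom for hosts with worse clock sync.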