[00:05:09] [20:59:44] I wonder if it's something to ask legoktm nicely to add into library upgrader rather than me doing the rest of these repos manually <-- file bugs plz!
[00:05:37] ALL OF THEM
[00:05:46] * Reedy tries to check which issue he was complaining about :P
[00:07:32] legoktm: I think, TLDR, HHVM 4.0 is teh suck
[00:08:00] James_F: Lots of tech debt patches incoming
[00:08:08] * Reedy uses puppy dog eyes
[00:08:23] Reedy: Oh dear. :-)
[00:08:36] I'm doing them so they just won't break when we bump AtEase
[00:08:44] Rather than all of them depending on the bump patches
[00:08:52] Which is just dependency hell
[00:09:04] Also going to clear out the ones just using core's AtEase functions too for now
[00:09:16] Sure.
[00:09:56] yay for shell commands doing most of the work
[00:10:51] Reedy: also, can you make a patch for the bootstrap?
[00:10:58] Hm?
[00:11:16] Patch of my choosing? :P
[00:11:16] one of my dreams is to be able to run the library bootstrap over an existing repo to have it update some files that are pretty static
[00:11:30] Ahhh
[00:12:06] https://gerrit.wikimedia.org/r/c/IPSet/+/489957
[00:12:25] find . -type f -exec sed -i 's/wfSuppressWarnings/Wikimedia\\suppressWarnings/g' {} + && find . -type f -exec sed -i 's/wfRestoreWarnings/Wikimedia\\restoreWarnings/g' {} + && git commit -a -m "Update MediaWiki namespaced AtEase global functions" && git review && git reset HEAD~1 --hard
[00:12:29] * Reedy finishes doing this first
[00:12:37] https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/tools/cookiecutter-library/+/master/%7B%7B%20cookiecutter.library_name%20%7D%7D/.travis.yml
[00:12:38] :D
[00:13:29] legoktm: One of *my* dreams is that the forthcoming .pipeline-config files will let us drop per-repo composer and npm config files when we're only using them for CI/CD.
[00:14:21] !
[remote rejected] HEAD -> refs/publish/master (internal server error: Error inserting change/patchset)
[00:14:21] error: failed to push some refs to 'ssh://gerrit.wikimedia.org:29418/mediawiki/extensions/BiblioPlus.git'
[00:14:24] Well, that sounds fscked
[00:14:29] Yup
[00:14:42] Ha. Is the extension repo broken?
[00:14:50] Nope, gerrit is fscked
[00:15:09] Oh, yes, often the way at this time.
[00:16:35] Reedy: i got that recently https://phabricator.wikimedia.org/T214656
[00:18:46] Reedy see what logstash says :)
[00:19:49] "om nom nom"
[00:21:30] I'm quite interested to know what happened (and why it failed).
[00:21:40] though refs/publish/master is deprecated :P
[00:21:51] blame gitreview
[00:22:24] they should have fixed it by now (that's why upstream deprecated it instead of just removing it :))
[00:22:49] Removing things without deprecating first is usually always bad, mmmkay?
[00:22:52] oh
[00:22:54] actually
[00:23:00] they did remove support for it reedy
[00:23:01] https://gerrit-review.googlesource.com/c/gerrit/+/192494
[00:23:18] upgrade to 1.27.1.0a1 :P
[00:26:10] Reedy: I'll not be merging stuff for a bit, to give SWAT a chance (and so I can go home). ;-)
[00:26:18] Fair :)
[00:26:20] Also, slacker.
[00:26:36] I'll be working from home merging your code. :-)
[00:30:19] I think... That's all the extensions and skins using MW's globalfunctions wfSuppress/Restore *and* MediaWiki\suppress/restore updated
[00:35:35] Auto Response: [Gerrit] mediawiki...Comments[master]: Update MediaWiki namespaced AtEase global functions
[00:35:36] srsly?
[00:39:20] heh
[02:26:12] MaxSem: Please don't merge Reedy's patches, he didn't do them well enough. :-(
[02:26:35] I don't claim to have done anything more than a simple find and replace :P
[02:26:52] Reedy: Yeah, you needed to update the MW requirement to 1.31+ in the same patch.
[02:26:57] Lol
[02:26:59] Sorry.
[02:27:45] It seems rather unmaintained
[02:27:48] master TMH...
[02:27:48] "requires": {
[02:27:48]     "MediaWiki": ">= 1.29.0"
[02:27:48] },
[02:27:51] Most of them are.
[02:27:52] oh ok then
[02:28:06] But yeah, it's a pain to write as a regex.
[02:28:15] I'm going to have to go cancel a huge number of C+2s.
[02:29:02] And some don't even have any requires at all
[02:29:39] Yeah, if they don't have requires you can skip it (though it's good practice to add it in if you're feeling nice).
[02:29:53] CologneBlue says >= 1.25
[02:29:55] I call bull
[02:29:57] But we shouldn't leave the repo claiming it supports version X but only supports version Y.
[02:30:18] I think the idea that CologneBlue supports /working/ is clearly nonsense. ;-)
[02:30:50] Uh
[02:31:56] Oh, hey MaxSem.
[02:32:23] We should just bot-update this requirement for every WMF-developed extension after every branching
[02:32:43] MaxSem: Well, some extensions keep working with old MWs.
[02:32:53] More through luck than judgement
[02:32:55] Others very much don't. :-)
[02:33:11] Except the Language team who put a lot of effort into back-compat.
[02:33:30] But at WMF only L10n team bothers to actually maintain this compatibility
[02:34:43] Well, because the development policy explicitly says master<=>master, REL1_32<=>REL1_32.
[02:35:50] We tried to maintain compat at the mobile team initially, that was a clusterfuck
[02:36:00] Yeah, it's very hard to do.
[02:36:19] Either you pay a huge amount of development time to support it, or, you know, other teams have to.
[02:37:20] And of course, numerous extensions don't have an extension.json yet either
[02:38:11] You know where they should go >:}
[02:38:40] Lol
[02:40:04] Reedy, James_F: https://phabricator.wikimedia.org/T170215
[02:40:23] legoktm: We need a poop emoji
[02:40:26] also MaxSem ^^
[02:40:28] Ha.
[02:40:51] * legoktm hands 💩 to Reedy
[02:40:59] legoktm: I've redone https://gerrit.wikimedia.org/r/c/mediawiki/core/+/489308 - could you take a look?
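[Editor's note: the fix James_F asked for at [02:26:52] amounts to bumping the version constraint in the `requires` block quoted above. A hypothetical extension.json fragment, assuming the constraint syntax shown in the paste:]

```json
{
	"requires": {
		"MediaWiki": ">= 1.31.0"
	}
}
```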
[02:41:50] Sorry legoktm, I couldn't resist :D
[02:42:15] lol
[02:43:13] I've redone https://gerrit.wikimedia.org/r/#/c/mediawiki/extensions/UserStatus/+/490211/
[02:44:10] SMalyshev: looks correct, lemme do a bit of testing quickly
[02:44:36] good.json should not cover this case
[02:51:43] SMalyshev: yeah, I typically run the validator over all extensions in git when testing stuff just as a sanity check
[02:52:36] that shouldn't change since nobody has been using @note in v2 before probably... otherwise it wouldn't validate already
[02:52:55] indeed
[02:57:15] 19:41:33 Building remotely on integration-slave-docker-1041 (blubber instance-type-bigram DebianJessieDocker m4executor) in workspace /srv/jenkins-workspace/workspace/mwgate-npm-node-6-docker
[02:57:20] 19:41:51 rsync: failed to set times on "/cache/.": Operation not permitted (1)
[02:57:20] 19:41:51 rsync: recv_generator: mkdir "/cache/JSV" failed: Permission denied (13)
[02:57:32] You can ignore that
[02:58:27] The test failed
[02:58:29] So, no
[02:58:37] https://integration.wikimedia.org/ci/job/mwgate-npm-node-6-docker/78725/console
[02:59:05] Oh that's different heh
[03:03:43] weird
[03:04:58] 02:41:51 mkdir: cannot create directory 'src': Permission denied
[03:05:04] That's the error Reedy ^^
[03:12:57] Do I want to try and debug it
[06:53:36] ugh, backlog
[07:39:53] <_joe_> so my question of the day would be: when we transitioned to HHVM, we used the NavigationTiming extension to set a cookie to all users, see https://github.com/wikimedia/mediawiki-extensions-NavigationTiming/commit/6cbf21338846c5fa6966339c925120a48527e843
[07:40:14] <_joe_> but ori admitted it was a bit of an awkward place where you'd do it
[07:40:37] <_joe_> so my question would be: what would be the appropriate place to re-do that, only for php7 this time?
[07:48:13] <_joe_> I also have a doubt about that code, but I guess I'll have to ask someone with a better understanding of javascript than me - specifically I see it tests if a cookie is present using var hasCookie = /hhvm=true/.test( document.cookie );
[07:48:29] <_joe_> this verifies if the cookie is present in the /response/, not in the request
[07:49:08] <_joe_> and I'm not sure it would ever be true
[15:28:37] _joe_: milimetric, nuria, or Krinkle would know
[15:28:53] they recently moved the sampling code up to either EventLogging or core itself; I can't remember exactly
[15:29:13] <_joe_> ori: yeah my bet was on Krinkle tbh
[15:29:26] that's usually a safe bet :)
[15:29:28] <_joe_> it's sad to discover how little I remember of in-browser js
[15:29:59] no you weren't, don't lie
[15:30:09] <_joe_> at least most of the js used in mediawiki is aged enough that I am still accustomed to it somehow
[15:30:26] <_joe_> jquery feels like home to someone who stopped doing web development around 2010 :P
[15:30:28] you were thinking "there but for the grace of god go i"
[15:30:31] yep
[15:31:51] _joe_: I did the refactor ori's talking about, with Timo's help of course. But did I miss other context? What does that have to do with the NavigationTiming instrumentation?
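[Editor's note: a minimal sketch (not the actual extension code) of the check _joe_ quotes above. `document.cookie` is a single semicolon-separated string of name=value pairs currently visible to the page, so a regex test like `/hhvm=true/` matches regardless of whether the cookie arrived in a response header or was just set by script. Factored into a pure function here so it can run outside a browser; the function name is illustrative.]

```javascript
function hasEngineCookie( cookieString, name ) {
	// Same idea as /hhvm=true/.test( document.cookie ), but anchored so
	// that e.g. 'nothhvm=true' does not count as a match.
	return new RegExp( '(^|;\\s*)' + name + '=true(;|$)' ).test( cookieString );
}

// In a browser it would be called as:
// var hasCookie = hasEngineCookie( document.cookie, 'hhvm' );
```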
[15:32:01] <_joe_> nothing really
[15:32:22] bucketing people into an HHVM experiment was done in NavigationTiming as an ugly hack because I could push code there quickly
[15:32:33] <_joe_> exactly
[15:32:37] _joe_ was wondering where it _should_ go, since it needs to be resurrected for php7
[15:32:37] aha
[15:32:54] gotcha
[15:33:04] <_joe_> I mean I'm happy to keep up with the ole tradition
[15:33:08] and I was vaguely remembering that the sampling / bucketing code was consolidated / standardized somewhere
[15:33:11] yeah, I've found EventLogging instrumentation randomly thrown into extensions that have nothing to do with what's being instrumented
[15:33:39] maybe there should be a separate extension for this kind of general instrumentation that's not tied to specific product features
[15:33:49] like PlatformInstrumentation?
[15:34:03] <_joe_> ori: if you want to laugh https://phabricator.wikimedia.org/T214984
[15:34:15] EtCeteraExtension
[15:34:17] <_joe_> milimetric: this is not instrumentation, per se
[15:34:24] <_joe_> it's just logical bucketing of users
[15:34:29] <_joe_> for $reasons
[15:34:41] <_joe_> in a way that's recognizable server-side, and plays nice with our caches
[15:35:06] there was a WikimediaEvents extension IIRC for just that
[15:35:23] <_joe_> ori: yes, I could add it there as well, sure
[15:35:23] ah ok, good to know
[15:35:42] it's still enabled, looks like
[15:35:45] <_joe_> I was just modifying that this morning
[15:35:46] so that's probably the place
[15:35:53] <_joe_> yes, we use it for beta features
[15:36:51] ori: basically the refactor was moving all validation as an optional subscribe to the .debug module, and all functionality including sampling to core: https://github.com/wikimedia/mediawiki-extensions-EventLogging/tree/master/modules/ext.eventLogging.
[15:37:24] I'm going to take a look at this bucketing to see what you're all talking about
[15:38:17] * ori has to run
[15:40:07] <_joe_> milimetric: https://github.com/wikimedia/mediawiki-extensions-NavigationTiming/commit/6cbf21338846c5fa6966339c925120a48527e843
[15:45:41] _joe_: ok, so this is generating a random token, storing it in localstorage, and checking it against a sampling rate to add or remove a cookie that enables hhvm
[15:45:42] cool
[15:45:59] <_joe_> well in our case, php7
[15:46:17] the sampling part has indeed been moved to EL core, so if you depend on EL, you can use sessionInSample: https://github.com/wikimedia/mediawiki-extensions-EventLogging/blob/master/modules/ext.eventLogging/core.js#L204
[15:46:46] this works on the user's session id, which I'm not too familiar with, is that persistent enough or would you need to work with localstorage?
[15:47:24] the rest of the cookie code seems straightforward enough
[15:48:52] as for where to put this... don't you have to change Varnish anyway to route based on the cookie? Why not do the sampling there?
[15:52:06] <_joe_> yes, I need to use this for non-logged-in users as well
[15:52:32] <_joe_> well, because that won't set the cookie randomly to new users, right?
[15:52:43] <_joe_> say you're an anon reader
[15:52:52] <_joe_> you go to the enwiki main page
[15:52:58] <_joe_> you have no cookies
[15:53:09] <_joe_> so you get which version?
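[Editor's note: a hedged sketch of the bucketing milimetric describes at [15:45:41]: persist a random token, derive a stable in/out decision from it against a sampling rate, and set or clear the cookie that selects the engine. All names, the cookie, and the token format are illustrative, not the actual NavigationTiming code; the decision is a pure function so it can be tested outside a browser.]

```javascript
// Pure decision: treat the first 8 hex digits of the token as a uniform
// number in [0, 2^32) and accept if it falls under the sampling rate.
function inSample( token, rate ) {
	var max = Math.pow( 2, 32 );
	return parseInt( token.slice( 0, 8 ), 16 ) < rate * max;
}

// Browser side (illustrative): keep the token in localStorage so the same
// user stays in (or out of) the bucket across page views, then set or
// expire the cookie the server routes on.
function bucketUser( rate ) {
	var token = localStorage.getItem( 'engineToken' );
	if ( !token ) {
		// Zero-padded 8 hex chars of randomness; the real code derives
		// its token differently.
		token = ( '00000000' +
			Math.floor( Math.random() * 0x100000000 ).toString( 16 ) ).slice( -8 );
		localStorage.setItem( 'engineToken', token );
	}
	if ( inSample( token, rate ) ) {
		document.cookie = 'php7=true; path=/';
	} else {
		// Expire the cookie so lowering the rate shrinks the bucket.
		document.cookie = 'php7=true; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT';
	}
}
```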
[15:53:42] <_joe_> frankly I see one advantage in doing any sampling at the varnish layer - when you want to separate traffic and not users
[15:53:57] on request varnish sets a cookie, samples, routes you to the right place, and sends the cookie back on response
[15:54:15] (if the cookie exists, it just doesn't set it)
[15:54:15] <_joe_> yeah I'm pretty sure vcl is way more painful than js to do that
[15:54:20] :) for SURE
[15:54:43] <_joe_> but the advantage is, we could do the same for API traffic that mostly doesn't honour cookies
[15:54:52] so you're right, the sampling from EL wouldn't fit this... unless mw.user.sessionId works for anons... one sec
[15:57:00] _joe_: from basic inspection, it looks to me like mw.user.sessionId() is consistent for anons too
[15:57:09] so that sampling function in EL should work
[15:59:29] <_joe_> great!
[15:59:44] <_joe_> I might poke you again tomorrow then milimetric
[16:00:59] anytime. Looking to understand this stuff better, so happy to help if I don't slow you down
[16:01:17] <_joe_> oh you most definitely won't :P
[16:02:19] won't what? understand it better? Or slow you down? ;P
[16:06:22] <_joe_> ahahah
[16:06:33] <_joe_> AGF
[16:06:49] <_joe_> (in a meeting, sorry)
[16:08:18] :) ok, looked at the code and it's in sessionStorage so reloading the browser tab won't change the sessionId, but closing it will (https://doc.wikimedia.org/mediawiki-core/master/js/source/mediawiki.user.html#mw-user-method-sessionId)
[16:16:59] _joe_: RE: PHP7, WikimediaEvents seems like a good enough place to put it. Especially since the beta switch is there as well. If we're open to other options, I suppose Varnish could work as well. In particular because otherwise we can only switch secondary views of authenticated users (it's unsafe to emit cookies from logged-out user page views, on cache miss - although we /could/ do that).
[16:17:58] _joe_: as for cookies.
document.cookie is a read-write interface that starts with a summary of current cookies stored in the browser for this origin/path, including any Set-Cookie commands applied from the current HTML response.
[16:18:55] <_joe_> Krinkle: now I remember, this morning I was totally confused
[16:19:00] <_joe_> just rusty :P
[17:08:13] milimetric: Yeah, I think my use of sessionStorage there was a mistake.
[17:09:55] I've only last year come to understand fully how it is meant to work. Basically sessionStorage is a way to track an individual tab, a sequence of back and forward navigations, which can survive offline/restarts/upgrades/"restore closed tab", etc.
[17:10:13] Has no conceptual ties to browsing session or session cookies whatsoever.
[17:10:53] opening two tabs and going to the same url, that's two distinct session storages
[17:10:56] <_joe_> Krinkle: before I go afk for the evening: I was thinking of doing pure stochastic traffic separation for API calls that don't have a cookie in VCL
[17:11:12] _joe_: sounds good to me.
[17:11:15] <_joe_> as some of the api calls come from clients that don't honour cookie requests and don't speak js
[17:11:34] we just send the header to the backend, right?
[17:11:34] <_joe_> but for the main site, I'd just go with javascript and yes, WikimediaEvents is the right place
[17:11:45] <_joe_> the cookie itself
[17:11:58] Yeah, for web we want consistency I think, so bucketing where possible.
[17:11:59] <_joe_> on the backend, we look for the cookie, and if it's there, we use php7
[17:12:19] but we may still want to use VCL there so that it starts from the first view, instead of only secondary views.
[17:12:31] <_joe_> Krinkle: another q: is there something we need to do in order to have the sampling/profiling pipeline working for php7?
[17:12:45] if you want to implement a different sampling method in EL, not using user.sessionId(), we could I guess do the same localstorage trick we used for hhvm, though I should think that through more when I'm not in meetings
[17:12:47] but for web, instead of setting a be-resp header only, for index.php we'd set a cookie as well.
[17:13:39] milimetric: sampling can be anything, pageViewToken or newRand is fine for that. But the boolean outcome will be saved as a regular old cookie. Has to, because it must be seen by the server when deciding which PHP engine to kick
[17:14:26] we don't need it to be deterministic given once the cookie is set, that's what decides.
[17:16:23] <_joe_> exactly
[17:16:29] <_joe_> sorry, I'm going afk now
[17:16:35] <_joe_> will read the backlog later
[17:25:14] makes sense, we can have a cookie-setting sampling method in EL then, if you like
[19:45:32] James_F: I saw a comment indicating that all but one wiki using LQT use Flow now. Is that one wiki mw.org?
[19:45:43] (an old comment from you)
[19:46:12] Krinkle: No, it's ptwikibooks I think. The migration is stalled because the maintenance script to migrate kept breaking mid-run.
[19:46:26] James_F: OK, so that means there's two wikis :)
[19:46:27] Krinkle: There's also active use on Wikinewses, but I don't count those as active.
[19:46:42] Krinkle: LQT isn't meant to be usable on MW.org, where are you seeing it used?
[19:46:46] Is there anything blocking the mw.org conversion?
[19:46:55] https://phabricator.wikimedia.org/T61791
[19:46:57] The MW.org conversion finished three years ago.
[19:47:04] It's loaded and can trigger fatals
[19:47:10] Well… no shit it's loaded.
[19:47:14] e.g. https://www.mediawiki.org/wiki/Thread:Talk:MediaWiki_1.19/Broken_redirects_and_sidebar_display_after_upgrade_from_MW_1.12_to_MW_1.19/reply_(2)
[19:47:18] MW is a POC and you can never ever uninstall software ever.
[19:47:34] But it's loaded in non-operational mode for log non-breakage only.
[19:47:49] If it's only for logs, I suppose there should be a feature flag for that, and also to remove these pages
[19:47:58] There is a feature flag. :-)
[19:48:29] But apparently R.oan didn't implement it strongly enough? (All the NSes and logs have to be there forever lest we breach the licence or whatever.)