[00:02:30] Ah, I see
[00:02:31] https://github.com/wikimedia/mediawiki-extensions-Scribunto/commit/7418a571ac59cc25b682c681a9c2dd330c4a983a
[00:02:45] TimStarling: Looks like Brad has already fixed it, but that didn't make the branch
[00:03:02] right
[00:03:40] I'll cherry-pick it and deploy to fix the tests
[00:03:46] it was backported to wmf.17 but not wmf.16
[00:04:19] I literally just clicked to backport to .17
[00:04:41] the test failure shows .16
[00:04:53] ah right
[00:05:07] updated 67 seconds ago
[00:05:11] Might as well do both
[00:28:51] MaxSem: (late response) https://gerrit.wikimedia.org/r/#/c/370292/2 looks fine to me
[00:29:36] muchas gracias RoanKattouw
[01:52:39] Reedy: I amended your change to make the batch size configurable
[01:52:46] (and thus not hardcoded in 2 places that could get outta sync)
[15:44:07] no_justification: Re those "" warnings, I found backtraces in error.log on mwlog1001. There seem to be three. Flow should be fixed by backporting https://gerrit.wikimedia.org/r/#/c/374861/. MobileFrontend has SpecialMobileHistory and SpecialMobileContributions, for which T175161 exists.
[15:44:07] T175161: Special:MobileHistory warning: Using deprecated fallback handling for comment rev_comment [Called from CommentStore::getCommentInternal in /Users/jrobson/git/core/includes/CommentStore.php at line 200] - https://phabricator.wikimedia.org/T175161
[15:44:28] s/""/"Using deprecated fallback handling for comment"/
[15:52:20] anomie: Tyvm
[16:43:28] ApiRevThankIntegrationTest::testValidRequestWithSource is breaking Flow
[16:43:33] Flow unit tests
[16:43:34] Yay
[16:47:38] lol
[16:47:42] That's an old bug
[16:48:21] Soooo, recheck?
[16:48:23] Force merge?
[16:48:42] recheck and it'll probably pass, I think
[16:49:25] I revoked Jenkins' vote and told him to stfu
[16:49:31] Which also worked :p
[16:53:27] AaronSchulz: hey! can we briefly talk about rootJobParams?
[16:53:37] latency of gerrit comments tends to be high...
[16:54:09] so, my problem is: none of the jobs involved in my patch has "a title" associated with it. They all use fake titles.
[16:54:28] (the same fake title, even)
[16:55:38] Gerrit comment latency?
[16:55:53] the time it takes for people to reply
[16:55:57] Ooohhh, nvm.
[16:56:02] DON'T SCARE ME LIKE THAT :P
[16:56:06] :P
[17:06:28] DanielK_WMDE: so, there is no association between the triggering event and the title batch in a job?
[17:07:13] in that case, you can't really use signature+timestamp, just timestamp.
[17:07:46] it looked like there was, but I guess I misread that code (I'm not super familiar with Wikibase there)
[17:09:56] DanielK_WMDE: btw, there are slides at https://www.dropbox.com/sh/zwj9kstnass10l4/AAD8oTBc8kaWgey8gsJKhVYsa?dl=0 , I don't know if I put those on wm.org (I can't find them at least)
[17:14:35] ... Why do the OATHAuth schemas for PostgreSQL and Oracle seem to think the 'id' field should be auto-incrementing?
[17:15:16] AaronSchulz: it would definitely be good to have documentation on the root job stuff and deduplication in general on mediawiki.org
[17:15:38] AaronSchulz: the actual root job is the ChangeNotificationJob
[17:16:01] it's posted to the client wiki's job queue from the repo
[17:17:03] AaronSchulz: check JobQueueChangeNotificationSender::getJobSpecification
[17:26:16] AaronSchulz: so... if I could determine a single Title as the basis of whatever update, even if it's a "remote" title, can I somehow apply it to the LinksUpdateJob, etc.? Won't it be ignored (as the documentation says) if there is a page batch in the job?
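(Context for the rootJobParams thread above: a minimal sketch of how root-job deduplication parameters get attached to a job. Job::newRootJobParams() and RefreshLinksJob's batch mode are real core APIs; the change-ID key and the page batch are hypothetical stand-ins for this example.)

```php
<?php
// Minimal sketch, assuming a Wikibase-style change with ID $changeId and a
// batch of affected client pages $pageBatch (both hypothetical).
// Job::newRootJobParams( $key ) is real core API: it returns
// [ 'rootJobSignature' => sha1( $key ), 'rootJobTimestamp' => wfTimestampNow() ],
// and the key can be any string naming "what changed" -- it does not have to
// be derived from a local Title.
$params = [ 'pages' => $pageBatch ]
	+ Job::newRootJobParams( "wikibase-change:{$changeId}" );

// In batch mode RefreshLinksJob ignores its Title for the update itself,
// so a dummy title is fine here.
$job = new RefreshLinksJob( Title::newMainPage(), $params );
JobQueueGroup::singleton()->push( $job );
```

Roughly, root-job de-dup then skips a queued job whose rootJobSignature matches one with a newer rootJobTimestamp that has already been run (the "superseded" series in the Grafana graphs mentioned below).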
[17:28:19] AaronSchulz: would it make sense to have something like a rootJobTitle parameter? Or is that basically the same as rootJobSignature?
[17:29:30] the signature doesn't have to come from a title, it can be any string that defines "X changed, so I update references"
[17:29:38] it can be a foreign title then
[17:31:36] what do you mean, ignored? For the ->removeDuplicates "hash" de-duplication, that's effectively true for batches, since they probably won't match
[17:31:56] but that's the "other" type of de-duplication
[17:40:17] AaronSchulz: RefreshLinksJob says: "- b) Jobs to update links for a set of pages (the job title is ignored)."
[17:40:39] this is the mode I'm using
[17:40:48] so the title isn't ignored in the context of deduplication?
[17:40:53] then the documentation should be changed.
[17:41:23] AaronSchulz: I updated my patch to use a hash of the affected pages as the signature.
[17:41:30] that's the set of pages to be purged on the client
[17:41:35] is that correct?
[17:43:02] AaronSchulz: I have to run off now, so I get home in time for the TechCom meeting.
[17:43:27] Please comment on the patch, so we can get this rolled out.
[17:43:55] Please also let me know if there is a way to measure the deduplication ratio for a given type of job
[17:44:15] AaronSchulz: isn't the set of pages defined by the entity with the backlinks to begin with?
[17:45:01] if you use the batch hash, it's prone to not working, since it might get offset slightly if the first job started running/sub-dividing before the second was enqueued.
[17:46:55] DanielK_WMDE: you can use https://grafana.wikimedia.org/dashboard/db/job-queue-rate?orgId=1 and remove "all" and pick the job type you want in the HTML selector
[17:47:37] superseded = root job de-dup, de-duplicated = hash de-duplication
[17:48:52] AaronSchulz: the set of pages is defined by the entity that was changed, but there are many such entities in the ChangeNotificationJob. The set of pages also depends on the "aspect" that was changed (label, sitelink, etc.).
[17:49:35] I could base the signature on the info in the change object, instead of hashing the page list, but to me this seems equivalent, if not more brittle
[17:49:54] I don't quite follow what you mean re offset.
[17:50:11] anyway, I have to run now. I may not see anything you reply now; please send mail.
[19:29:13] anomie: thanks! nice spot too with the s
[19:29:40] Reedy: Yeah, I tested it and noticed that bp_user hadn't been changed.
[19:32:03] I should've noticed when I copy-pasted the output to the task too
[19:51:37] Ha, I knew if I left Ifc8ac2f alone for a while that Reedy would merge it and save me the trouble of testing it.
[19:51:40] ;)
[19:52:17] it's very much a "srsly?" patch
[19:52:38] * James_F grins.
[19:53:16] Reedy, anomie: Will there be a follow-up to (a) convert TrustedXFF.php, and (b) unblacklist it from the linter? ;-)
[19:53:23] Sounds like effort
[19:53:37] Gotta wait for .18 or a backport
[19:53:44] wait
[19:53:54] or just manually change the file in wmf-config
[19:54:02] It's not used as an exten … yeah.
[19:54:23] But waiting a week is fine too, no rush.
[19:54:33] And changing a 600k code file is less than fun.
[19:54:52] it's only 2 lines to change
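(An editorial guess at the "2 lines" above, based on the flat file structure shown next: if only the array's opening and closing lines need converting -- say, a long-to-short array syntax change for the linter -- the ~600 kB of entries in between stay untouched. This sketch is an assumption, not confirmed in the log.)

```php
<?php
// Hypothetical sketch of wmf-config/trusted-xff.php after the presumed
// two-line change (long array syntax to short): only the first and last
// lines of the file change; every entry in between stays exactly as-is.
return [            // was: return array(
	'400C6000' => true,
	'400C6001' => true,
	// ... ~600 kB of further entries, unchanged ...
	'052DC059' => true,
];                  // was: );
```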
[19:55:22] Aren't they nested?
[19:56:37] no
[19:56:37] https://noc.wikimedia.org/conf/highlight.php?file=trusted-xff.php
[19:56:42] return array(
[19:56:42] '400C6000' => true,
[19:56:42] '400C6001' => true,
[19:56:58] '052DC058' => true,
[19:57:00] '052DC059' => true,
[19:57:02] );
[21:01:24] RFC meeting starting now in #wikimedia-office: Move most of MediaWiki within a /core folder
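(For reference, a hedged sketch of consuming a flat hex-keyed map like the trusted-xff.php excerpt above. IP::toHex() is the real core helper that produces keys of this shape; the include path and example address are illustrative, and this is not necessarily how the TrustedXFF extension itself reads the file.)

```php
<?php
// Hedged sketch: querying a flat hex-keyed trust map such as trusted-xff.php.
// IP::toHex() is MediaWiki's real helper; e.g. IP::toHex( '64.12.96.0' )
// yields '400C6000', matching the keys pasted above. The path is illustrative.
$trusted = include '/srv/mediawiki/wmf-config/trusted-xff.php';

$hex = IP::toHex( '64.12.96.0' ); // "400C6000"
$isTrusted = isset( $trusted[$hex] ); // flat map, so a plain isset() suffices
```

Because the map is flat rather than nested, a lookup is a single hash access, which is also why only the array's opening and closing lines would need syntax changes.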