[00:22:16] AaronSchulz: I'm not sure I understand the rationale for WANObjectCacheReaper
[00:22:23] isn't kafka meant to be reliable by itself?
[00:28:17] TimStarling: the kafka thing was held/scrapped in favor of trying mcrouter (though that is *not* reliable in that sense)
[00:28:47] (the idea is that we probably want to use it anyway to replace twemproxy)
[00:29:16] so there will be no EventRelayer?
[00:30:31] reading https://github.com/facebook/mcrouter/wiki/List-of-Route-Handles
[01:10:34] ostriches: is the wikidata issue resolved? does it need backport?
[01:10:55] Resolved, no backport needed on Wikidata's end
[01:11:00] Ended up backing the core changes out
[01:11:29] ok
[01:11:33] just checking
[01:11:36] https://phabricator.wikimedia.org/T155625#2950778 explains it
[01:11:44] Thx for checking back in
[01:11:47] i suppose the core change affected other things
[01:12:07] At least Wikibase + LQT were affected in prod...possibly others, but those were the ones we'd already noticed
[01:12:17] Plus who knows how many non-WMF-installed extensions
[01:12:46] yeah
[01:12:47] ok
[01:16:10] AaronSchulz: lulz. wikitech uses JobQueueDB and it does not implement getAllAbandonedJobs()
[01:28:59] AaronSchulz: review done
[02:38:36] TimStarling: Looking at image/oldimage stuff and updating the RFC page in prep for last call. Noticed that we forgot to mention img_size. It's implicitly planned for removal by the wording "the new file table will have a minimal subset", and we didn't whitelist it as "Keep."
[02:38:48] Looking at usage, it's checkImages.php and SpecialMediaStatistics
[02:39:01] The former seems like it could query filerevision instead with a JOIN (not too slow?)
[02:39:06] Not sure about SpecialMediaStatistics.
[02:39:33] Does it make sense for it to do the SUM() via a join, too? Or would it be worthwhile to keep that duplicated and up to date in 'file' for the current revision as well?
[02:39:36] There's no other usage.
[02:41:00] looking
[02:41:57] yeah, I think that can be done with a join
[02:42:20] both I mean
[02:42:46] Special:MediaStatistics is already a QueryPage with isExpensive() = true
[02:43:34] which means the query will only be run periodically
[02:53:12] TimStarling: Cool.
[03:02:06] TimStarling: Got one last one (really) - img_user: Used by ApiQueryAllImages and NewFilesPager (SpecialNewimages) to join against user, for filtering bots.
[03:02:33] The equivalent for new pages doesn't need a third join since it uses recentchanges instead of page+revision to find the page author.
[03:02:48] A third join might be a bit much, but again, not sure.
[03:03:36] NewFilesPager is already joining on recentchanges if hidepatrolled is specified
[03:04:18] actually I don't understand this query
[03:04:31] bot changes are those with rc_bot, not those made by bot users
[03:04:48] so it really should be a join on recentchanges already
[03:05:56] see in EditPage::attemptSave():
[03:05:57] # Allow bots to exempt some edits from bot flagging
[03:05:57] $bot = $wgUser->isAllowed( 'bot' ) && $this->bot;
[03:06:12] this is also exposed in API edit
[03:06:38] $this->bot = $request->getBool( 'bot', true );
[03:07:50] ooohhh, new secret channel ;)
[03:08:21] actually it was the team channel for the old MW core team
[03:08:25] we just never bothered leaving it
[03:08:27] Krinkle/TimStarling: I mentioned to DanielK_WMDE that you were discussing the image/oldimage RfC here.
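
(Editor's note) The filerevision join discussed at 02:39-02:43 above could look roughly like the sketch below. The file/filerevision schema was still a proposal at this point, so every table and column name here (file, filerevision, file_latest, file_media_type, fr_id, fr_size) is an assumption for illustration, not the final RFC schema.

    // Sketch: computing Special:MediaStatistics totals via a join against the
    // proposed filerevision table instead of reading image.img_size directly.
    $dbr = wfGetDB( DB_REPLICA );
    $res = $dbr->select(
        [ 'file', 'filerevision' ],
        [
            'file_media_type',
            'count' => 'COUNT(*)',
            'bytes' => 'SUM(fr_size)', // previously SUM(img_size) on the image table
        ],
        [],
        __METHOD__,
        [ 'GROUP BY' => 'file_media_type' ],
        // Join each file row to its current revision, so only the latest
        // revision's size is counted (matching the old image-table semantics).
        [ 'filerevision' => [ 'JOIN', 'fr_id = file_latest' ] ]
    );

Since Special:MediaStatistics is an expensive QueryPage, this query would only be run periodically by updateSpecialPages.php rather than on every page view, as noted at 02:42-02:43.
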
[03:09:14] ApiQueryAllImages has the same error, it mistakes edits made by bots for bot edits
[03:10:08] with the complication that it is potentially looking at old images
[03:11:07] of course I am looking at the edit code, not the upload code...
[03:13:32] TimStarling: Hm.. yeah, that makes sense actually. It shouldn't use ug_group=bot. That's quite dated. Especially with User::isBot, global bot groups, and other groups that provide the "bot" user right, as well as the fact that even users with the bot right in the "bot" group can choose to use it on a case-by-case basis. And of course, users can join/leave the group at any time.
[03:14:23] Although, except for the latter, upload may not yet expose that ability. Either way it's wrong, though. And I certainly hope that rc_bot does get populated correctly regardless of whether users can use &bot with uploads.
[03:14:33] Anyhow, I'll consider that a "Remove".
[03:15:04] yeah, remove img_user
[03:15:37] ha
[03:15:55] so upload does support the same opt-out mechanism, but it's very ugly
[03:16:20] it's called 'wpUploadReallyDoBot' instead of 'bot' ?
[03:16:39] about 10 layers deep into the backend, RecentChange::newLogEntry() gets the bot flag from $wgRequest
[03:16:53] so it's generic for any loggable action
[03:17:13] Oh wow.
[03:17:14] 'rc_bot' => $user->isAllowed( 'bot' ) ? (int)$wgRequest->getBool( 'bot', true ) : 0,
[03:22:52] not sure what to do about ApiQueryAllImages, like I say, bot status is properly only in RC
[03:23:08] which means the concept stops existing once a row falls out of recentchanges
[03:23:31] which means that bot filter in ApiQueryAllImages shouldn't really exist
[03:24:06] to make it exist and be correct, you would have to copy the bot flag into filerevision, which is a moderately sized can of worms
[03:26:53] doing a LEFT JOIN means that bot uploads will become non-bot uploads when they fall out of RC
[03:27:05] maybe that's a reasonable compromise
[04:08:49] TimStarling: Aye, yeah, there's been talk about moving _bot and _patrolled to the revision table instead.
[04:09:04] For recent changes, patrolling itself is limited to those rows existing, so it would only be for the historical record.
[04:09:15] However, for detecting the bot flag, it's another matter entirely.
[04:09:28] When navigating through all images, excluding bots is not RC related.
[04:09:45] The underlying end-user use case is not very explicit.
[04:10:12] Finished editing the RFC page and sent the Last Call.
[07:18:02] ostriches: re user-level errors, T154112#2908000 is the same issue with another extension
[07:18:02] T154112: Editing session issue (TitleBlacklist related?) on *.wikipedia.org for Kvardek du - https://phabricator.wikimedia.org/T154112
[07:18:14] I'll write a patch
[07:18:19] ...eventually
[15:44:17] Krinkle, TimStarling: In December 2016, there were only 45 hits using the filterbots parameter (with a value other than 'all'), all with the same user agent that refers to a domain without a website or other obvious online presence. I think we could safely fast-track the deprecation and removal of that parameter.
[20:07:03] dr0ptp4kt: Meeting?
[21:26:07] MatmaRex: T155780
[21:27:18] Hmm, stashbot disappeared (bd808?). Ok, https://phabricator.wikimedia.org/T155780
[21:27:53] anomie: huh.
[21:28:22] anomie: looks like a bug in Special:Contributions, that field is obviously not required
[21:28:42] anomie: should i submit a patch or are you?
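
(Editor's note) A rough illustration of the LEFT JOIN compromise discussed at 03:22-03:27 above: filter on rc_bot while the upload's recentchanges row still exists, and let uploads that have aged out of RC count as non-bot. The join condition and parameter handling below are assumptions for illustration, not the actual ApiQueryAllImages code.

    // Sketch inside ApiQueryAllImages::run(), assuming a filterbots parameter
    // with values all/bots/nobots. Matching an image row to its upload log
    // entry in recentchanges is approximated here by namespace, title and timestamp.
    $this->addTables( [ 'image', 'recentchanges' ] );
    $this->addJoinConds( [
        'recentchanges' => [ 'LEFT JOIN', [
            'rc_namespace' => NS_FILE,
            'rc_log_type' => 'upload',
            'rc_title = img_name',
            'rc_timestamp = img_timestamp',
        ] ],
    ] );
    if ( $params['filterbots'] === 'nobots' ) {
        // No RC row any more => treated as a non-bot upload
        $this->addWhere( 'rc_bot IS NULL OR rc_bot = 0' );
    } elseif ( $params['filterbots'] === 'bots' ) {
        // Only uploads whose RC row is still around can be identified as bot uploads
        $this->addWhere( [ 'rc_bot' => 1 ] );
    }
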
[21:28:59] MatmaRex: It's one of those "conditionally required" situations, it's only required if you select the corresponding radio button.
[21:29:43] MatmaRex: You submit, I'll review it.
[21:30:10] anomie: submitting the form with it empty doesn't even show an error
[21:30:14] https://gerrit.wikimedia.org/r/333121
[21:30:32] It doesn't give any results either
[21:32:35] * anomie +2s
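
(Editor's note) For the "conditionally required" field discussed at 21:28-21:30, one way to express it in an HTMLForm descriptor is to drop an unconditional 'required' => true and enforce the rule in a validation-callback that inspects the other submitted values. This is a minimal sketch under assumed field names ('contribs', 'target'), not necessarily what the patch at gerrit change 333121 does.

    // Sketch: target is only required when the "specific user" radio option
    // of the (assumed) 'contribs' field is selected.
    $fields['target'] = [
        'type' => 'user',
        'label-message' => 'sp-contributions-username',
        'validation-callback' => function ( $value, $alldata ) {
            $needsTarget = isset( $alldata['contribs'] ) && $alldata['contribs'] === 'user';
            if ( $needsTarget && trim( (string)$value ) === '' ) {
                // Surface the error that a plain 'required' => true would have given
                return wfMessage( 'htmlform-required' )->text();
            }
            return true;
        },
    ];
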