[01:27:21] TimStarling: ParserTestTest?
[01:28:00] we do need ParserTestRunnerTest, I have some more concrete ideas for that
[01:28:02] (we do have a MediaWikiTestCaseTest)
[01:28:21] things like making sure certain kinds of data don't leak beyond teardown
[01:56:10] TimStarling: do you think you could take a look at https://gerrit.wikimedia.org/r/#/c/310306/ and see if you think the architecture is a good direction?
[01:59:03] good commit message
[02:05:53] oops yeah, that was 7am after an all-nighter -.-
[02:08:46] looks a bit slow, this is (hopefully) quite a hot path
[02:09:26] I mean hopefully as in, hopefully the rest of the link construction stuff is not far far slower
[02:11:21] I think I'd be happier if your dispatcher was less generic, like an array of handlers per namespace and an array of handlers for all namespaces
[02:12:45] hmm okay. I was also considering having shouldHandle() return self::EXISTS for the super fast ones, to be the equivalent of Title::isAlwaysKnown()
[02:14:01] are you thinking that the namespace separation would reduce the overhead of all the shouldHandle() calls?
[02:14:06] yes
[02:14:51] or maybe benchmark it to see if this is a rational concern
[02:15:26] but the rule of thumb I usually use is that you pay per function call
[02:15:44] you can basically count the function calls as a performance estimate
[02:17:12] alright. does isset() count as a function call?
[02:17:24] no
[02:19:19] so there are 6 subclasses, so 6 shouldHandle() calls for add(); then exists() makes 6 calls of shouldHandle() and 6 of exists(), so we're up to 18
[02:22:25] ok, I should be able to cut down the shouldHandle() calls
[02:22:53] there are also going to be 2-3 more subclasses added by extensions that currently use the TitleIsAlwaysKnown hook
[02:26:25] yeah, all the more reason to make it O(1) in the number of extensions
[02:28:16] that's the difference between bloated and fully-featured, right? whether or not an application has latency proportional to its number of lines of code
[02:28:48] heh
[05:26:41] TimStarling: https://gerrit.wikimedia.org/r/#/c/310731/
[05:29:44] what was it previously used for?
[05:32:42] LBFactory::disableBackend()
[05:36:12] which is itself unused in core, btw; now the installer calls MediaWikiServices::disableStorageBackend directly
[05:54:15] TimStarling: https://gerrit.wikimedia.org/r/310536
[07:33:04] TimStarling: I'd appreciate your help with figuring out how to benchmark this, and how to interpret the results... I tried using enwp's Australia with templates + InstantCommons and ended up with https://paste.fedoraproject.org/428247/92468114/raw/ (--reset-linkcache is introduced in https://gerrit.wikimedia.org/r/310743)
[07:34:23] I also rewrote the patch based on what you suggested and implemented a Title::isAlwaysKnown()-like shortcut for the faster existence checks
[07:36:16] suspiciously fast
[07:37:06] I can't really discuss it in detail right now because I have a very short window of overlap with the Europeans in #mediawiki_security
[07:37:40] ok, tomorrow or monday is fine
[16:41:47] ostriches: ping on https://gerrit.wikimedia.org/r/#/c/309519/
[16:43:20] oh yeah
[16:46:14] thanks :) I also have https://gerrit.wikimedia.org/r/#/q/topic:linker-formatsize to deprecate Linker::formatSize(), which is a dumb function
[16:56:59] ostriches: did you have a dashboard for the globaltitlefail log?
[17:01:27] legoktm: no, cuz it doesn't go to logstash
[17:04:54] bummer
[18:47:54] legoktm: around? Do we have a way to resubmit the LocalRenameUserJob jobs?
[18:48:04] hi
[18:48:11] there are a few being blocked on mw.org rename due to https://phabricator.wikimedia.org/T145596#2640418
[18:48:14] fixStuckRenames.php in CentralAuth
[18:48:22] will try, thanks!
[18:51:57] * hashar cherry picks
[18:59:48] legoktm: so gotta run the script against each wiki that got the stuck job, right?
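The dispatcher design discussed above (per-namespace handler arrays plus a global array, with a shouldHandle() fast path equivalent to Title::isAlwaysKnown()) could be sketched roughly as follows. All class, method, and constant names here are invented for illustration; this is not the actual gerrit 310306 patch.

```php
<?php
// Hypothetical sketch of the two ideas from the conversation: handlers
// indexed by namespace so dispatch cost is O(1) in the number of
// registered extensions, and a shouldHandle() verdict that can skip
// the real exists() check entirely.

abstract class ExistenceHandler {
	const SKIP = 0;   // handler is not interested in this title
	const MAYBE = 1;  // interested; run the real exists() check
	const EXISTS = 2; // always known to exist, like Title::isAlwaysKnown()

	abstract public function shouldHandle( $ns, $title );

	public function exists( $ns, $title ) {
		return false;
	}
}

class ExistenceDispatcher {
	/** @var ExistenceHandler[][] handlers keyed by namespace ID */
	private $byNamespace = [];
	/** @var ExistenceHandler[] handlers consulted for every namespace */
	private $global = [];

	public function register( ExistenceHandler $h, $ns = null ) {
		if ( $ns === null ) {
			$this->global[] = $h;
		} else {
			$this->byNamespace[$ns][] = $h;
		}
	}

	public function exists( $ns, $title ) {
		// isset() is a language construct, not a function call, so this
		// lookup adds no per-handler overhead for unrelated namespaces.
		$handlers = isset( $this->byNamespace[$ns] )
			? array_merge( $this->byNamespace[$ns], $this->global )
			: $this->global;
		foreach ( $handlers as $h ) {
			$verdict = $h->shouldHandle( $ns, $title );
			if ( $verdict === ExistenceHandler::EXISTS ) {
				return true; // fast path: no exists() call needed
			}
			if ( $verdict === ExistenceHandler::MAYBE
				&& $h->exists( $ns, $title )
			) {
				return true;
			}
		}
		return false;
	}
}
```

With this shape, a handler registered only for one namespace never pays a shouldHandle() call for titles in other namespaces, which is the point of TimStarling's function-call-counting estimate.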
[19:00:10] yeah
[19:00:48] will do the grunt work :] thx
[20:08:49] I guess https://phabricator.wikimedia.org/T41473 can be closed now?
[20:08:52] ( TimStarling )
[20:18:23] anybody know how I can get comments that are in file history?
[20:19:09] ?
[20:19:15] From where?
[20:20:02] Reedy: if I have a File object
[20:22:33] $file->getDescription( File::FOR_THIS_USER, $user )
[20:22:42] That's vaguely what's used on an image page
[20:23:06] // Don't show deleted descriptions
[20:23:06] if ( $file->isDeleted( File::DELETED_COMMENT ) ) {
[20:23:06] $row .= '' .
[20:23:07] $this->msg( 'rev-deleted-comment' )->escaped() . '';
[20:23:07] } else {
[20:23:09] $row .= '' .
[20:23:11] Linker::formatComment( $description, $this->title ) . '';
[20:23:13] }
[20:23:20] SMalyshev: Look at ImageHistoryList.php
[20:24:10] Reedy: aha, thanks, this looks like what I need!
[21:21:30] Krinkle: closed
[21:38:33] legoktm: https://phabricator.wikimedia.org/P3984 went from 101 -> 94 entries (since last week)
[21:38:44] yay!
[21:39:17] do you want to start filing bugs for those? some of them look pretty easy
[21:40:04] Lots of them are probably unmaintained core crap :p
[21:41:04] or easy stuff like https://gerrit.wikimedia.org/r/311028
[21:41:08] * legoktm glares at MaxSem
[21:43:08] * MaxSem is not ashamed!
[21:46:39] TemplateSandbox fixes: https://gerrit.wikimedia.org/r/311029
[21:57:46] > 16 htmlspecialchars/Message->__toString/Message->toString/Message->parseText/MessageCache->parse
[21:58:05] something calling htmlspecialchars on a Message object... we need one more layer of the stack trace for that
[22:00:13] The GetPreferences hook needs context
[22:00:22] ConfirmEdit: https://gerrit.wikimedia.org/r/311034
[22:04:07] ostriches: is your EditPage wfMessage patch not merged yet?
[22:04:47] No, not yet
[22:04:59] https://gerrit.wikimedia.org/r/#/c/309044/
[22:14:57] legoktm: Prolly shouldn't last minute swat these today... ;-)
[22:15:14] heh
[22:15:29] Much as I would like to see that log empty out some more.
[22:15:31] https://gerrit.wikimedia.org/r/#/c/311042/ fixes 2 extensions!
[22:17:31] 1 wfCollectionSuggestAction/Message->parse/Message->toString/Message->parseText/MessageCache->parse
[22:17:34] Ewwww, Collection
[22:18:51] Ewww, it uses AjaxResponse + $wgAjaxExportList
[22:18:54] * ostriches runs and hides
[22:19:35] ostriches, it also has $wgCollectionStyleVersion
[22:20:05] I miss $wgStyleVersion
[22:21:07] That $wgCollectionStyleVersion is completely useless.
[22:21:20] Basically appends a ?N to a single image.
[22:21:29] it's used in only one place, feel free to remove
[22:21:29] ostriches: Krenair has a patch for that, but it's stuck on some other things :(
[22:23:54] https://gerrit.wikimedia.org/r/311047
[22:25:23] congrats, pending patches to that repo just got even less mergeable :}
[22:38:12] one of the AF warnings: https://gerrit.wikimedia.org/r/311049
[22:42:48] ostriches: we could backport https://gerrit.wikimedia.org/r/311052
[23:13:59] is there any way to check out the subversion repository?
[23:14:03] phabricator-ssh-exec: This repository ("rSVN") is not available over SSH.
[23:14:15] says svn checkout svn+ssh://vcs@git-ssh.wikimedia.org/diffusion/SVN/ subversion
[23:14:39] TimStarling: that is better if you ask in #wikimedia-devtools
[23:14:47] since that channel is for phabricator
[23:15:06] It may be because ssh isn't set up for svn+ssh
[23:15:10] try the http one
[23:16:21] what HTTP one?
[23:17:22] It should have offered you the http one
[23:17:30] Oh, maybe it is hidden
[23:18:23] Oh, they need to add the http links to https://phabricator.wikimedia.org/diffusion/SVN/manage/uris/
[23:19:43] oh sorry TimStarling, http access does not work for svn; guess you want to ask twentyafterfour, who would know if something needs a config change for this to work
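The ImageHistoryList fragment Reedy pasted at [20:23:06]-[20:23:13] boils down to the following pattern: show the upload comment unless it was revision-deleted. This standalone restatement stubs out the MediaWiki pieces ($msg and $formatComment stand in for the Message system and Linker::formatComment(); the function name is invented), so it only illustrates the control flow, not the real class.

```php
<?php
// Standalone sketch of the "don't show deleted descriptions" check
// from ImageHistoryList.php. The helper name and the callable
// parameters are stand-ins, not MediaWiki APIs.
function describeFileRevision( $file, callable $msg, callable $formatComment ) {
	// File::DELETED_COMMENT marks a revision-deleted upload comment;
	// the constant's value is assumed here for the stub.
	if ( $file->isDeleted( 2 /* File::DELETED_COMMENT */ ) ) {
		// Don't show deleted descriptions
		return htmlspecialchars( $msg( 'rev-deleted-comment' ) );
	}
	return $formatComment( $file->getDescription() );
}
```

In core the same branch builds a table row ($row) on the image page; the essential decision is just the isDeleted() guard shown here.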