[00:23:24] I tried profiling some tests:
[00:23:24] 10742.02% 11670.879 7742 - DatabaseBase::query
[00:25:00] ouch
[00:28:27] AaronSchulz: https://gerrit.wikimedia.org/r/#/c/181006/
[00:31:59] bd808: i'm a bit confused by _joe|off's comments -- xenon is not the hotprofiler, and not related to xhprof, and doesn't use getrusage() at all
[00:32:10] but xenon is what he disabled
[00:32:37] they turned off xenon and ProfilerXhprof
[00:33:02] I think it was a grab bag attempt at stopping the lockups
[00:33:35] I19fae102655182d98c07b0160704e9787ed763ec
[00:33:37] I haven't grepped the hhvm source yet, but it's probably xhprof that is to blame
[00:34:11] There was a followup too
[00:34:54] https://gerrit.wikimedia.org/r/#/c/180806/
[00:35:16] And this big hammer -- https://gerrit.wikimedia.org/r/#/c/180790/
[00:35:20] 3MediaWiki-Core-Team, Librarization: Librarization wishlist cleanup - https://phabricator.wikimedia.org/T1118#934497 (10aaron)
[00:56:28] bd808: https://gerrit.wikimedia.org/r/#/c/181009/ didn't really speed up the tests, but it dropped memory usage by 3MB
[01:04:35] 3MediaWiki-Core-Team, Wikimedia-Logstash, operations: Improve logstash - https://phabricator.wikimedia.org/T84895#934609 (10Gage) I've created a separate task for {T84958}
[01:44:47] <^d> robla: SOA Auth is T84962.
[01:45:18] cool beans :-)
[01:45:38] !b T84962
[01:45:44] heh...that didn't work
[01:46:08] <^d> !phab T84964
[01:46:14] <^d> no bot?
[01:46:21] wikibugs doesn't respond to humans.
[01:46:31] <^d> wm-bot3 is here.
[01:46:35] oh
[01:46:42] 3MediaWiki-Core-Team: Authn and authz as a service - https://phabricator.wikimedia.org/T84962#935456 (10Chad) 3NEW
[01:46:43] 3MediaWiki-Core-Team: Authn and authz as a service - https://phabricator.wikimedia.org/T84962#935456 (10Chad)
[01:46:47] !phab is https://phabricator.wikimedia.org/$1
[01:46:47] You are not authorized to perform this, sorry
[01:46:51] You are unknown to me :)
[01:46:51] @whoami
[01:46:55] lame
[01:47:57] I am running http://meta.wikimedia.org/wiki/WM-Bot version wikimedia bot v. 2.6.2.0 [libirc v. 1.0.2] my source code is licensed under GPL and located at https://github.com/benapetr/wikimedia-bot I will be very happy if you fix my bugs or implement new features
[01:47:57] @help
[01:48:14] I trust: ori!.*@wikipedia/ori-livneh (2admin),
[01:48:14] @trusted
[01:49:44] I wouldn't.
[01:50:14] I am running http://meta.wikimedia.org/wiki/WM-Bot version wikimedia bot v. 2.6.2.0 [libirc v. 1.0.2] my source code is licensed under GPL and located at https://github.com/benapetr/wikimedia-bot I will be very happy if you fix my bugs or implement new features
[01:50:14] @help
[01:50:50] Successfuly added legoktm!.*@wikipedia/Legoktm
[01:50:50] @trustadd legoktm!.*@wikipedia/Legoktm admin
[01:51:27] @trustedadd .*@mediawiki/demon admin
[01:51:36] Successfuly added .*@mediawiki/demon
[01:51:36] @trustadd .*@mediawiki/demon admin
[01:51:54] !phab is https://phabricator.wikimedia.org/$1
[01:51:54] Key was added
[02:02:25] ^ ^d
[02:14:32] 3MediaWiki-Core-Team, operations: HHVM is segfaulting 1/hour/server in production - https://phabricator.wikimedia.org/T78771#935516 (10ori)
[02:14:49] thanks
[03:30:18] TimStarling: any idea why my change you +2'd just now went V-1?
[03:30:58] test failure
[03:32:11] TimStarling: yeah, i got that much, but that change shouldn't have caused that
[03:32:11] AutoLoaderTest::testAutoLoadConfig
[03:33:44] well, maybe no scribunto changes have been merged since that test was introduced
[03:33:59] * jackmcbarn makes a dummy change to test that
[03:34:57] try adding those classes to $wgAutoloadClasses in Scribunto.php
[03:35:18] if that works, then we can merge that and then rebase the other change on top of it
[03:41:28] TimStarling: https://gerrit.wikimedia.org/r/#/c/181029/
[04:12:04] jackmcbarn: just en and qqq?
[04:12:32] is that a documented way to remove a message?
[04:12:38] it's easy enough to remove the rest of them with sed
[04:12:54] TimStarling: according to https://www.mediawiki.org/wiki/Localisation#Removing_existing_messages that's all i should do
[04:13:05] (and until recently, presumably when that test was added, it said not to even remove qqq)
[04:13:16] ok
[04:35:44] jackmcbarn: you've got a few changes that have had negative reviews and should probably be abandoned
[04:36:21] * jackmcbarn looks
[04:36:33] normally the author would click the abandon button
[04:37:56] for example your oldest open change, https://gerrit.wikimedia.org/r/#/c/107009/
[04:38:10] that one i plan to fix
[04:38:13] just haven't got around to it
[04:38:19] right
[04:38:47] another is https://gerrit.wikimedia.org/r/#/c/115696/
[04:40:22] ok, i abandoned all the hopeless ones
[04:40:27] thanks
[04:40:34] np
[04:40:37] thanks for the merges
[04:41:24] no problem, sorry about the delays
[05:29:45] jackmcbarn: your changes often seem to stall for want of a rationale
[05:30:11] e.g. https://gerrit.wikimedia.org/r/#/c/178760/
[05:30:20] most (all?) of them do have a rationale; i just forget to mention it
[05:30:43] you can always link to an en.wp module which would benefit from it, or something like that
[05:31:19] ori: https://gerrit.wikimedia.org/r/#/c/180702/
[05:33:37] TimStarling: actually, i just remembered a question i've been wanting to ask you. you know how <ref> does side-effecty stuff when it's parsed, and it causes all kinds of oddities? what do you think of making the side-effecty stuff happen when it's unstripped instead of when it's first seen?
[05:33:47] (that would remove the need for the argKeys change you just reminded me of)
[05:34:30] you mean order references by the order of their strip tags in the HTML, rather than by their parse order in the wikitext?
[05:34:53] well, that would be an effect of it, but unrelated to the reason for doing it
[05:35:15] what is the reason for doing it?
[05:35:57] currently, if you pass references in arguments to Scribunto, and you then call pairs(frame.args), the references get evaluated in what may as well be a random order
[05:36:12] (and in parsoid, they're in html order)
[05:36:52] that still sounds like what I said
[05:37:12] anyway, it sounds beneficial
[05:37:37] I'm sure users expect references to be numbered in the order they appear in the output
[05:38:07] in general, I'm sure there is always a rationale in your head somewhere ;)
[05:38:07] are you okay with the strip list containing callables and not just strings?
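Tim's suggestion at 03:34:57 would amount to something like the following sketch in Scribunto.php; the class name and path are invented for illustration, not the actual entries from the change:

```php
// Hypothetical sketch: registering classes in $wgAutoloadClasses in the
// extension's setup file so AutoLoaderTest::testAutoLoadConfig can find
// them. The class name and file path below are made up.
$wgAutoloadClasses['Scribunto_LuaExampleModule'] =
	__DIR__ . '/engines/LuaCommon/ExampleModule.php';
```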
[05:38:57] commit messages are partly documentation of what you did, and partly a message to the reviewer to get them excited about your change
[05:39:51] a lot of my pull request descriptions for HHVM are along the lines of "fix terrible breakage in X"
[05:40:44] because you have to communicate why you are doing the change and why it is important, in a way the reviewer will understand, otherwise the change will stall
[05:42:28] I guess it is fine to add callables
[05:42:54] unstrip() used to be done with strtr(), so there was no way to do callables
[05:43:32] but now that we do it with preg_replace_callback(), it is easy enough
[05:44:06] of course it means we won't be able to go back to strtr() if we wanted to
[05:45:38] well, you could expand callables in advance, preparing the array to go into strtr()
[05:45:51] that could be done based on the order of strip markers in the HTML
[05:46:14] ok, no downside I guess
[05:47:20] ok, i'll try doing that, and if it works, i'll abandon the argKeys stuff
[05:48:09] sounds good
[05:49:07] it also means we will hide reference items with no citation
[05:49:22] e.g. {{#if:blah||}}
[05:53:07] which is a good thing imo
[05:53:13] (and parsoid already does that too)
[06:09:58] AaronS: are you still interested in having PHP backtraces for SlowTimer log entries?
[06:10:42] it looks quite easy to do
[06:28:18] <_joe_> TimStarling: I was trying to apply https://github.com/hhvm/hhvm-third-party/pull/40 on 3.3.1 and ran into some issues, would you be around in ~ 30 minutes if I need advice?
[07:01:08] <_joe_> TimStarling: I managed to fix it, nevermind. It was missing a couple of includes and a bracket
[07:55:34] 3MediaWiki-Core-Team: Retest geo indexes in Orient - https://phabricator.wikimedia.org/T84975#935736 (10aaron) 3NEW a:3aaron
[09:03:20] 3MediaWiki-Core-Team: Convert old 1.23wmf* and 1.24wmf* deployment branches to tags - https://phabricator.wikimedia.org/T1288#935821 (10fgiunchedi) if there's no harm in keeping the tags around it sounds like a good idea to have them
[13:43:40] 3MediaWiki-Core-Team, operations, Services: Set warning thresholds for average cluster utilization - https://phabricator.wikimedia.org/T76306#936232 (10fgiunchedi) for capacity planning purposes I think we should also get an idea of how many requests a box can handle within a threshold latency, look at requests/...
[14:24:19] 3MediaWiki-Core-Team, operations: Review Graphite scaling options - https://phabricator.wikimedia.org/T1018#936310 (10fgiunchedi) a:3fgiunchedi assigning to myself, though not sure what this is about
[14:48:36] * anomie doesn't like that bug 1 is now T2001, rather than T1.
[15:17:55] 3MediaWiki-API, MediaWiki-Core-Team: allpages filterlanglinks DBQueryError - https://phabricator.wikimedia.org/T78276#936384 (10Anomie) 5Open>3Resolved The fix should be deployed to WMF wikis with 1.25wmf14, see https://www.mediawiki.org/wiki/MediaWiki_1.25/Roadmap for the schedule.
[15:32:57] <^d> anomie: Compromises. I wanted them to match up 1:1 too, but that ship sailed too early :(
[15:33:18] <^d> +2000 and keeping them sequential was Good Enough for me :)
[15:34:14] ^d: BTW, T85002
[15:35:09] <^d> bleh, gerrit.
[15:35:21] <^d> anomie: Have you heard that gerrit is terrible and I want to kill it?
[15:35:22] <^d> :)
[15:35:27] Yeah, at least it's +2000 rather than having to search for that "Reference: bz1" field.
[15:36:11] Yes, I have heard that. I also heard that they're taking whatever the real name for "pherrit" is slow.
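The unstrip() change Tim and jackmcbarn agree on above might look roughly like this sketch (simplified names, not the real StripState code), where strip-list values may now be callables that are only evaluated at unstrip time:

```php
// Simplified sketch, assuming $markerRegex matches strip markers and
// captures the marker key. Callables in the strip list are expanded
// lazily here, which is what lets their side effects run in output
// order rather than in wikitext parse order.
function unstrip( $text, array $stripList, $markerRegex ) {
	return preg_replace_callback( $markerRegex, function ( $m ) use ( $stripList ) {
		if ( !isset( $stripList[$m[1]] ) ) {
			return $m[0]; // leave unrecognized markers untouched
		}
		$value = $stripList[$m[1]];
		return is_callable( $value ) ? call_user_func( $value ) : $value;
	}, $text );
}
```

As Tim notes at 05:45:38, the callables could also be pre-expanded in marker order to rebuild a plain replacement array, so a return to strtr() would still be possible if it were ever wanted.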
[16:50:15] 3MediaWiki-Core-Team, Scrum-of-Scrums, LabsDB-Auditor: Manually verify whitelisted.yaml / graylisted.yaml to ensure completeness - https://phabricator.wikimedia.org/T78730#936485 (10mark)
[17:01:11] <^d> manybubbles: Did we have a task in RT/Phab for shutting down lsearchd? I know of the "see if anyone's using it" one
[17:01:26] <^d> I didn't know if we had something from RT that's in Phab now for "shut er down"
[17:26:38] ^d: I believe I wanted to make whatever tasks should be left for "shut er down" after digging into "see if anyone is using it". like if we have to notify people or implement something else or whatever. basically make a bunch of stuff and then add them as blockers for shuterdown
[17:27:14] <^d> I already wrote some patches for dismantling 2, 4 and 5's lvs.
[17:28:01] ^d: oh sweet! i think we should wait until january for it though
[17:28:08] I think we're officially in no-deploy land
[17:28:34] <^d> Those pools are in ops' hands now, it's not a deploy thing.
[17:28:39] <^d> I already disabled them yesterday.
[17:52:25] legoktm: ping
[17:53:11] hoo: hi
[17:53:38] legoktm: Do you have any bug reports about SpecialMergeAccount bricking accounts?
[17:53:43] Because that seems to be the case
[17:53:47] we have two cases at dewiki
[17:54:01] hadn't heard anything about that...
[17:54:18] bricking in the sense that a different password is being set?
[17:55:09] legoktm: Email being lost and password "no longer works"
[17:55:16] but they have a password set
[17:55:41] but if e.g. the salt is being set wrong, that might just not work
[17:56:24] is it just picking a different home wiki than the user is expecting?
[17:56:45] No, dewiki
[17:56:57] and the password was copied 1:1 from dewiki to CA
[17:57:27] hm
[17:57:57] what usernames?
[17:58:12] 3Services, MediaWiki-Core-Team, operations: Set warning thresholds for average cluster utilization - https://phabricator.wikimedia.org/T76306#936595 (10GWicke) Let's not allow the perfect to get in the way of the good though. For some services it will make sense to do a deep investigation of latency vs. load, but...
[17:58:45] hoo: is this an initial merge or just attaching accounts merge?
[17:58:56] initial, sorry I'm tired :P
[17:59:11] yeah
[18:00:23] protected function storeGlobalData( $salt, $hash, $email, $emailAuth ) {
[18:00:48] $ok = $this->storeGlobalData(
[18:00:48] $home['id'],
[18:00:57] user_id is still the salt right?
[18:01:05] yep
[18:01:14] dewiki user id became the salt
[18:03:16] what's their username?
[18:03:30] "Hob Gadling" is one
[18:03:46] and "Fingalo"
[18:03:57] Manually fixed the first one
[18:04:01] second one is untouched
[18:04:06] ok, /me looks
[18:04:59] Fingalo has an email set for the global account
[18:05:00] mh
[18:05:55] I'm looking through the code, and I don't actually see where we use the salt.
[18:06:05] $hash = $passwordFactory->newFromCiphertext( $encrypted );
[18:06:05] if ( $hash->equals( $plaintext ) ) {
[18:06:36] The salt is actually supposed to be in the serialized password hash
[18:06:39] nowadays
[18:06:50] right
[18:06:50] but could be that some legacy stuff is acting up
[18:07:12] my gu_salt is ''
[18:07:13] remember that bug I recently closed... that stuff is a little messy with legacy things
[18:07:21] * fix
[18:07:23] * fixed
[18:07:35] and CentralAuthUser::saltedPassword() returns '' for the salt
[18:09:38] base64_decode( $this->args[0] ) is the salt
[18:11:15] hoo: can you file a bug for this?
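For context, the two lines legoktm quotes at 18:06 come from the PasswordFactory API of that era; a self-contained sketch of the same check, with the surrounding variables assumed, would be:

```php
// Sketch of the quoted check, assuming MediaWiki 1.24+'s PasswordFactory.
// $encrypted is the serialized hash (e.g. read from gu_password) and
// $plaintext is the user-supplied password; both are assumed here.
$passwordFactory = new PasswordFactory();
$passwordFactory->init( RequestContext::getMain()->getConfig() );
$hash = $passwordFactory->newFromCiphertext( $encrypted );
if ( $hash->equals( $plaintext ) ) {
	// Match. For legacy salted hashes the salt (the old local user ID)
	// is embedded in the serialized form, which is why gu_salt can be ''.
}
```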
[18:11:36] legoktm: I'm starting to think it might not actually be a bug
[18:11:47] but not sure
[18:12:04] It should probably have picked the enwiki email if dewiki didn't have one
[18:12:16] and the other account is just old
[18:12:24] Probably forgot their dewiki password
[18:12:32] and is blaming that on CA now
[18:12:36] heh
[18:13:05] well, if you think we should look into it further, file a bug :P
[18:13:45] * legoktm will be afk for a bit
[18:14:08] Maybe I can make DerHexer do that
[18:47:26] <_joe|off> AaronS: the 2 server orientdb cluster will be live on monday
[18:47:54] cool, the I thought it was 3
[18:48:02] s/the/though
[18:48:46] <_joe|off> AaronS: no we just have 2, that's the best I could obtain in this timespan
[18:49:13] <_joe|off> we had no other spares with SSDs larger than 120 GB
[18:49:24] <_joe|off> also, we have 600 GB/server here
[18:49:51] <_joe|off> so you may as well use a replication factor of 2 (in this case that means mirroring, but whatever)
[18:50:07] well that plan was always factor=2
[18:51:12] <_joe|off> AaronS: if we decide to go on with orient experimentation, I'll see if we can obtain one more
[19:08:03] heh. 39.5K of the last 42.5K log events from group0 are GlobalTitleFail
[19:37:37] ^d: How do I use a db group in InitialiseSettings? I want to make a patch that sets $wmgUseMonologLogger true by default and false for wikipedia.dblist. Do I just use 'wikipedia' as the key in the config array?
[19:49:36] 3MediaWiki-Core-Team: Fix onResponses in VirtualRESTServiceClient - https://phabricator.wikimedia.org/T85016#936796 (10aaron) 3NEW a:3aaron
[19:52:40] <^d> wikipedia is the weird one.
[19:52:42] <^d> the others, yes.
[19:52:45] <^d> I remember people tripping up on it so I've always avoided it.
[20:40:25] ^d: Using 'wikipedia' trips folks up because we don't check for it in CommonSettings. :/
[20:40:38] <^d> Ah
[20:41:16] https://github.com/wikimedia/operations-mediawiki-config/blob/master/wmf-config/CommonSettings.php#L198-L205
[20:41:59] * bd808 wonders if small + medium + large is the same thing
[20:42:34] nope
[20:43:55] small medium large is for how often we run the special page maint. script update stuff
[20:43:55] and Reedy updates it every so often
[20:57:41] 3MediaWiki-Core-Team, operations: Come up with key performance indicators (KPIs) - https://phabricator.wikimedia.org/T784#937037 (10Gage)
[21:04:10] Reedy: When you are about, I'd love to know if this seems like a bad idea to you (adding mediawiki.dblist check in CommonSettings)
[21:06:05] does 'wiki' work?
[21:06:57] https://noc.wikimedia.org/conf/highlight.php?file=wgConf.php
[21:07:01] 'wikipedia' => 'wiki',
[21:07:23] I don't know actually. is dbsuffix checked in SiteConfiguration
[21:08:27] yup, suffix is checked
[21:08:39] so I don't need that hack
[21:08:42] sweet
[21:30:56] if the xenon/xhprof profilers are loaded in StartProfiler.php, that means we aren't profiling multiversion right?
[21:46:27] legoktm: we aren't profiling the bits that occur before the top os WebStart, no
[21:46:34] *top of
[21:46:56] right
[21:47:11] <^d> We could do better in possibly actually.
[21:47:22] bd808: so, multiversion's composer autoloader is registered before MW's is, but I'm not sure how we can measure whether it's slow or not
[21:47:25] <^d> s/possibly/prod/
[21:47:52] legoktm: hmm...
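One way to measure the pre-WebStart multiversion cost legoktm is asking about, and which comes up just below as the raw xhprof technique, would be to bracket the bootstrap directly with xhprof calls. A rough sketch; the include path is hypothetical:

```php
// Rough sketch, assuming the xhprof extension is available. This brackets
// only the multiversion bootstrap, so its autoloader cost is visible even
// though it runs before StartProfiler.php is loaded.
xhprof_enable( XHPROF_FLAGS_CPU | XHPROF_FLAGS_MEMORY );
require_once __DIR__ . '/multiversion/MWMultiVersion.php'; // hypothetical path
$profile = xhprof_disable();
file_put_contents( '/tmp/multiversion.xhprof', serialize( $profile ) );
```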
[21:48:26] I don't see it on the xenon graphs or in the xhprof output
[21:48:28] we could profile using T.im's raw xhprof technique outside of most things
[21:48:45] <^d> Yeah
[21:49:50] multiversion's autoloader *should* be fast because it has a very small set of classes to manage
[21:49:54] very small
[21:50:00] 3MediaWiki-Core-Team: Support strategy #1 in PoolCounter classes - https://phabricator.wikimedia.org/T85026#937122 (10aaron) 3NEW a:3aaron
[21:52:56] http://fpaste.org/161470/41902593/raw/
[21:59:25] I think I need to read their autoloader. Is it memoizing all the misses?
[21:59:34] and if so why?
[22:00:01] oh, I bet because they are building a chained loader inside their public loader
[22:00:06] hmm
[22:00:22] 3MediaWiki-Core-Team: Support strategy #1 in PoolCounter classes - https://phabricator.wikimedia.org/T85026#937162 (10aaron) Per https://gerrit.wikimedia.org/r/#/c/178903/
[22:01:34] MessageCache::parse called by Sanitizer::encodeAttribute/Message::__toString/Message::toString/Message::parseText/MessageCache::parse with no title set.
[22:02:17] That's a message being passed to encodeAttribute() without being unwrapped
[22:02:55] * bd808 wonders how these will ever all be squashed
[22:03:16] yeah, there are also instances of stuff passing Message objects to Html:: functions
[22:03:39] bd808: We killed the top 2-3 globaltitlefails already :D
[22:03:47] just need to keep working down the chain :P
[22:03:52] I wonder if we should back up and look at Message::__toString being called?
[22:04:01] legoktm: w00t
[22:24:14] manybubbles: actually, https://gerrit.wikimedia.org/r/#/c/178903/12/PoolCounterClient_body.php should go in core come to think of it
[22:24:42] AaronS: I suppose so.
[22:24:46] manybubbles: meh doesn't matter
[22:24:57] we don't use a delegation pattern anyway
[22:24:58] not _too_ much
[22:25:13] so it has to be redone in each class...another refactoring project ;)
[22:26:16] manybubbles: this is why I love the pattern of a final base class method that delegates to a doX version that is subclassed
[22:26:40] you can always add in common logic like this without moving stuff around and breaking internal interfaces
[22:27:47] * AaronS will make some patches
[22:29:08] 3MediaWiki-Core-Team: Generalize nowait logic in PoolCounterClient_body - https://phabricator.wikimedia.org/T85031#937201 (10aaron) 3NEW a:3aaron
[22:32:27] 3MediaWiki-Core-Team: Generalize nowait logic in PoolCounterClient_body - https://phabricator.wikimedia.org/T85031#937216 (10aaron) Move some code to the base class
[22:33:21] <^d> bd808: Why does this look like it's doing things twice when I run puppet? https://phabricator.wikimedia.org/P175
[22:34:40] ^d: It's the 2 phases (compile and apply) both loading ruby plugins. Not a big deal.
[22:35:16] masterless puppet does basically the things the master would do (compile) and then the things the client would do (apploy)
[22:35:27] <^d> I figured it was something, it just looked duplicate-y.
[22:35:28] <^d> :)
[22:35:31] *apply
[22:36:01] Puppet does some truly horrible things under its hood. You don't want to know how the sausage is made.
[22:36:43] <^d> mmm, sausage.
[22:54:54] 3LabsDB-Auditor, MediaWiki-Core-Team, Scrum-of-Scrums: Manually verify whitelisted.yaml / graylisted.yaml to ensure completeness - https://phabricator.wikimedia.org/T78730#937252 (10Legoktm) Where are whitelisted.yaml and greylisted.yaml?
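The pattern AaronS describes at 22:26, a final base-class method that delegates to a protected doX() hook, might look like this sketch (class and method names invented; this is not the actual PoolCounter code):

```php
// Illustrative sketch of the "final method delegating to doX()" pattern:
// shared logic such as the nowait handling lands in the base class once,
// while subclasses implement only the protected hook.
abstract class ExampleCounterClient {
	final public function acquire() {
		// Common logic (e.g. fast-fail / nowait checks) can be added here
		// later without moving code around or breaking the internal
		// interfaces of any subclass.
		return $this->doAcquire();
	}

	abstract protected function doAcquire();
}
```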
[22:58:14] 3SUL-Finalization, MediaWiki-extensions-CentralAuth, MediaWiki-Core-Team: Re-enable $wgCentralAuthAutoMigrate = true - https://phabricator.wikimedia.org/T78727#937255 (10Legoktm) >>! In T78727#931822, @Nemo_bis wrote: > Do we know how many users were merged thanks to $wgCentralAuthAutoMigrate? legoktm@fluorine:...
[23:08:42] greg-g: what's the policy on backporting stuff during the deployment freeze? there's a CA patch I'd like to get deployed so we can turn a feature back on: https://gerrit.wikimedia.org/r/181206
[23:09:09] you need a really really really good reason
[23:11:10] ok, I'll see if I can come up with one
[23:11:21] still figuring out whether it's worth it...
[23:39:04] hi AaronS
[23:39:15] I am updating MultiHttpClient...
[23:39:30] and I don't understand your inline comment
[23:40:43] looks like a space was removed
[23:40:56] it was
[23:41:18] gerrit displays it differently to my editor :/
[23:41:53] looks like those key doc lines are indented with a space and a tab
[23:42:57] manybubbles: https://github.com/orientechnologies/orientdb/issues/3244
[23:43:02] I'm not just crazy, right?
[23:45:05] 699 rows now (I'm adding more edges atm)...so it's the full result set I guess
[23:45:24] I guess that earlier issue I had is the same bug in all probability