[00:06:14] Should be good to go now
[00:07:51] +2
[00:08:06] * bd808 has not gotten to do a +2 for a while
[00:08:30] it seems like I've mostly been reviewing python code hidden in our puppet repo
[00:09:11] Does anything actually use that code? :P
[00:09:34] toolforge :)
[00:09:44] it's a cute little project you may have heard of
[00:09:50] I was meaning ObjectFactory in mw et al :P
[00:10:19] ah! Yeah it's used in the logger and also in ... some stuff that AaronSchulz built
[00:10:43] I basically extracted it from Aaron's code and added a couple of features when I was building the logger
[00:10:49] silly ide search bugs :)
[00:11:22] I think maybe bag o stuff and the db layer?
[00:11:32] yeah
[00:11:33] multiwrite bagostuff
[00:11:59] oh hi AaronSchulz
[00:12:05] long time no see
[00:12:34] hey
[00:12:57] apparently BlueSpiceExtendedSearch uses it too
[00:13:08] according to github search
[00:13:28] Yeah, I just made a patch to make it stop using the global namespace one
[00:13:53] Technically, we could kill the compat layer from core then
[00:14:08] AaronSchulz: how is life in the big city?
[00:15:22] * bd808 had forgotten how fun hadoop can be. 52 minutes of CPU time in 90 seconds of wall clock time
[00:25:24] 00:08:00 License "GPL-2.0-or-later" is not a valid SPDX license identifier, see https://spdx.org/licenses/ if you use an open license.
[00:25:29] What is this about? When it is...
[00:26:17] Reedy: packagist whining at you?
[00:26:31] jerkins https://integration.wikimedia.org/ci/job/composer-package-hhvm-docker/1772/console
[00:26:39] so composer
[00:27:11] * bd808 bets on an old as hell composer version
[00:27:45] Have we still not fixed that?
[00:30:16] https://github.com/wikimedia/integration-composer
[00:43:48] the reduction in api calls mystery is not answered by traffic source. The 60/30/10 split of internet/internal/cloud calls is pretty consistent from the start of 2017 to now.
[00:44:09] which I guess means interestingly that the decrease is global and relatively uniform
[00:45:22] * bd808 is not a data analyst and is hungery so will stop looking for now
[00:45:41] too hungry to spell apparently
[00:45:50] bd808: did you just find a new "the decline of editing" graph? :)
[00:46:48] greg-g: sort of, but this one is a cliff that happens between 2018-03-04 and 2018-03-05. Half of the action api traffic just disappears
[00:47:40] hmmmm... I wonder actually if there is something that moved mobile apps traffic from the api to restbase service...
[00:48:29] but... that would not be uniform. it would all be external
[00:48:37] * bd808 shrugs
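(Editor's note: for context on the ObjectFactory exchange above, here is a minimal, hypothetical sketch of the spec-based construction the library provides. It assumes the namespaced Wikimedia\ObjectFactory class and its static getObjectFromSpec() method; the backend class name and constructor arguments are invented for illustration.)

```php
<?php
// Minimal sketch, not the real MediaWiki configuration: build an object from a
// declarative "spec" array, the pattern used by the logger config and
// MultiWriteBagOStuff. Assumes wikimedia/object-factory is installed and that
// \My\Extension\SearchBackend is a class you have defined.

use Wikimedia\ObjectFactory;

$spec = [
	// Fully qualified class to instantiate (a 'factory' callable also works).
	'class' => \My\Extension\SearchBackend::class,
	// Positional constructor arguments.
	'args' => [ 'https://search.example.org', [ 'timeout' => 5 ] ],
];

$backend = ObjectFactory::getObjectFromSpec( $spec );
```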
[00:51:53] Reedy: btw, could you do another rebuildLocalisationCache comparison, this time HHVM on deploy1001 *with* the JIT flag as well?
[00:52:17] We have 40min for Jessie with HHVM, 15min for Jessie with HHVM+JIT, and 15min for Stretch with HHVM (no JIT).
[00:52:24] I'm curious if it's even better with JIT
[00:52:30] What's the flag?
[00:53:02] Reedy: PHP='hhvm -v Eval.Jit=false'
[00:53:03] I think
[00:53:30] Or PHP='hhvm -vEval.Jit=1'
[00:53:36] I forgot which is the default
[00:53:38] mwmaint1001 I used last time
[00:54:28] See https://phabricator.wikimedia.org/T191921#4150546
[00:55:09] So yeah PHP='hhvm -vEval.Jit=1' should speed it up in theory
[00:55:43] running
[00:55:57] adding SSDs would speed it up more :)
[00:56:27] to beta? :P
[00:56:48] Hehe
[00:56:59] ah. beta already has SSD I think...
[00:57:14] Also, I see Tyler usually runs it with --threads=6 although I guess as long as we have something to compare with that'd be fine
[00:57:23] I think threads=6 is what scap uses on beta
[00:57:27] Well, if it's quicker with 1 thread... :P
[00:57:36] it's already done with c*
[00:57:38] evaluating >>> import multiprocessing >>> max(multiprocessing.cpu_count() - 2, 1); on deploy1001 returns 30
[00:57:46] which is what scap uses
[00:57:58] https://github.com/wikimedia/scap/blob/5c3646ad20df9c872ba6c62cc9ceae3f1e7ee219/scap/tasks.py#L535
[00:58:18] the VM sees 32 cores?
[00:58:21] Yeah, I don't know if the threads hurt or not.
[00:58:32] bd808: 8 presumably
[00:58:51] ..=6 in beta and ..=30 in prod
[00:59:09] ah, that makes more sense
[01:00:08] a huge amount of the time in rebuildLocalisationCache is stat calls on all the json source files
[01:00:34] Unless I'm confused about which script that is
[01:01:52] * bd808 disappears to find food
[01:02:35] nds
[01:04:06] sa
[01:07:15] Krinkle: real 11m45.815s
[01:11:22] Reedy: nice
[01:11:54] Reedy: Could you add that in a brief comment on T191921 so I can cite it elsewhere?
[01:11:54] T191921: mwscript rebuildLocalisationCache.php takes 40 minutes on HHVM (rather than ~5 on PHP 5) - https://phabricator.wikimedia.org/T191921
[01:12:03] I did :P
[01:12:10] https://phabricator.wikimedia.org/T191921#4247675
[01:15:09] Krinkle: Hey... Do you reckon those Memcached issues Aaron found might explain some of the weirdness of my devwiki using memcached on PHP7 for a few years?
[01:15:27] Reedy: Maybe :P
[01:15:52] Reedy: Got a link to one of these bug reports or error messages?
[01:16:05] https://phabricator.wikimedia.org/T152345
[01:16:27] I had some others...
[01:24:17] Will test after patch is merged :P
[01:32:16] Reedy: Yeah, it's possible
[01:32:28] I would expect that case to fail quickly with some kind of type error
[01:32:39] silly edge cases :)
[01:33:05] but it's possible the cas() case would time out if it can't detect the issue and just waits for the unlock that failed
[01:33:15] Interesting no one else has seemingly had a problem with it
[01:33:23] Wonder if Wikia patched it differently/separately
[01:34:46] definitely worth backporting too :)
[01:48:07] legoktm: Guess we can bump MW core to codesniffer 20 :P
[01:50:32] Reedy: yes :D will need some exclusions in .phpcs.xml for the new sniffs...
[01:50:40] stop adding new stuff then!!!
[01:54:38] Should do a release of objectfactory and bring that in too
[01:54:41] *I should
[01:55:16] Is that v2 worthy, bumping PHP version? Or just 1.1?
[01:56:52] it's a breaking change
[01:57:02] so it needs a new major version
[01:57:22] So it is a v2 thing?
[01:57:46] yeah, 2.0.0
[01:58:22] Is it worth letting you do a tour-de-codesniffer first? I'm imagining 18->20 for that isn't gonna make much difference on that library
[01:59:31] Meh, either way, it's not urgent
[02:03:42] I don't think it's worth breaking compat unless there's something we gain
[02:04:16] less copypasta
[02:12:51] https://integration.wikimedia.org/ci/job/mediawiki-quibble-vendor-mysql-php70-docker/74/console
[02:12:54] https://integration.wikimedia.org/ci/job/mediawiki-quibble-vendor-mysql-hhvm-docker/74/console
[02:12:58] Why is quibble a lot faster on hhvm?
[02:15:24] legoktm: Any reason we can't use class_alias?
[02:15:50] class alias has to be in the file with the class itself
[02:16:09] Ah, duh. Makes sense
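(Editor's note: a minimal sketch of the class_alias pattern legoktm is describing, not the actual library code. It assumes ObjectFactory moving under the Wikimedia namespace, with the backwards-compatible alias declared in the same file as the class so that whenever that file is loaded the old global name is registered too.)

```php
<?php
// Sketch only: the real implementation lives in the wikimedia/object-factory
// library. The point is that the alias sits in the same file as the class.

namespace Wikimedia;

class ObjectFactory {
	// ... class body omitted in this sketch ...
}

// Old callers that still reference \ObjectFactory keep working, provided the
// autoloader maps that old name to this same file.
class_alias( ObjectFactory::class, 'ObjectFactory' );
```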
[02:16:39] https://integration.wikimedia.org/ci/job/mediawiki-quibble-vendor-mysql-php70-docker/buildTimeTrend https://integration.wikimedia.org/ci/job/mediawiki-quibble-vendor-mysql-hhvm-docker/buildTimeTrend it looks like php70 is generally faster, but the times are all over the place
[02:24:42] oops
[02:25:54] heh :)
[02:27:40] Reedy: how's https://gerrit.wikimedia.org/r/436715 ? was there anything else you needed to change?
[02:28:17] LGTM
[02:28:22] I think that's all I changed yeah
[02:30:04] yeah, I think I changed my mind on mass moving libraries to >=7.0
[02:30:58] I mean, we don't need to necessarily make releases for them all
[02:31:50] right, that can be done whenever it needs to be
[02:32:18] Maybe that question of whether we want to do a release before bumping to 7, but that can always be done in retrospect too if necessary
[02:34:21] https://gerrit.wikimedia.org/r/436716
[02:37:37] 02:27:08 - Installation request for mediawiki/phpunit-patch-coverage 0.0.8 -> satisfiable by mediawiki/phpunit-patch-coverage[0.0.8].
[02:37:37] 02:27:08 - mediawiki/phpunit-patch-coverage 0.0.8 requires wikimedia/scoped-callback ^1.0.0 -> satisfiable by wikimedia/scoped-callback[v1.0.0] but these conflict with your requirements or minimum-stability.
[02:37:55] does that library need a patch too?
[02:37:57] * Reedy looks
[02:38:27] ugh, I need to fix that job
[02:38:34] in the mean time, yes
[02:39:29] https://gerrit.wikimedia.org/r/#/c/436718/
[02:40:56] ty, I'll tag and update jenkins
[02:41:09] filed https://phabricator.wikimedia.org/T196128 for the long term fix
[02:48:52] legoktm: any comment on https://phabricator.wikimedia.org/T196043 before I go ahead and email 9 contributors?
[02:52:54] TimStarling: I feel pretty stupid for not noticing the incompatibility earlier, after the issues with faststringsearch binaries not being redistributable. For relicensing, we've typically just assumed consent for WMF employees (especially former ones that might be hard to contact)
[02:53:28] And I checked that the FSF also agrees that the licenses are incompatible: https://www.gnu.org/licenses/license-list.html#PHP-3.01
[02:54:04] I could just notify the non-staff contributors
[02:56:00] I think that should be fine
[03:05:41] btw I'm waiting with bated breath for your interview scorecard
[03:06:38] oops, I usually wait a few hours before filling those out, I'll do that now
[03:08:19] My favorite part was how 20 minutes before we were supposed to start, Comcast decided to cut our home internet. Yay for mobile hotspots ^.^
[03:44:57] TimStarling: submitted it
[03:54:36] thanks
[04:34:09] I do wonder where https://salsa.debian.org/mediawiki-team/wikidiff2/blob/7eaf1f37e9f5fa1818fd7e424a8bb13bc17c6619/debian/copyright came from
[04:35:21] https://www.mediawiki.org/w/index.php?diff=205080&oldid=183604&title=Extension:Wikidiff2
[04:37:48] TimStarling: hmm, I think wikidiff2 has been labeled as GPL since the beginning. See https://www.mediawiki.org/wiki/Special:Code/MediaWiki/12986
[04:38:02] > GPL blah blah, see below for history
[04:38:44] uh huh
[04:40:44] + * Some ideas are (and a bit of code) are from from analyze.c, from GNU
[04:40:44] + * diffutils-2.7, which can be found at:
[04:43:47] well, if we are saying Dairiki still owns the copyright then it would be GPL 2: https://sourceforge.net/p/phpwiki/code/HEAD/tree/trunk/lib/diff.php
[04:44:41] How much of that code still exists today?
[04:46:16] https://sourceforge.net/p/phpwiki/code/HEAD/tree/trunk/lib/difflib.php is more relevant
[04:47:31] there are hundreds of lines here that are very similar
[04:48:40] :/
[04:50:03] I guess it has to be GPL then
[04:56:34] ugh
[05:06:09] This means we have to remove it from Debian/Ubuntu :(
[05:15:20] yes
[05:15:55] theoretically we could replace the GPL bits with another diff engine implementing the same interface
[05:16:20] ultimately it all comes from published computer science literature
[05:20:13] I assume other people have already tried to convince PHP to make their license GPL compatible...
[05:28:07] TimStarling: https://gerrit.wikimedia.org/r/436723
[06:10:51] I'll file a follow-up task about the licensing problem
[06:20:11] https://phabricator.wikimedia.org/T196132
[07:25:42] Krinkle: https://phabricator.wikimedia.org/R2355:829c9ef924d552503a2e6c70e108a5317dc7b9c3 and you have commit/shell access to the tool now
[19:00:18] anomie: Krinkle: since I've spent a bit more time looking at the installer, is there any good reason to let people pick MyISAM if they have InnoDB available? When you pick MyISAM, it puts up a giant warning telling you not to pick it >.<
[19:05:03] legoktm: I don't know of any reason for people to use MyISAM anymore.
[19:08:25] Double check with the DBAs and remove it?
[19:10:20] some hosting providers say data can easily be lost with innodb
[19:10:25] when mysql crashes
[19:10:53] Eh?
[19:11:00] InnoDB has better crash recovery...
[19:11:23] well that is strange but a host provider i knew said that
[19:11:24] not me
[19:11:24] I thought the only benefit MyISAM has is full text search?
[19:11:25] :)
[19:11:30] welll
[19:11:37] innodb supports full text search now
[19:11:42] since mysql 5.6?
[19:12:41] MyISAM might still be faster, since it doesn't bother with slow things like transactional integrity. ;)
[19:15:00] hahaha
[19:15:59] OTOH, a quick search indicates that MyISAM might only have table-level locks so that could make it slower when there are many concurrent writers.
[19:26:40] Krinkle: in https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)#Change_coming_to_MonoBook_skin_for_mobile_users there's a note by Graham87 about the jump to links that I think might be related to your change
[19:48:31] legoktm: Not that I want to nerd-snipe you or anything, but https://gerrit.wikimedia.org/r/#/c/66054/ might be your kind of thing… Will need discussion etc.
[19:49:05] ooooh
[19:49:29] Found whilst going through https://gerrit.wikimedia.org/r/#/q/status:open+age:3years which is a bit sad.
[20:46:40] legoktm: replied
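(Editor's note: a sketch tied to the MyISAM/InnoDB installer discussion above. The storage-engine choice ends up as table options in LocalSettings.php; this shows a typical InnoDB configuration, though the exact defaults the installer writes may vary by MediaWiki version.)

```php
<?php
// Illustrative LocalSettings.php fragment, assuming a MySQL/MariaDB backend.
// InnoDB provides row-level locking and crash recovery; MyISAM offers only
// table-level locks and, historically, full-text search (which InnoDB also
// supports since MySQL 5.6), which is why the installer warns against MyISAM.
$wgDBtype = 'mysql';
$wgDBTableOptions = 'ENGINE=InnoDB, DEFAULT CHARSET=binary';
```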