[01:56:34] running mysqld under eatmydata reduces test execution time from 13 to 5 minutes; it becomes CPU-limited [02:00:32] :o [02:00:44] TimStarling: https://gerrit.wikimedia.org/r/plugins/gitiles/integration/quibble/+/master/quibble/backend.py#241 is how we're currently running mysqld [02:01:14] I don't see any eatmydata [02:03:16] but my current aim is to reproduce the test failure at https://integration.wikimedia.org/ci/job/mediawiki-quibble-vendor-mysql-php70-docker/5185/console , so how quibble does it is relevant [02:03:42] TimStarling: it was an attempt to get you to submit a patch to add eatmydata there :) [02:04:42] hmm, ok [02:06:41] TimStarling: are you using quibble/docker when trying to reproduce it? [02:06:49] setting up quibble would hopefully help me to reproduce that bug and also test such a change [02:06:58] no, I'm not using it [02:07:52] https://gerrit.wikimedia.org/r/plugins/gitiles/integration/quibble/+/master/README.rst#46 [02:08:21] yeah, I'm reading that already [02:16:13] James_F: (re 1.33) I think those are the only ones I know of, will add any more if I see them [02:23:10] I've now reproduced my test failure without quibble, so I don't have that as a reason to set it up anymore [02:24:04] bummer :) [02:27:27] Kk. [02:28:33] all you have to do is install the package (libeatmydata1) and add LD_PRELOAD=libeatmydata.so to the environment [02:29:29] the package appears to install the library file into the default library path, so LD_LIBRARY_PATH is not required [05:01:46] TimStarling: I've seen a decent amount of code that runs Language::convert() over the title text before passing it over to LinkRenderer (e.g. https://gerrit.wikimedia.org/g/mediawiki/core/+/master/includes/specials/SpecialMostlinkedcategories.php#87) - is that something LinkRenderer should do by default?
(I don't really understand how/when Language::convert() is supposed to be used) [05:07:41] I guess it could be the default [05:08:06] Parser has its own arrangements; in general we would need to avoid converting twice [05:08:45] the converted title is what the user sees, so in some sense it is the page name for them [05:09:28] the complication with using it in maintenance lists is if you have the original title and the converted title both actually existing [05:09:39] could be confusing [05:10:11] but it's probably better to show it in a language the user understands than to show unique garbage [05:11:17] I think the main issue would be the double converting, because I've seen extensions as well apply convert() before passing to LinkRenderer [05:18:27] the fact that editing is always done in the untranslated content language implies that special pages that are tools for editors don't really need conversion [05:19:09] it's implied that editors have some working knowledge of the content language [05:51:03] thanks, I'll file some tasks [06:31:13] my latest comment on https://gerrit.wikimedia.org/r/c/mediawiki/core/+/455487 has my analysis of the test issue I've been chasing; a bit of a stream of consciousness, but maybe there is something interesting in there [06:31:35] I don't know how to fix it yet, I need to think about it [06:38:31] this is the crux of it: in total, 4989 ServiceContainer instances were created to run 7173 tests, so sharing ServiceContainer instances between tests is evidently not a useful optimisation measure. [11:43:12] [a86187d6440469406e14829e] [no req] Wikimedia\Rdbms\DBUnexpectedError from line 375 of /srv/mediawiki-staging/php-1.32.0-wmf.18/includes/libs/rdbms/database/DBConnRef.php: Database selection is disallowed to enable reuse.
[11:43:23] That error message doesn't make much sense in English [11:44:03] Also, broken addWiki.php [11:45:51] https://phabricator.wikimedia.org/T203154 [11:56:44] oh, I didn't realise wfGetDB() gave a DBConnRef when I +2'd that [11:56:58] I thought callers had to opt in to it [11:58:26] you can just revert https://gerrit.wikimedia.org/r/c/mediawiki/core/+/452616 if you want [11:58:56] I reverted it locally (on deploy1001) to unblock it [11:59:46] Then removed it again after I'd run addWiki [12:16:23] Do you mean revert out of master too? Or is there a fix we can apply to addWiki instead? [12:19:02] The number of different ways I've seen people break addWiki is at this point fairly ridiculous [12:19:30] It seems that, given any two sets of wikis getting created a few months apart, it's more likely than not that by the second set addWiki will have been broken again [12:20:01] I've renamed my old 'it keeps getting broken' ticket [12:21:54] hopefully we can get some tests in that show up when this happens, instead of everyone being unaware until the next time the script is needed [14:23:44] 14:08:59 php -l tests/phan/stubs/tideways.php [14:23:44] 14:08:59 PHP Fatal error: Cannot redeclare tideways_enable() in tests/phan/stubs/tideways.php on line 9 [14:23:44] 14:08:59 Errors parsing tests/phan/stubs/tideways.php [14:23:47] https://integration.wikimedia.org/ci/job/mwgate-php70lint/7612/console [16:22:33] legoktm: TimStarling: Years ago we switched mysql in CI to a path mounted on tmpfs. It was the same month we switched from sqlite to mysql (remember back when we didn't even test CI on mysql? Yeah, that wasn't so long ago actually). [16:22:41] If memory serves it was you actually, Tim, that brought us to that. [16:23:04] I'm sure it was still there when we moved from gallium to labs slaves, and also still there in Nodepool. [16:23:20] Which means if it isn't there now, it must've been lost only a month or two ago with Quibble/docker.
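[editor's note] A quick way to verify whether the tmpfs setup described above is still in effect is to check which filesystem backs the MySQL datadir. A minimal Python sketch (the helper name is illustrative, not anything from Quibble) that reads the mount table from /proc/mounts on Linux:

```python
import os

def fs_type(path):
    """Return the filesystem type of the mount backing `path` (Linux only)."""
    path = os.path.realpath(path)
    best, best_type = '', None
    with open('/proc/mounts') as f:
        for line in f:
            _dev, mnt, fstype = line.split()[:3]
            # The longest mount-point prefix of `path` is the backing mount.
            if (path == mnt or path.startswith(mnt.rstrip('/') + '/')) \
                    and len(mnt) > len(best):
                best, best_type = mnt, fstype
    return best_type

# On a correctly configured CI slave, the mysql datadir would report 'tmpfs';
# /dev/shm is used here only as a path that is typically tmpfs-backed.
print(fs_type('/dev/shm'))
```

A datadir reporting 'ext4' or 'overlay' instead of 'tmpfs' would indicate the optimisation was lost, which is what happened in the Docker+Quibble migration.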
[17:57:32] TimStarling: Checked with Antoine, indeed, we lost it in the recent Nodepool>Docker+Quibble migration. https://phabricator.wikimedia.org/T203181 [17:58:52] thank you for noticing that! [23:58:50] looks like an easy fix judging by hashar's comment
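[editor's note] The eatmydata half of the fix is just environment plumbing: install libeatmydata1 and preload the library when spawning mysqld, as Tim describes earlier in the log. A minimal Python sketch of how a runner could do that (the function name and the use of `echo` in place of `mysqld` are illustrative, not Quibble's actual code):

```python
import os
import subprocess

def spawn_with_eatmydata(cmd):
    """Run cmd with libeatmydata preloaded so fsync()/fdatasync() become no-ops."""
    env = dict(os.environ, LD_PRELOAD='libeatmydata.so')
    return subprocess.run(cmd, env=env, capture_output=True, text=True)

# Stand-in for launching mysqld: just show the variable reaches the child process.
result = spawn_with_eatmydata(['sh', '-c', 'echo "$LD_PRELOAD"'])
print(result.stdout.strip())  # → libeatmydata.so
```

No LD_LIBRARY_PATH is needed because, per the discussion above, the package installs the library into the default library path.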