[02:21:02] Hm.. job insertion errors should be rare but this suggests it might not be https://phabricator.wikimedia.org/T204183
[02:21:28] But it's hard to compare, we have much better logging now, but I don't recall seeing redis insertion errors in the past.
[08:39:57] Is there a list of repos somewhere that mediawiki-vendor is meant to serve / contain the libs for?
[08:40:21] * addshore goes to find a way to get all wmf deployed extensions with a composer.json that actually requires some package
[08:50:04] This will do :D ... find . -name 'composer.json' -exec grep -H "require\"" {} \;
[08:52:49] * addshore leaves his finished version in the irc logs for searchability...
[08:52:50] find . -name 'composer.json' -exec grep -H "require\"" {} \; | sed 's/:.*//' | sed 's/\/composer.*//' | sed 's/\.\///'
[08:55:33] addshore: mediawiki/vendor would hold dependencies for the extensions deployed on wmf. The reference list of extensions would be in mediawiki/tools/release in make-wmf-branch
[08:55:44] addshore: surely CI should be switched to default to composer
[09:00:17] hashar: well, I just finished writing some crappy script to compare what is in mediawiki-vendor vs what will actually be installed if you install all of our extensions
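(For reference: a minimal sketch of the kind of comparison being described — list every package the deployed extensions require, then diff that against what mediawiki/vendor pins. The paths, the use of jq, and the filtering are assumptions, not addshore's actual script.)

```bash
#!/bin/bash
# Hypothetical comparison of "what extensions require" vs "what vendor pins".
# Assumes a mediawiki/core checkout with extensions/ populated and a sibling
# mediawiki/vendor checkout; both paths are placeholders.
set -euo pipefail
CORE=~/mediawiki
VENDOR=~/mediawiki-vendor

# Package names required anywhere in core or its extensions (skipping
# platform requirements like "php" and "ext-*").
find "$CORE" -name composer.json -not -path '*/vendor/*' \
  -exec jq -r '.require // {} | keys[]' {} \; \
  | grep -vE '^(php|ext-)' | sort -u > /tmp/required.txt

# Package names pinned in mediawiki/vendor's own composer.json.
jq -r '.require | keys[]' "$VENDOR/composer.json" | sort -u > /tmp/pinned.txt

# Lines marked "<" are required but missing from vendor; ">" is the reverse.
diff /tmp/required.txt /tmp/pinned.txt || true
```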
[09:01:24] oh
[09:01:37] addshore: oh also there is a bit more madness going on
[09:01:57] hehe, of course :P
[09:02:09] mediawiki/vendor would hold all dependencies for deployed extensions
[09:02:14] that is for the master and wmf/ branches
[09:02:18] yuppp
[09:02:31] but mediawiki/vendor RELXXX branches only hold dependencies for the extensions in the mediawiki tarball
[09:02:47] so if you have an extension that is not in the tarball and the extension has dependencies via composer
[09:02:52] * addshore cares about https://phabricator.wikimedia.org/T179663
[09:02:56] the job will fail on the REL branch because vendor lacks the dependencies
[09:03:06] there is a task about it somewhere. And probably we should just use composer instead
[09:03:30] and iirc
[09:03:30] heh
[09:03:31] The requested package monolog/monolog ~1.18.2 exists as monolog/monolog[1.22.1] but these are rejected by your constraint.
[09:03:38] doh
[09:04:01] DonationInterface has "monolog/monolog": "~1.18.2", but apparently that is not installable
[09:04:03] the quibble jobs using vendor do not run composer install, though they do install the require-dev dependencies
[09:04:32] DonationInterface is a special case. Its master branch is tested with vendor and core branch fundraising/REL1_27
[09:04:43] because the fundraising cluster runs on a fork of REL1_27
[09:04:52] okay, I'll just take DonationInterface out of my list for now :)
[09:04:55] * addshore re-runs
[09:04:57] yeah ;]
[09:05:30] https://phabricator.wikimedia.org/T203084 has the quibble / zuul-cloner command being used
[09:06:05] (note: quibble is passed parameters that are then passed to zuul-cloner: --branch REL1_27 --project-branch mediawiki/extensions/DonationInterface=master --project-branch mediawiki/vendor=fundraising/REL1_27)
[09:06:16] evil DonationInterface
[09:06:36] yeah it is a special case really
[09:13:26] I could pretty easily write a jenkins job for the hacky thing I have just written, and then have it running daily
[09:13:40] ah
[09:14:00] addshore: so for that in integration/config there are some python tests that are run on a daily basis
[09:14:20] really? comparing mediawiki-vendor and what should be there? or?
[09:14:25] which generates some dashboard https://integration.wikimedia.org/ci/job/integration-config-qa/lastCompletedBuild/testReport/
[09:14:44] aah, okay
[09:14:48] but that does not clone mediawiki-vendor :]
[09:15:01] I mean, this script is shitty, but it works :P
[09:15:05] good!
[09:15:17] I guess you can just create yet another jenkins job then
[09:15:23] yup
[09:18:18] I'm dumb
[09:21:19] https://www.irccloud.com/pastebin/iV25Ch7D/
[09:21:20] hashar: ^^
[09:21:27] those are the current differences then :)
[09:21:44] diff between ?
[09:21:54] master (being core and extensions master) and vendor
[09:22:07] I guess because some are not pinned
[09:22:21] so if you take all extensions that are deployed and have a composer.json and install them using the composer merge plugin, VS what is pinned in mediawiki-vendor
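(For reference: the "composer merge plugin" mentioned here is wikimedia/composer-merge-plugin, which MediaWiki core uses so a single composer install can also resolve extension requirements. A sketch of a composer.local.json that merges every extension's and skin's composer.json — the exact globs are an assumption, though the composer.local.json-sample shipped with core is similar.)

```json
{
    "extra": {
        "merge-plugin": {
            "include": [
                "extensions/*/composer.json",
                "skins/*/composer.json"
            ]
        }
    }
}
```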
[09:22:30] probably we could make that a job for mediawiki/core and mediawiki/vendor to ensure master/wmf branches are in sync
[09:22:34] or updated and mediawiki-vendor was not updated
[09:22:45] hmm
[09:22:52] I would guess some of those deps are not explicitly pinned
[09:23:02] maybe james-heinrich/getid3 has "1.9.*"
[09:23:42] well, james-heinrich/getid3 isn't in mediawiki/composer.json; it is required only by extensions it would seem
[09:23:59] TimedMediaHandler/composer.json: "james-heinrich/getid3": "^v1.9"
[09:24:01] "james-heinrich/getid3": "^v1.9" in TimedMediaHandler
[09:24:17] JADE/composer.json: "justinrainbow/json-schema": "~5.2"
[09:24:17] Kartographer/composer.json: "justinrainbow/json-schema": "~5.2"
[09:24:19] but that's fine, versions don't have to be pinned there
[09:24:26] only in mediawiki-vendor
[09:24:43] my main point is though, someone really has to keep an eye on mediawiki-vendor
[09:26:42] I guess I should make the patches to update those libs in mediawiki-vendor now
[09:27:27] the status of that job should basically be checked before each branch cut, and mediawiki-vendor fixed if needed
[09:32:14] * addshore made 3 patches for those libs
[10:41:31] addshore: going to have a nap and I will look at adding WikibaseLexeme & all to the extensions gate
[11:56:38] addshore: The last release of getid3 was broken
[11:56:51] https://github.com/JamesHeinrich/getID3/releases
[11:57:16] Only in demos, but still
[11:57:17] https://github.com/JamesHeinrich/getID3/commit/adb28c2e69ec0c6ca722e8363016d8a042069b37
[11:57:29] So I added PHP linting
[11:57:29] https://github.com/JamesHeinrich/getID3/commit/1f9135a8a59184402c7e47f3c95b9e70d5369d28
[11:57:36] haha
[11:57:40] And made travis do shit
[11:57:41] https://github.com/JamesHeinrich/getID3/commit/f178850e6ccbee87e47c10361585d5cc99f679db
[11:58:12] Really, what we need is to get something like https://phabricator.wikimedia.org/T180278 actually done
[11:58:21] So we know if we have actual issues, or just version number creep :P
[12:00:22] addshore: Of course...
[12:00:30] WMDE/Wikidata is the worst offender in vendor
[12:01:01] 6/28 are yours
[12:01:56] Oh, probably more
[12:02:03] Because some of the symfony stuff are WMDE stuff :P
[12:02:23] I'm happy to do a bump on the pear stuff at least
[12:07:12] https://gerrit.wikimedia.org/r/#/c/mediawiki/vendor/+/460336/
[12:07:19] Core doesn't pin them, so nothing to update there
[12:08:53] addshore: https://github.com/justinrainbow/json-schema/pull/495 looks fine to bring in
[12:09:51] Reedy: really the extension that requires getid3 should state it doesn't want this broken version
[12:09:58] then in the script I just wrote, it won't get pulled in
[12:10:10] lol
[12:10:13] It's not a big deal
[12:10:19] no, but then the script works :P
[12:10:19] We could bring it in, and just delete the demo
[12:10:28] But the script has only just been created, sooo
[12:10:38] I actually already added the demo dir to the gitignore of mediawiki-vendor
[12:10:48] That works too
[12:10:52] because, I thought, why on earth are we pulling demo stuff in :P
[12:11:05] that's already in the PS for mediawiki-vendor
[12:11:17] Why on earth are they including them in their exports? :P
[12:11:37] because people know nothing
[12:11:56] Are they all Jon Snow?
[12:12:01] indeed
[12:12:07] oooh, when does GoT start again..
[12:12:13] Fucking ages, I think
[12:12:17] upstream patches a-go-go
[12:18:45] addshore: https://gerrit.wikimedia.org/r/#/c/mediawiki/vendor/+/460339/
[12:19:31] that was confusing me for a minute
[12:19:48] heh
[12:19:56] like, why is that committed...
[12:21:26] damn it
[12:21:28] trailing commas
[12:25:20] 12:21:52 The lock file is not up to date with the latest changes in composer.json, it is recommended that you run `composer update`.
[12:25:26] composer update is a noop
[12:27:12] typical, validate gives different errors locally
[12:32:42] :D
[12:33:18] wait and see what fails in the end
[12:43:26] srsly, what is this shit
[12:44:40] hashar: Anything special about mwgate-composer-hhvm-docker ?
[12:45:50] Reedy: it runs composer in docker ? :]
[12:45:52] what is happening?
[12:46:02] It's failing with 12:21:52 The lock file is not up to date with the latest changes in composer.json, it is recommended that you run `composer update`.
[12:46:06] on https://gerrit.wikimedia.org/r/#/c/mediawiki/vendor/+/460339/
[12:46:13] But if I run composer validate locally, no error
[12:46:23] and composer update is a noop
[12:46:55] try locally ?
[12:47:11] I did, no error :P
[12:56:55] Reedy: maybe it is a stale cache :/
[12:57:11] silly thing
[13:08:59] well
[13:09:01] cache pruned
[13:09:04] but https://integration.wikimedia.org/ci/job/mwgate-composer-hhvm-docker/1318/console failed nonetheless
[13:10:15] Could obviously be a composer bug
[13:12:40] * Reedy downgrades composer to check
[13:14:56] Nope, 1.6.5 locally doesn't give that error
[13:15:46] Annoying
[13:16:08] Reedy: seems to be an issue with that patch
[13:16:18] I debugged on the slave directly
[13:16:32] ssh integration-slave-docker-1010.integration.eqiad.wmflabs
[13:16:38] cd /srv/jenkins-workspace/workspace/mwgate-composer-hhvm-docker
[13:16:42] sudo docker run --rm -it --workdir=/src --entrypoint=composer --volume /srv/jenkins-workspace/workspace/mwgate-composer-hhvm-docker/src:/src docker-registry.wikimedia.org/releng/composer-test-hhvm:0.2.5 --ansi validate --no-check-publish -vvvv
[13:16:50] that fails with the patch you sent
[13:17:20] trying again with: git fetch https://gerrit.wikimedia.org/r/mediawiki/vendor refs/changes/39/460339/2 && git checkout FETCH_HEAD
[13:17:25] yeah that patch fails composer validate
[13:17:33] it doesn't locally though
[13:17:44] but master works fine
[13:17:48] there's the usual...
[13:17:48] Remove jakub-onderka/php-console-highlighter from require-dev
[13:17:48] It's already in require, so having both is confusing and redundant
[13:17:50] ffs
[13:17:53] name : The property name is required
[13:17:53] description : The property description is required
[13:18:43] hashar: What does `composer update --no-dev` change?
[13:19:51] With or without --no-dev locally there are no lock file changes
[13:21:19] trying
[13:24:40] Reedy: https://phabricator.wikimedia.org/P7542
[13:25:06] That just looks like php version differences
[13:25:13] uh, composer version differences
[13:25:18] the container uses Composer version 1.6.5 2018-05-04 11:44:59
[13:25:23] which is the one from integration/composer
[13:26:10] so yeah composer.lock is changed
[13:26:15] and would need to be committed in
[13:26:25] Hmm
[13:26:36] bloody compoters
[13:26:38] *computers
[13:27:07] https://github.com/wikimedia/mediawiki-vendor/commit/8c405f8f2ff3facd980563fd20f8f7878d27ec3e
[13:27:21] I think we need to ask volker what version of composer he's running...
[13:27:25] see also https://gerrit.wikimedia.org/r/#/c/mediawiki/vendor/+/460336/2/composer.lock :]
[13:29:44] bleugh :)
[13:29:45] Cheers
[13:29:59] https://gerrit.wikimedia.org/r/460357 to update the readme to tell people to use at least 1.6.5 too
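(For reference: the diagnosis above is that composer.lock had been generated by a different composer release than the 1.6.5 CI runs, so validate disagreed between the two environments. A sketch of the workflow the readme change implies — the self-update pin and the --lock-only refresh are assumptions, not commands from the log, though both flags are real composer options.)

```bash
composer --version                     # CI runs "Composer version 1.6.5 2018-05-04 11:44:59"
composer self-update 1.6.5             # pin the local copy to the same release as CI
composer update --lock                 # refresh only composer.lock's hash, without changing deps
composer validate --no-check-publish   # should now agree with the mwgate job
```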
[13:41:42] hashar: heh, new error
[13:41:43] 13:27:22 Warning: fork failed - Cannot allocate memory in /src/jakub-onderka/php-parallel-lint/src/Process/Process.php on line 47
[13:41:43] 13:27:22 Cannot create new process php -d asp_tags=Off -d short_open_tag=Off -d error_reporting=E_ALL -n -l './elasticsearch/elasticsearch/src/Elasticsearch/Endpoints/Tasks/Get.php'
[13:41:43] 13:27:22 Script parallel-lint --exclude composer/autoload_static.php --exclude jakub-onderka/php-parallel-lint/tests --exclude symfony/var-dumper/Tests/Fixtures --exclude psy/psysh/test . handling the test event returned with error code 255
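(For reference: the fork failure means the container ran out of memory while the linter spawned its worker php processes; parallel-lint runs 10 jobs in parallel by default. One plausible local workaround — not what the job actually did — is to lower the job count with -j.)

```bash
# Hypothetical: rerun the same lint with fewer concurrent php processes so
# each fork needs less memory headroom. The script path inside the
# mediawiki/vendor checkout is an assumption.
php jakub-onderka/php-parallel-lint/parallel-lint -j 2 \
    --exclude composer/autoload_static.php \
    --exclude jakub-onderka/php-parallel-lint/tests \
    --exclude symfony/var-dumper/Tests/Fixtures \
    --exclude psy/psysh/test .
```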
[14:24:07] addshore: Want to re-CR+2 https://gerrit.wikimedia.org/r/#/c/mediawiki/vendor/+/460339/ please?
[14:24:13] Looks like tests are passing finally :P
[14:27:26] I guess
[14:28:44] Then need to rebase shit
[16:22:12] addshore: https://gerrit.wikimedia.org/r/460336
[16:26:39] DanielK_WMDE: When you have a chance, I'd appreciate feedback on T204158.
[16:26:40] T204158: Review namespacing of MCR classes - https://phabricator.wikimedia.org/T204158
[19:48:12] and this is why we pin dependencies >.<
[19:49:01] legoktm: Eesh, yes.
[19:57:10] gahh, pear's bug tracker just threw away my report because I was supposed to log in first, even though the instructions said you didn't need to log in if you provided an email address >.<
[20:05:05] https://pear.php.net/bugs/bug.php?id=23768
[20:06:01] (yes, pear doesn't let you pick a PHP version higher than 5.6)
[22:44:24] legoktm: https://github.com/pear/Net_Socket/tags
[22:44:28] Looks like someone has deleted it
[22:45:13] Or.. not
[22:45:50] ah, it's just github display
[23:05:19] TimStarling: Hm.. from your comment on https://phabricator.wikimedia.org/T204010, it sounds like the Lucene query bar is equivalent to "query" in DSL, is that right?
[23:06:13] Which means it would be possible to put a complex query like (foo:x OR bar:y) in a filter (and thus make it label-able and disable-able) by using DSL, even though it would normally have to occupy the Lucene bar on the dashboard.
[23:18:32] yes, true, it is possible to run that query using the query bar
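(For reference: the equivalence being confirmed here — in Elasticsearch DSL, a query_string query accepts the same Lucene syntax as the dashboard's query bar, so the same expression can live in a labelled, toggleable filter instead. A sketch using the hypothetical foo/bar fields from the example above.)

```json
{
    "query": {
        "query_string": {
            "query": "(foo:x OR bar:y)"
        }
    }
}
```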