[13:59:33] wm-labs-meetbot`: ping
[13:59:33] pong
[13:59:36] hurrah
[14:01:06] \o
[14:01:25] #startmeeting CI weekly triage
[14:01:34] Meeting started Tue May 12 14:01:25 2015 UTC and is due to finish in 60 minutes. The chair is hashar. Information about MeetBot at http://wiki.debian.org/MeetBot.
[14:01:34] Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
[14:01:34] The meeting name has been set to 'ci_weekly_triage'
[14:01:34] ...
[14:01:37] o/
[14:01:43] jzerebecki: zeljkof Krinkle|detached :D
[14:02:00] hey
[14:02:11] hashar: good afternoon
[14:02:21] #link https://www.mediawiki.org/wiki/Continuous_integration/Meetings/2015-05-12 Agenda
[14:02:39] legoktm :)
[14:02:46] so first, my apologies for last week's screw-up
[14:03:11] I wasn't available and was overdue on sending the minutes / mail announcement that would have made it clear that the meeting was cancelled
[14:03:37] #topic Actions retrospective
[14:03:55] from last week there were a bunch of tasks, all solved :}
[14:04:06] #info past week minutes copied on the wiki
[14:04:18] #info All jobs now logrotate their build history
[14:04:31] one was: bump Zuul version to incorporate the zuul-cloner patch that hard links from git cache https://phabricator.wikimedia.org/T97106
[14:04:49] we have a Debian package now
[14:05:02] so we can "easily" cherry-pick patch sets and push them to all our machines
[14:05:08] such as to update zuul-cloner on the instances
[14:05:28] there are a few more patches pending though
[14:05:31] but that is ongoing
[14:05:40] anything to add from the previous meeting?
[14:06:54] next :D
[14:07:11] #topic Salt!
[14:07:38] #info https://phabricator.wikimedia.org/T87819 We have a salt master on labs, integration-saltmaster.eqiad.wmflabs, which makes it easier to run commands everywhere
[14:07:55] #link https://wikitech.wikimedia.org/wiki/Salt Basic salt documentation.
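[Editor's note: the salt usage walked through in this topic boils down to the following session. This is a sketch, not an exact job definition: the targeting patterns and hostnames are the examples given in the meeting, and output will vary per instance.]

```shell
# On the labs saltmaster (after: ssh integration-saltmaster.eqiad.wmflabs; sudo -s)

# Run a command on every connected minion
salt '*' cmd.run hostname

# Target by grain: instances whose hostname contains "-trusty-"
salt -G 'host:*-trusty-*' cmd.run 'hostname; zuul-cloner --version'

# List the grains (facts, much like Puppet facts) each minion exposes
salt '*' grains.items

# Target by the oscodename grain
salt -G 'oscodename:trusty' cmd.run hostname
```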
[14:08:22] that is probably of interest to legoktm and Krinkle :]
[14:08:42] in short, usage is: ssh integration-saltmaster.eqiad.wmflabs
[14:08:43] sudo -s
[14:09:19] run a command on all instances: salt '*' cmd.run hostname
[14:09:59] For instances having trusty in their hostname: salt -G 'host:*-trusty-*' cmd.run 'hostname; zuul-cloner --version'
[14:11:35] each salt client has grains, much like Puppet facts. They can be listed with: salt '*' grains.items
[14:11:47] so we can query all instances having the trusty OS:
[14:11:48] salt -G 'oscodename:trusty' cmd.run hostname
[14:11:49] etc
[14:12:21] Krinkle|detached: salt should be able to replace your dsh / mass ssh commands :-}
[14:12:28] #topic Triage
[14:12:51] #link https://phabricator.wikimedia.org/project/board/1208/ Continuous-Integration-Config
[14:13:07] jzerebecki: were you all fine with the Wikibase extension change you did last night?
[14:13:19] hashar: yes
[14:13:25] seems you ended up waiting for +2 / zuul deploy
[14:13:46] yes thank you. very fast turnaround time :)
[14:13:50] maybe we should grant you +2 and shell access on the zuul server?
[14:13:56] cause you definitely have a ton of experience :}
[14:14:02] i'd like that
[14:16:02] #action Grant jzerebecki +2 on integration/config and deploy access for Zuul changes. https://phabricator.wikimedia.org/T98865
[14:16:10] and the task is filed :}
[14:16:34] from the config dashboard
[14:16:43] #link https://phabricator.wikimedia.org/T92871 Parsoid patches don't update Beta Cluster automatically -- only deploy repo patches seem to update that code
[14:17:01] Parsoid has two repositories
[14:17:10] one holding the source code for devs ( mediawiki/services/parsoid )
[14:17:29] and another one meant to deploy the code, which has the source repo as a submodule ( mediawiki/services/parsoid/deploy )
[14:17:58] #info it is up to the devs to decide whether they prefer the tip of the dev repo or the deploy one to run on beta cluster
[14:18:03] so it is in their hands
[14:18:47] zeljkof: for CI we now have two Phabricator projects
[14:18:54] one being Continuous-Integration-Config, to change Jenkins jobs
[14:18:59] hashar: I have noticed
[14:19:05] maybe you can get some of the browser-tests tasks added to that project?
[14:19:13] they are closely related
[14:19:23] hashar: looking...
[14:19:24] and usually all about tweaking jjb yaml files
[14:19:46] the idea was to split requests from devs from our "internal" work to maintain the CI infra
[14:20:02] but also zuul yaml files?
[14:20:03] so Continuous-Integration-Config has fewer tasks and is easier to triage/act on
[14:20:07] jzerebecki: yeah, those as well
[14:20:26] you can see a few tasks requesting to add jobs on various repos
[14:20:56] configuring browser tests jobs can most probably land there as well. It "just" needs JJB knowledge
[14:23:14] looking at the board, there is a lot of low-hanging fruit
[14:24:15] #info We might want to flag browser-tests configuration tasks with #Continuous-Integration-Config since most changes are trivial and that would offload the browser test board a bit
[14:24:37] jzerebecki: does WMDE handle CI configuration changes in their sprints?
[14:24:45] I noticed some are in the Wikidata project as well
[14:25:10] jzerebecki: yes we add them to wikidata and to wikidata-sprint-projects
[14:25:33] hashar: found just one browser-tests task that is already in cont-int-config https://phabricator.wikimedia.org/T98477
[14:25:58] zeljkof: great! I guess you can keep flagging them to send notifications to other JJB folks
[14:26:01] (if watching)
[14:26:31] hey
[14:26:36] I love Phabricator
[14:26:46] CI config + Wikidata = https://phabricator.wikimedia.org/maniphest/query/Yn1W.2PJKvdE/#R
[14:27:15] :)
[14:29:06] so
[14:29:13] #link https://phabricator.wikimedia.org/T93404 run phpunit ResourcesTest from core for wikibase
[14:29:41] that is a structure test and I am pretty sure it is run by default for core
[14:29:45] and for extensions as well
[14:31:34] not sure about the general extension templates, but wikibase is using the phpunit group argument, thus those tests are not run
[14:31:50] ahh
[14:32:31] jzerebecki: I think you want: php phpunit.php --testsuite structure
[14:32:48] yes, something like that
[14:33:01] there is also a suite to test .less files
[14:33:08] but it is in the 'extensions' testsuite
[14:33:16] our PHPUnit suites are a mess :/
[14:33:22] (phab is really slow for me right now)
[14:34:09] noted in the ticket
[14:34:41] talking about messes... https://phabricator.wikimedia.org/T97560
[14:35:01] #link https://phabricator.wikimedia.org/T97560 enable use of a composer-created autoloader in extensions deployed to production
[14:35:14] jzerebecki: I have bailed out on composer / extension dependencies
[14:35:15] it would be nice if wikibase and other wikidata extensions could be handled like any other mediawiki extension
[14:35:32] I have no clue how to handle them properly but potentially legoktm and bd808 would have more information
[14:35:51] mind you, I have no idea how dependencies are handled for extensions right now
[14:36:02] potentially that is still using the old system of cloning all dependent extensions
[14:36:06] or for other extensions to be able to use more features from composer
[14:36:11] with third-party libs in mediawiki/vendor
[14:36:17] yup
[14:36:25] exactly, it is all still the old way
[14:36:35] one of the problems though is that mediawiki/core has its own composer file
[14:36:35] even for wikidata extensions
[14:36:45] and you can have several extensions together, each having their own composer files
[14:36:52] + we have some dependencies in mediawiki/vendor
[14:36:58] basically clone all dependent extensions via zuul, then run composer for each of them
[14:37:03] oh
[14:37:15] s/zuul/zuul-cloner/
[14:37:19] so you might end up with the same lib fetched/installed multiple times, right?
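[Editor's note: the "clone everything, then run composer per checkout" dance described above could look roughly like the following. This is a sketch under assumptions: the repository list and paths are illustrative, not the actual Jenkins job configuration.]

```shell
# Clone core plus dependent extensions with zuul-cloner
# (illustrative repository list, not the real job definition)
zuul-cloner https://gerrit.wikimedia.org/r \
    mediawiki/core \
    mediawiki/extensions/Wikibase

# Then run composer separately in every checkout that has a composer.json,
# which is how the same library can end up installed multiple times
for dir in mediawiki/core mediawiki/extensions/*/; do
    if [ -f "$dir/composer.json" ]; then
        ( cd "$dir" && composer install )
    fi
done
```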
[14:37:24] exactly
[14:37:29] I am not sure how the autoloading behaves in such a case
[14:37:38] it should work
[14:37:47] but that is still not nice
[14:38:12] :(
[14:38:24] also for wmf deployment, I think we get them in mediawiki/vendor
[14:38:27] so there is this composer-merge-plugin which would allow us to only run composer once and thus also only get one copy of each dependency
[14:38:41] so potentially we would want to test libs provided via composer as well as against mediawiki/vendor
[14:39:03] merge-plugin, great
[14:39:04] and more importantly correctly resolve, or fail, dependencies on the same component with different version requirements
[14:39:43] so it crafts a new composer file that has the combined dependencies of all repos involved?
[14:39:46] but the merge-plugin does not support autoloader settings in extensions, it only merges the dependencies of the extensions
[14:39:52] exactly
[14:40:43] so it was not intended to, and does not, work for wikibase and other extensions that themselves use composer to generate their autoloader
[14:40:54] that is what the ticket is about
[14:41:17] ahhh
[14:41:35] bringing in dependencies sounds like an epic task :/
[14:42:11] if that feature is implemented then we could integrate this into Wikimedia's CI, build and deployment processes and handle all extensions the same
[14:43:01] so I guess the issue at https://github.com/wikimedia/composer-merge-plugin/issues/18 needs to be pointed to legoktm and bd808 for implementation
[14:43:03] right?
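[Editor's note: for context, composer-merge-plugin is driven from the root composer.json roughly as below. The include globs and version constraint are assumptions for illustration; as noted in the discussion, the plugin merges the listed files' dependencies but not the extensions' own autoload settings.]

```json
{
    "require": {
        "wikimedia/composer-merge-plugin": "^1.3"
    },
    "extra": {
        "merge-plugin": {
            "include": [
                "extensions/*/composer.json",
                "skins/*/composer.json"
            ]
        }
    }
}
```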
[14:43:03] run composer once during CI on master, build output to mediawiki/vendor, use vendor repo for wmf branches
[14:43:24] sounds good
[14:43:38] You can tell meetbot about it with: #agreed run composer once during CI on master, build output to mediawiki/vendor, use vendor repo for wmf branches
[14:43:39] :}
[14:43:40] yes, except bd808 tells me that is not his job
[14:43:56] but probably the job of the rel-eng team
[14:44:00] #action the issue at https://github.com/wikimedia/composer-merge-plugin/issues/18 needs to be pointed to legoktm and bd808, or find a champion to implement it.
[14:44:02] yeah probably
[14:44:09] we can sort it out
[14:44:21] thx :)
[14:44:52] can you write the #agreed I pasted above? :) so meetbot will assign you the authorship
[14:45:12] #agree
[14:45:28] jzerebecki: na, you want to paste: #agreed run composer once during CI on master, build output to mediawiki/vendor, use vendor repo for wmf branches
[14:45:29] :}
[14:45:42] #agreed run composer once during CI on master, build output to mediawiki/vendor, use vendor repo for wmf branches
[14:46:09] #info Added point to weekly RelEng meeting
[14:46:18] will talk about it during our weekly meeting in an hour
[14:46:45] bd808's point stands. It is really not in his area of responsibility and he is quite busy
[14:48:02] jzerebecki: OpenStack has the equivalent of a global composer.json
[14:48:13] so all projects are tested with their own composer.json
[14:48:27] and then with the global composer.json (which probably has different versions)
[14:48:48] on release, they use that global composer.json and know all their bricks pass with it :}
[14:49:06] so that global composer file may be used to generate mediawiki/vendor
[14:49:29] #link https://github.com/openstack/requirements#global-requirements-for-openstack-projects OpenStack global requirements, more or less a common composer.json
[14:51:00] I'm not sure if composer-merge-plugin can actually hook all the things it would need to to make issue 18 possible or not
[14:51:14] It may actually take upstream work with composer
[14:51:19] yup, we already have that situation. running composer-merge-plugin in core with all extensions will complain if extensions specify incompatible dependencies
[14:52:04] conflicts we want to catch as early as possible so we don't have to deal with them on beta cluster or, worse, on prod
[14:53:21] bd808: and that brings back the topic of who is in charge of mediawiki/core ;-}
[14:53:44] yes, then there is the problem that we cannot commit to multiple extension repos at the same time, so we might need to change one, which breaks the mediawiki/vendor build, then commit to the rest to make it all consistent
[14:53:44] we all are.
the whole FLOSS MediaWiki community
[14:54:56] But for this question of who will fix the 2 extensions that decided to do everything differently, I'd lean towards it being the responsibility of those extension maintainers
[14:55:42] I think that composer is a good future for our php package management needs
[14:55:48] but there is work to do still
[14:56:15] jzerebecki: in theory changes should be backwards compatible :/
[14:56:23] jzerebecki: though it is not always possible
[14:56:49] hashar: core breaks backwards compat all the time
[14:56:56] ok so in short, I bring the topic to the releng meeting and we try to find a champion to polish up the composer things
[14:57:04] jzerebecki: yeah and that is a problem
[14:57:36] I worked last year on having extension tests run for mediawiki/core patchsets
[14:57:42] but that did not play well :/
[14:57:48] need another sprint or two to finish that
[14:58:03] but in theory, we could prevent core from introducing breaking changes
[14:58:17] hashar: fixing all extensions to work with the compat break and then making the compat break is fine
[14:58:17] by having extensions upgraded first, then the core change merged
[14:58:25] yeah
[14:58:29] if we are going to move forward there are going to be breaking changes
[14:58:32] but it is not enforced by CI right now :(
[14:59:35] so I think we covered it all
[14:59:40] for any extension that is in mediawiki-extensions-hhvm it is enforced, right?
[14:59:50] jzerebecki: yes
[14:59:59] but it is only a slice of what we run :/
[15:00:09] yup, much work to do
[15:02:02] and you are right, we could do the same dance we described above for core also with composer dependencies, so we never get into a state of a broken master/build
[15:02:33] ideally, yeah
[15:02:40] hope we can talk about it during the hackathon
[15:02:45] for now, end of meeting I guess :}
[15:04:29] #endmeeting
[15:04:30] Meeting ended Tue May 12 15:04:29 2015 UTC. Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4)
[15:04:30] Minutes: https://tools.wmflabs.org/meetbot/wikimedia-office/2015/wikimedia-office.2015-05-12-14.01.html
[15:04:30] Minutes (text): https://tools.wmflabs.org/meetbot/wikimedia-office/2015/wikimedia-office.2015-05-12-14.01.txt
[15:04:30] Minutes (wiki): https://tools.wmflabs.org/meetbot/wikimedia-office/2015/wikimedia-office.2015-05-12-14.01.wiki
[15:04:30] Log: https://tools.wmflabs.org/meetbot/wikimedia-office/2015/wikimedia-office.2015-05-12-14.01.log.html
[15:04:35] jzerebecki: bd808 thank you!
[15:05:22] thank you hashar
[15:08:43] hashar: post the minutes while they are still fresh ;)
[15:09:58] yeah
[15:10:24] zeljkof: thanks for the reminder