[00:03:26] (03PS3) 10Legoktm: seccheck: Install plugin into /opt/phan [integration/config] - 10https://gerrit.wikimedia.org/r/459265 (https://phabricator.wikimedia.org/T202386)
[00:07:22] (03CR) 10Legoktm: [C: 032] seccheck: Install plugin into /opt/phan [integration/config] - 10https://gerrit.wikimedia.org/r/459265 (https://phabricator.wikimedia.org/T202386) (owner: 10Legoktm)
[00:08:52] (03Merged) 10jenkins-bot: seccheck: Install plugin into /opt/phan [integration/config] - 10https://gerrit.wikimedia.org/r/459265 (https://phabricator.wikimedia.org/T202386) (owner: 10Legoktm)
[00:11:57] !log updating docker-pkg for https://gerrit.wikimedia.org/r/459265
[00:12:00] Logged the message at https://wikitech.wikimedia.org/wiki/Release_Engineering/SAL
[00:16:14] (03PS1) 10Legoktm: Use mediawiki-phan-seccheck:0.3.0 [integration/config] - 10https://gerrit.wikimedia.org/r/459267
[00:17:27] PROBLEM - Puppet errors on deployment-certcentral-testdns is CRITICAL: CRITICAL: 22.22% of data above the critical threshold [0.0]
[00:52:02] 10Diffusion, 10Repository-Admins, 10User-Luke081515: Connect/Link Phabricator/Maniphest projects to Diffusion repositories - https://phabricator.wikimedia.org/T120915 (10MGChecker) I would like to assist at this if it's fine to you. Especially if I'm creating new projects for extensions, I would like to tag...
[02:20:14] (03CR) 10Legoktm: [C: 032] "INFO:jenkins_jobs.builder:Reconfiguring jenkins job mwext-php70-phan-seccheck-docker" [integration/config] - 10https://gerrit.wikimedia.org/r/459267 (owner: 10Legoktm)
[02:22:11] (03Merged) 10jenkins-bot: Use mediawiki-phan-seccheck:0.3.0 [integration/config] - 10https://gerrit.wikimedia.org/r/459267 (owner: 10Legoktm)
[02:30:08] (03Restored) 10Legoktm: Enable phan-taint-check on FundraisingEmailUnsubscribe [integration/config] - 10https://gerrit.wikimedia.org/r/458980 (https://phabricator.wikimedia.org/T202386) (owner: 10Brian Wolff)
[02:30:14] (03PS2) 10Legoktm: Enable phan-taint-check on FundraisingEmailUnsubscribe [integration/config] - 10https://gerrit.wikimedia.org/r/458980 (https://phabricator.wikimedia.org/T202386) (owner: 10Brian Wolff)
[02:30:20] (03CR) 10Legoktm: [C: 032] Enable phan-taint-check on FundraisingEmailUnsubscribe [integration/config] - 10https://gerrit.wikimedia.org/r/458980 (https://phabricator.wikimedia.org/T202386) (owner: 10Brian Wolff)
[02:31:43] (03Merged) 10jenkins-bot: Enable phan-taint-check on FundraisingEmailUnsubscribe [integration/config] - 10https://gerrit.wikimedia.org/r/458980 (https://phabricator.wikimedia.org/T202386) (owner: 10Brian Wolff)
[02:32:26] !log deploying https://gerrit.wikimedia.org/r/458980
[02:32:29] Logged the message at https://wikitech.wikimedia.org/wiki/Release_Engineering/SAL
[02:39:01] 10Continuous-Integration-Config, 10Wikimedia-General-or-Unknown, 10phan-taint-check-plugin, 10MW-1.32-release-notes (WMF-deploy-2018-08-21 (1.32.0-wmf.18)), 10Patch-For-Review: Enable phan-taint-check-plugin on all Wikimedia-deployed repositories where it is curr... - https://phabricator.wikimedia.org/T201219
[03:14:31] legoktm: could use some pointers/motivation for setting up a job similar to patch-coverage. I found the Jenkins job config, but could you maybe give me a top-level view of the different components involved?
[03:22:28] The interaction between the Dockerfile/shell script and the composer package itself isn't super clear to me. The rest I mostly got, I think.
[03:36:36] so it got pretty messy after the docker/quibble migration
[03:37:23] but basically the Dockerfile prepares the environment (installs packages, lays out directory structure, etc.). the shell script clones the repos, runs composer install, and then invokes phpunit-patch-coverage. patch-coverage looks at the git diff, runs its stuff, git checkout HEAD~1, then runs more stuff, diffs the two, and returns
[03:37:37] Krinkle: is this job going to require MW to be set up?
[03:38:15] Yeah, initially just core, but ideally later with extensions as well (as in, extensions for the commit submitted, core as core only, extension with its deps)
[03:38:35] so then it'll have to be built into the quibble stuff
[03:39:00] I was sort of expecting your job to "call" the quibble image with some of its commands and then do its own thing.
[03:39:20] similar to how some of the non-quibble jobs invoke like a dozen different docker images/commands.
[03:39:26] but that's not the case.
[03:39:38] docker_run_options: '--volume /srv/git:/srv/git:ro'
[03:39:38] run_args: '--packages-source {packages-source} --db {database} --commands "phpunit-patch-coverage check --command \"php7.0 -d zend_extension=xdebug.so tests/phpunit/phpunit.php\" --html /log/coverage.html"'
[03:40:01] maybe that's more difficult, but nicer, anyway, not criticising :) - Just trying to see if I missed something.
[03:40:06] so it looks like quibble shells out to phpunit-patch-coverage
[03:40:27] hm.. so who reads '--commands' ?
[03:40:47] it's passed to the main docker invocation from the Jenkins job, and then...
[03:40:53] https://gerrit.wikimedia.org/r/plugins/gitiles/integration/quibble/+/master/quibble/cmd.py#143
[03:41:01] ah, quibble itself.
[03:41:02] okay
[03:41:21] Hm.. "instead of the stages"
[03:41:24] that's interesting.
[03:41:32] I assume there are still some stages it does (like installing)
[03:41:35] it's mostly a hack so we can re-use the quibble setup logic
[03:41:38] or do you have to do that then?
[03:41:42] Yeah
[03:41:47] > stages = ['phpunit', 'npm-test', 'composer-test', 'qunit', 'selenium']
[03:41:59] quibble does install regardless
[03:42:39] Right
[03:42:47] Oh, I see there's shutdown/teardown logic as well
[03:43:11] so you couldn't just have the patch coverage job do "docker ... quibble -- stages=None" and then the coverage tool as second command after that
[03:43:41] quibble tears down the databases once it quits, so that wouldn't work
[03:46:43] so I'd probably design your thing to expect MW setup, run whatever it wants, git checkout HEAD~1, then run it a second time, and then diff the two outputs
[03:47:33] Yeah, definitely.
[03:47:44] OK. just looking for the checkout part still.
[03:48:25] was thinking it might be in mwext-phpunit-coverage-patch.sh
[03:48:42] 10Continuous-Integration-Infrastructure (shipyard): Rebuild quibble images for Firefox 60 - https://phabricator.wikimedia.org/T203902 (10Legoktm) p:05Triage>03High
[03:48:56] the git checkout HEAD~1? that's done in phpunit-patch-coverage
[03:49:22] https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/tools/phpunit-patch-coverage/+/master/src/CheckCommand.php#207
[03:51:43] Oh, I see.
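The before/after flow legoktm suggests above amounts to roughly the following sketch. It is a hypothetical outline only, not the actual mwext-phpunit-coverage-patch.sh or phpunit-patch-coverage code: the measure-patch command, the /workspace/src checkout and the /log paths are placeholders.

```bash
#!/bin/bash
# Rough sketch of the "run, git checkout HEAD~1, run again, diff" pattern.
# Assumes Quibble has already cloned MediaWiki into /workspace/src and set it
# up; measure-patch stands in for whatever the job actually measures.
set -eu
cd /workspace/src

measure-patch > /log/after.txt        # run against the patch under test (HEAD)

git checkout HEAD~1                   # step back to the parent commit
measure-patch > /log/before.txt       # run the exact same thing again

# The difference between the two runs is what the job reports.
diff -u /log/before.txt /log/after.txt || true
```

phpunit-patch-coverage does the checkout internally (see the CheckCommand.php link below) because the set of test files it runs differs between before and after; a job that runs the full, identical workload twice can keep the checkout in its wrapper script instead.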
[03:53:17] sorry, I didn't make that very clear
[03:55:06] (03PS1) 10Krinkle: [WIP] Add performance-patch job [integration/config] - 10https://gerrit.wikimedia.org/r/459268 (https://phabricator.wikimedia.org/T133646)
[03:55:15] legoktm: https://gerrit.wikimedia.org/r/#/c/integration/config/+/459268/1/dockerfiles/quibble-stretch/mediawiki-performance-patch.sh
[03:55:18] I was thinking more like that
[03:55:50] that could also work
[03:56:06] but in that case you're running the same exact thing both times right?
[03:56:40] the same set of tests yeah, against before/after
[03:56:49] the second one also doing comparisons
[03:57:00] for phpunit the test files we run for after and before are different, so that's why I found it easier to have the command take care of it all
[03:57:54] legoktm: Hm.. not sure I follow
[03:58:10] you mean during the first run for 'after', you memorise what's run and run only those on the before, right?
[03:59:03] so when it compares the modified files against the class names existing on disk, that's only done during the initial phase, and then applied to the before, is that right?
[03:59:08] it looks at the git diff to see which files were added/modified (first run/after), then looks at which files were modified/deleted (second run/before)
[03:59:25] Hm.. right
[03:59:44] Well for my case at least I don't intend for the tests run to be partial or influenced by the diff
[03:59:52] It's more holistically looking at the page load process.
[04:00:37] yeah, the main thing I was fighting against was that running every single test under coverage is just way too slow. but in your case you want to run the full thing twice
[04:02:19] Yeah
[04:02:49] legoktm: the /opt directories, and pruning, I assume that's to keep the image small, right?
[04:03:08] yep
[04:15:36] legoktm: hm.. why find and then rmdir, instead of rm -rf directly?
[04:15:51] I vaguely recall something about each step being stored but not sure if that's significant.
[04:16:42] each RUN command is stored separately in a "layer", but if you && everything into one RUN command, it becomes one layer
[04:17:26] I think the find delete deletes all the files, but since it's running as the nobody user, it can't delete the directory itself
[04:17:50] if you rm -rf as root in the next layer, then you still have to download all the files that were deleted in the previous layer
[05:19:39] legoktm: right, that makes sense
[05:19:43] you delete as much as you can first.
[05:19:45] then the dir
[05:19:46] col
[05:19:46] cool
[08:57:25] RECOVERY - Puppet errors on deployment-certcentral-testdns is OK: OK: Less than 1.00% above the threshold [0.0]
[11:31:54] 10Project-Admins, 10Security-Team: Undefined #Security-General and #Security-Other - https://phabricator.wikimedia.org/T109328 (10Tgr) >>! In T109328#4557257, @chasemp wrote: >>>! In T109328#4324821, @Aklapper wrote: >> * I propose to archive the tags #Security-Core, #Security-General, #Security-Other, #Securi...
[12:59:07] 10Project-Admins, 10Security-Team: Undefined #Security-General and #Security-Other - https://phabricator.wikimedia.org/T109328 (10Aklapper) >>! In T109328#4569805, @Tgr wrote: > * tasks within the archived projects were not migrated to #security and many of them have now no security-related tag at all Hmm, ind...
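Spelled out, the layer discussion above looks something like the sketch below: two shell fragments that would each sit inside a Dockerfile RUN instruction. The /opt/example paths and the "unneeded" package are illustrative, not taken from the real integration/config Dockerfiles.

```bash
# Layer 1: build step that runs as the unprivileged "nobody" user.
# Chaining the cleanup with && keeps it inside the same RUN, i.e. the same
# stored layer, so the deleted files are never baked into the image. Per the
# discussion above, find -delete (rather than rm -rf) is used because the
# unprivileged user can remove the files but not the directory itself.
composer install --working-dir=/opt/example \
    && find /opt/example/vendor/unneeded -type f -delete

# Layer 2: a later RUN executed as root. Only empty directories remain, so
# removing them here is essentially free. Deleting the files here instead
# would not shrink anything: they would still be stored in, and downloaded
# with, the earlier layer.
rm -rf /opt/example/vendor/unneeded
```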
[14:55:00] 10Project-Admins: New Extension: ChangeUserPasswords - https://phabricator.wikimedia.org/T202275 (10Aklapper) a:03Mz83ude
[16:30:17] 10Project-Admins, 10Security-Team: Undefined #Security-General and #Security-Other - https://phabricator.wikimedia.org/T109328 (10Tgr) IMO, open tasks should be added to #security and closed ones are not worth worrying about.
[18:12:25] 10MediaWiki-Codesniffer: MediaWiki.Commenting.FunctionAnnotations.UnrecognizedAnnotation reports @@ as error - https://phabricator.wikimedia.org/T203916 (10Umherirrender)
[18:14:29] 10Release-Engineering-Team (Kanban), 10MediaWiki-General-or-Unknown, 10MW-1.32-release-notes (WMF-deploy-2018-09-18 (1.32.0-wmf.22)), 10Security, 10User-zeljkofilipin: `npm audit` for mediawiki/core found 24 vulnerabilities - https://phabricator.wikimedia.org/T194280 (10Krinkle) >>! In T194280#4568328, @...
[19:14:35] 10MediaWiki-Codesniffer: MediaWiki.Commenting.FunctionAnnotations.UnrecognizedAnnotation reports @@ as error - https://phabricator.wikimedia.org/T203916 (10Umherirrender)
[19:30:03] 10MediaWiki-Codesniffer: Allow deprecated @type in MediaWiki.Commenting.FunctionAnnotations.UnrecognizedAnnotation - https://phabricator.wikimedia.org/T203922 (10Umherirrender)
[19:35:55] 10MediaWiki-Codesniffer: Allow deprecated @type in MediaWiki.Commenting.FunctionAnnotations.UnrecognizedAnnotation - https://phabricator.wikimedia.org/T203922 (10Umherirrender)
[19:36:48] 10MediaWiki-Codesniffer: Allow deprecated @type in MediaWiki.Commenting.FunctionAnnotations.UnrecognizedAnnotation - https://phabricator.wikimedia.org/T203922 (10Umherirrender)
[19:44:43] (03PS1) 10Legoktm: seccheck for TranslationNotifications [integration/config] - 10https://gerrit.wikimedia.org/r/459401
[19:45:05] (03CR) 10Legoktm: [C: 032] seccheck for TranslationNotifications [integration/config] - 10https://gerrit.wikimedia.org/r/459401 (owner: 10Legoktm)
[19:47:29] (03Merged) 10jenkins-bot: seccheck for TranslationNotifications [integration/config] - 10https://gerrit.wikimedia.org/r/459401 (owner: 10Legoktm)
[19:47:54] !log deploying https://gerrit.wikimedia.org/r/459401
[19:47:57] Logged the message at https://wikitech.wikimedia.org/wiki/Release_Engineering/SAL
[22:04:41] legoktm: Hm.. I don't suppose we can have Node 7 or Node 8 in CI with Quibble, right?
[22:05:56] I assume we're on Node 6, matching prod?
[22:07:25] I wouldn't normally mind, it's just that puppeteer with chromium is easier/faster to use in Node 8 and above. But I can make it work
[22:07:58] I suppose we would have Node 8 in CI before prod though, I mean, it makes sense to support it at least, the same way we also run tests against PHP 7 and Postgres, for Node services to integrate against newer Node versions.
[22:40:08] (03PS1) 10QChris: Allow “Gerrit Managers” to import history [wikimedia-cz/tracker] (refs/meta/config) - 10https://gerrit.wikimedia.org/r/459406
[22:40:10] (03CR) 10QChris: [V: 032 C: 032] Allow “Gerrit Managers” to import history [wikimedia-cz/tracker] (refs/meta/config) - 10https://gerrit.wikimedia.org/r/459406 (owner: 10QChris)
[22:41:09] (03PS1) 10QChris: Import done. Revoke import grants [wikimedia-cz/tracker] (refs/meta/config) - 10https://gerrit.wikimedia.org/r/459407
[22:41:11] (03CR) 10QChris: [V: 032 C: 032] Import done. Revoke import grants [wikimedia-cz/tracker] (refs/meta/config) - 10https://gerrit.wikimedia.org/r/459407 (owner: 10QChris)
[22:52:11] Krinkle: just node 6. I think prod is planning to directly jump to node 10 - https://phabricator.wikimedia.org/T203239
[22:55:22] legoktm: cool
[22:55:55] I was searching for "Node(.js)? (upgrade|6|7|8)" but couldn't find that one :)
[23:01:26] :)
[23:51:44] legoktm: the basic idea is to have a .fresnel.yml file in MW, and individual extensions can have their own scenarios instead if needed. Although at first I expect the main point will be to measure an extension's impact on the default scenarios, but e.g. Flow might want to load a page in their custom namespace instead and not run the default scenarios (given they wouldn't impact them, unless they do, in which case it can add a similar scenario back). I
[23:51:44] plan to keep it simple and focussed on the individual project, so probably not extending or merging configs.
[23:52:47] exciting!!