[00:00:39] 10Continuous-Integration-Config, 10MediaWiki-extensions-Insider: Insider extension code C+2s not triggering jenkins - https://phabricator.wikimedia.org/T140702#2473555 (10Paladox) I doint think you can write a message and do +2 at the same time. Try doing +2 without writing a message please. [00:02:35] 10Continuous-Integration-Config, 10MediaWiki-extensions-Insider: Insider extension code C+2s not triggering jenkins - https://phabricator.wikimedia.org/T140702#2473611 (10Paladox) @Jdforrester-WMF That change depends on https://gerrit.wikimedia.org/r/#/c/298680/ which is open and will not merge until that patc... [00:03:13] 10Continuous-Integration-Config, 10MediaWiki-extensions-Insider: Insider extension code C+2s not triggering jenkins - https://phabricator.wikimedia.org/T140702#2473612 (10Paladox) 05Open>03Invalid Closing as invalid since https://gerrit.wikimedia.org/r/#/c/298682/ needs to be merged first. [00:24:49] (03PS1) 10MaxSem: Publish Doxygen and jsduck documentation for Kartographer [integration/config] - 10https://gerrit.wikimedia.org/r/299697 (https://phabricator.wikimedia.org/T140657) [01:02:08] PROBLEM - Long lived cherry-picks on puppetmaster on deployment-puppetmaster is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [0.0] [01:18:05] 06Release-Engineering-Team, 06ArchCom, 06Developer-Relations, 10Phabricator, 06Team-Practices: Consider alternative processes for Unbreak Now bugs, especially those which cross-cut components - https://phabricator.wikimedia.org/T140207#2456573 (10Tgr) > Adding a "Are you really sure? UBN means people aro... [02:42:10] Project selenium-CirrusSearch » firefox,beta,Linux,contintLabsSlave && UbuntuTrusty build #89: 04FAILURE in 1 min 8 sec: https://integration.wikimedia.org/ci/job/selenium-CirrusSearch/BROWSER=firefox,MEDIAWIKI_ENVIRONMENT=beta,PLATFORM=Linux,label=contintLabsSlave%20&&%20UbuntuTrusty/89/ [04:05:02] Project selenium-MultimediaViewer » safari,beta,OS X 10.9,contintLabsSlave && UbuntuTrusty build #78: 04FAILURE in 9 min 0 sec: https://integration.wikimedia.org/ci/job/selenium-MultimediaViewer/BROWSER=safari,MEDIAWIKI_ENVIRONMENT=beta,PLATFORM=OS%20X%2010.9,label=contintLabsSlave%20&&%20UbuntuTrusty/78/ [04:18:39] Yippee, build fixed! [04:18:39] Project selenium-MultimediaViewer » firefox,beta,Linux,contintLabsSlave && UbuntuTrusty build #78: 09FIXED in 22 min: https://integration.wikimedia.org/ci/job/selenium-MultimediaViewer/BROWSER=firefox,MEDIAWIKI_ENVIRONMENT=beta,PLATFORM=Linux,label=contintLabsSlave%20&&%20UbuntuTrusty/78/ [05:49:36] Project mediawiki-core-code-coverage build #2144: 04STILL FAILING in 2 hr 49 min: https://integration.wikimedia.org/ci/job/mediawiki-core-code-coverage/2144/ [06:00:34] 06Release-Engineering-Team, 06Developer-Relations (Jul-Sep-2016), 15User-greg: Write blog post highlighting recent Phabricator improvements - https://phabricator.wikimedia.org/T137727#2473968 (10mmodell) @greg: I'm sure your rewrite is an improvement... I was having trouble expressing that part clearly, it s... 
[06:34:30] PROBLEM - puppet last run on gallium is CRITICAL: CRITICAL: Puppet has 47 failures [06:41:25] RECOVERY - Free space - all mounts on deployment-jobrunner01 is OK: OK: All targets OK [06:58:16] PROBLEM - Puppet run on phab-beta is CRITICAL: CRITICAL: 22.22% of data above the critical threshold [0.0] [06:58:19] RECOVERY - puppet last run on gallium is OK: OK: Puppet is currently enabled, last run 1 minute ago with 0 failures [07:49:55] PROBLEM - Puppet staleness on deployment-fluorine is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [43200.0] [08:13:24] (03PS3) 10Zfilipin: Deleted browsertests-Wikidata* Jenkins jobs [integration/config] - 10https://gerrit.wikimedia.org/r/298448 (https://phabricator.wikimedia.org/T128190) [08:13:33] (03CR) 10Zfilipin: [C: 032] Deleted browsertests-Wikidata* Jenkins jobs [integration/config] - 10https://gerrit.wikimedia.org/r/298448 (https://phabricator.wikimedia.org/T128190) (owner: 10Zfilipin) [08:15:11] (03Merged) 10jenkins-bot: Deleted browsertests-Wikidata* Jenkins jobs [integration/config] - 10https://gerrit.wikimedia.org/r/298448 (https://phabricator.wikimedia.org/T128190) (owner: 10Zfilipin) [08:53:04] morning hashar ! :D [08:56:03] addshore: hello [08:56:14] addshore: I am busy doing accounting / paper work this morning = [08:56:33] oooooh, that sounds fun! :D Don't worry I have nothing for you, it was just a random morning! :D [09:00:00] addshore: that is very kind ! [09:49:04] 10Deployment-Systems, 10scap, 10Analytics, 10Analytics-Cluster, and 3 others: Deploy analytics-refinery with scap3 - https://phabricator.wikimedia.org/T129151#2474657 (10elukey) Summary: 1) External scap repository config: https://gerrit.wikimedia.org/r/299714 2) puppet changes: https://gerrit.wikimedia.o... [10:29:23] 10Beta-Cluster-Infrastructure, 10DBA, 10Wikimedia-Logstash, 07WorkType-NewFunctionality: Create a logstash input filter to preprocess mysqld syslog messages - https://phabricator.wikimedia.org/T140751#2474720 (10hashar) [10:29:32] 10Beta-Cluster-Infrastructure, 10DBA, 13Patch-For-Review, 07WorkType-NewFunctionality: Send deployment-db1 deployment-db2 syslog to beta cluster logstash - https://phabricator.wikimedia.org/T119370#1824659 (10hashar) Kibana has been upgraded on the beta cluster logstash. I have remade the `mysql` dashboard... [10:37:28] 10Beta-Cluster-Infrastructure, 10DBA, 10Wikimedia-Logstash, 07WorkType-NewFunctionality: Create a logstash input filter to preprocess mysqld syslog messages - https://phabricator.wikimedia.org/T140751#2474745 (10jcrespo) This is a known issue, and I cannot find one now, but there are numerous examples foun... [10:39:32] hashar im wondering if multi connections would work as in for example if jenkins tests on gerrit would it be able to publish the results on gerrit and gerrit-new [10:40:36] 10Beta-Cluster-Infrastructure, 10DBA, 10Wikimedia-Logstash, 07WorkType-NewFunctionality: Create a logstash input filter to preprocess mysqld syslog messages - https://phabricator.wikimedia.org/T140751#2474751 (10hashar) [10:42:29] 10Beta-Cluster-Infrastructure, 10DBA, 13Patch-For-Review, 07WorkType-NewFunctionality: Send deployment-db1 deployment-db2 syslog to beta cluster logstash - https://phabricator.wikimedia.org/T119370#2474754 (10jcrespo) Found some examples: * https://www.digitalocean.com/community/tutorials/how-to-use-logst... 
[10:42:56] 10Beta-Cluster-Infrastructure, 10DBA, 10Wikimedia-Logstash, 07WorkType-NewFunctionality: Create a logstash input filter to preprocess mysqld syslog messages - https://phabricator.wikimedia.org/T140751#2474755 (10jcrespo) Found some examples: * https://www.digitalocean.com/community/tutorials/how-to-use-lo... [10:43:11] paladox: hello. As I said, I have never tried the multiple connection thing [10:43:26] Ok and hi [10:43:31] paladox: but maybe you can simulate it by having two connections to the same Gerrit with different names and usernames [10:43:31] I'm testing it currently [10:43:36] [connection gerrit1] [10:43:42] username = jenkins01 [10:43:45] Oh [10:43:46] [connection gerrit2] [10:43:50] username = jenkins02 [10:44:11] Seems like source defines who can do it [10:44:22] source is where events come from [10:44:36] that is really "gerrit stream-events" [10:44:43] but yeah, it looks like it needs a different user to be able to submit it on the other change. [10:44:46] but I got it working [10:44:53] on two different pipelines [10:44:53] then there are reporters [10:44:55] 10Beta-Cluster-Infrastructure, 10DBA, 10Wikimedia-Logstash, 07WorkType-NewFunctionality: Create a logstash input filter to preprocess mysqld syslog messages - https://phabricator.wikimedia.org/T140751#2474759 (10jcrespo) Be careful because on start/stop, you will find some multi-line "errors" (aside from a... [10:44:58] https://gerrit-zuul.wmflabs.org/ [10:45:03] aka gerrit review --code-review 2 .... [10:45:20] Oh [10:50:56] hashar I'm wondering whether there will be any more updates to zuul, I can test them with gerrit 2.12 for jessie. [10:51:09] please [10:54:47] paladox: I have to rebuild the package for Jessie [10:54:55] hashar oh. [10:55:01] I've already done that [10:55:10] but best you do it since I may have missed a few things [10:55:17] :) [10:55:27] I am off for lunch :D [10:55:33] Ok [10:55:35] :) [10:55:46] It will be lunch here soon, 11:55am [11:22:08] RECOVERY - Long lived cherry-picks on puppetmaster on deployment-puppetmaster is OK: OK: Less than 100.00% above the threshold [0.0] [12:32:50] 10Deployment-Systems, 10scap, 10Analytics, 10Analytics-Cluster, and 3 others: Deploy analytics-refinery with scap3 - https://phabricator.wikimedia.org/T129151#2474875 (10elukey) Deleted the keys from the keyholder private repo, I need to pass protect them properly as stated in https://wikitech.wikimedia.or... [12:43:50] 06Release-Engineering-Team, 06Operations, 10Traffic, 13Patch-For-Review, 05Security: Make sure we're not relying on HTTP_PROXY headers - https://phabricator.wikimedia.org/T140658#2474897 (10elukey) p:05Triage>03High [12:45:15] 10Deployment-Systems, 03Scap3, 06Operations: Warning: rename(): Permission denied in /srv/mediawiki/wmf-config/CommonSettings.php on line 189 - https://phabricator.wikimedia.org/T136258#2474899 (10elukey) p:05Triage>03Normal [12:46:46] 10Deployment-Systems, 03Scap3, 06Operations: Warning: rename(): Permission denied in /srv/mediawiki/wmf-config/CommonSettings.php on line 189 - https://phabricator.wikimedia.org/T136258#2328753 (10elukey) @Dereckson would you mind to double check that everything is working correctly at the moment running l1... [13:37:45] 10Deployment-Systems, 03Scap3, 06Operations: Warning: rename(): Permission denied in /srv/mediawiki/wmf-config/CommonSettings.php on line 189 - https://phabricator.wikimedia.org/T136258#2475046 (10Dereckson) Test run added to this evening SWAT.
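To make the two-connection idea sketched above a bit more concrete, here is a minimal, untested sketch of what the relevant zuul.conf stanzas and a layout.yaml pipeline could look like. It assumes a Zuul version that supports named Gerrit connections; the hostnames, usernames, key path, pipeline name and label values are illustrative, not the actual Wikimedia configuration.

    # zuul.conf (sketch): two named connections, e.g. the same Gerrit with two
    # bot accounts, or "gerrit" and "gerrit-new" as in the test setup above
    [connection gerrit1]
    driver=gerrit
    server=gerrit.example.org
    user=jenkins01
    sshkey=/var/lib/zuul/ssh/id_rsa

    [connection gerrit2]
    driver=gerrit
    server=gerrit-new.example.org
    user=jenkins02
    sshkey=/var/lib/zuul/ssh/id_rsa

    # layout.yaml (sketch): a pipeline's source is the connection whose
    # "gerrit stream-events" feed triggers it, and each reporter entry is
    # effectively a "gerrit review --code-review ..." sent over that connection
    pipelines:
      - name: test
        source: gerrit1
        trigger:
          gerrit1:
            - event: patchset-created
        success:
          gerrit1:
            verified: 1
          gerrit2:
            verified: 1

As described above, the source defines where events come from, while results can be reported back over more than one connection, which is why each pipeline has to stick to a single source even when it reports to both Gerrits.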
[14:24:46] yup [14:31:23] 06Release-Engineering-Team (Deployment-Blockers), 05Release: MW-1.28.0-wmf.11 deployment blockers - https://phabricator.wikimedia.org/T139212#2475334 (10matmarex) [14:33:33] 06Release-Engineering-Team, 10TimedMediaHandler-Player, 13Patch-For-Review: All audio players have 120px width, regardless of provided options - https://phabricator.wikimedia.org/T140763#2475348 (10matmarex) Dear #releng, one of the patches provided above should be merged before today's branch cut (T139212).... [14:33:59] 06Release-Engineering-Team, 10TimedMediaHandler, 10TimedMediaHandler-Player, 13Patch-For-Review: All audio players have 120px width, regardless of provided options - https://phabricator.wikimedia.org/T140763#2475353 (10matmarex) p:05Triage>03High [14:34:21] 06Release-Engineering-Team, 10TimedMediaHandler, 10TimedMediaHandler-Player, 13Patch-For-Review: All audio players have 120px width, regardless of provided options - https://phabricator.wikimedia.org/T140763#2475313 (10matmarex) [14:37:15] 06Release-Engineering-Team, 10TimedMediaHandler, 10TimedMediaHandler-Player, 13Patch-For-Review: All audio players have 120px width, regardless of provided options - https://phabricator.wikimedia.org/T140763#2475369 (10matmarex) Wikitext to reproduce the problem: ``` [[File:Example.ogg|500px|thumb]] [[Fil... [14:50:37] Does anyone here have any preferences about what to do with deployment-upload now? [14:50:41] where's hashar.. [15:26:08] 10Deployment-Systems, 03Scap3, 06Operations: Warning: rename(): Permission denied in /srv/mediawiki/wmf-config/CommonSettings.php on line 189 - https://phabricator.wikimedia.org/T136258#2475715 (10bd808) >>! In T136258#2475046, @Dereckson wrote: > Test run added to this evening SWAT. You should be able to j... [15:45:50] 06Release-Engineering-Team, 10MediaWiki-General-or-Unknown, 06Operations, 10Traffic, and 2 others: Make sure we're not relying on HTTP_PROXY headers - https://phabricator.wikimedia.org/T140658#2475770 (10demon) [15:46:53] Project selenium-MobileFrontend » firefox,beta,Linux,contintLabsSlave && UbuntuTrusty build #84: 04FAILURE in 24 min: https://integration.wikimedia.org/ci/job/selenium-MobileFrontend/BROWSER=firefox,MEDIAWIKI_ENVIRONMENT=beta,PLATFORM=Linux,label=contintLabsSlave%20&&%20UbuntuTrusty/84/ [15:51:27] (03PS1) 10Jforrester: MediaViewer: Drop old jsduck/jshint templates, add composer [integration/config] - 10https://gerrit.wikimedia.org/r/299771 (https://phabricator.wikimedia.org/T140652) [15:55:11] (03CR) 10EBernhardson: [C: 032] Single Line comments no multiple '*'. [tools/codesniffer] - 10https://gerrit.wikimedia.org/r/295895 (owner: 10Lethexie) [16:07:30] 06Release-Engineering-Team, 10MediaWiki-General-or-Unknown, 06Operations, 10Traffic, and 2 others: Make sure we're not relying on HTTP_PROXY headers - https://phabricator.wikimedia.org/T140658#2475862 (10BBlack) [16:27:06] (03CR) 10Jdlrobson: [C: 031] "npm can handle these" [integration/config] - 10https://gerrit.wikimedia.org/r/299771 (https://phabricator.wikimedia.org/T140652) (owner: 10Jforrester) [17:07:51] 06Release-Engineering-Team, 10MediaWiki-General-or-Unknown, 06Operations, 10Traffic, and 4 others: Make sure we're not relying on HTTP_PROXY headers - https://phabricator.wikimedia.org/T140658#2471564 (10cscott) This patch might have broken OCG; see T140789. 
[17:25:47] 10scap, 13Patch-For-Review, 03Scap3 (Scap3-MediaWiki-MVP), 07WorkType-NewFunctionality: Basic scap{2,3} canary deployment process & checks - https://phabricator.wikimedia.org/T110068#2476212 (10thcipriani) So it seems like since the update to the new kibana, we're no longer exposing the `_search` api in th... [17:30:17] thcipriani: hey, I have a question regarding https://phabricator.wikimedia.org/T138234#2429601 I checked scap3 help and I couldn't find a way to deploy into one group let's say I want to build a canary node for ores like mw1017 so I deploy to it first and then if everything was alright, I do another deployment but to all nodes [17:30:31] I couldn't find a way to do this in scap [17:30:48] something like scap deploy --group=canary [17:31:44] so the server_groups setting in the config can take as many groups as you care to throw in there. It will also pause between all groups, so you can cancel at any point. [17:32:06] but, no, there isn't currently a way to deploy only to a single group IIRC [17:32:23] but scap3 will go through server_groups in order [17:33:02] so if you add a canary group in with only one server name in the dsh file it will deploy there first, wait for you to continue, then go on to the next server group [17:33:36] oh, I see the message "Deployed finished and was successful continue?[y]" [17:33:48] or something like that :D [17:34:03] actually, now that I'm thinking about it, I wonder if: scap deploy -D'server_groups:canary' would probably work to only deploy to a single group [17:34:13] yup :) [17:34:52] who ever came up with the flexibility of -D for scap should get a medal or something ;) [17:34:52] okay thanks [17:35:06] I will try it and let you know [17:35:19] * thcipriani gives bd808 a medal or something [17:35:21] :P [17:36:12] funny thing about how that happened, I just needed to be able to test settings in my dev environment. Wasn't even thinking about prod usage [17:36:28] 06Release-Engineering-Team, 10TimedMediaHandler, 10TimedMediaHandler-Player, 13Patch-For-Review, 05WMF-deploy-2016-07-19_(1.28.0-wmf.11): All audio players have 120px width, regardless of provided options - https://phabricator.wikimedia.org/T140763#2475313 (10greg) >>! In T140763#2475691, @gerritbot wrot... [17:36:52] bd808: since you're around, can I ask you about querying elasticsearch dirctly? The kibana upgrade seems to have locked down my access to https://logstash.wikimedia.org/logstash-*/_search [17:36:59] grey, orange or yellow medal? (more tokens) [17:37:24] bd808: context: https://phabricator.wikimedia.org/T110068#2476212 [17:38:15] blerg. kibana4 stops it huh? [17:39:21] yeah, I just get a 404 for the _search query that was working [17:39:32] thcipriani: the easiest thing would probably be to open up port 9200 inside the cluster [17:39:41] from the github issues comment, seems like that's expected and we're not encouraged to use it. [17:39:43] that's just blocked with ferm rules [17:40:03] and the main reason we locked it down was that we didn't have an active usecase to open it up [17:40:18] sure, makes sense. [17:40:43] I was hoping this wasn't the first use-case :P [17:40:48] I think I commented on some related ticket somewhere that allowing 9200 from deploy and logging hosts would be a reasonable idea (eg tin, fluorine) [17:41:44] would that check need to run from all hosts? [17:41:56] nah, just from the deployment hosts as part of scap [17:42:25] so opening that port seems like a reasonable solution [17:42:42] "facets" are gone now too in ES 2.x. 
There is some similar but slightly different query to get aggregate data [17:43:03] https://www.elastic.co/guide/en/elasticsearch/reference/current/search-aggregations.html [17:43:11] yeah, I noticed that going through the api docs. I'm fiddling with the query on beta right now. [17:43:52] I'll get the query fixed up and make a separate patch to put the script in place and open the ports to the deployment hosts. [17:44:08] we'll see how that goes :) [17:44:25] https://github.com/wikimedia/operations-puppet/blob/production/manifests/role/logstash.pp#L44-L48 [17:44:42] there is a hole for krypton already so should be sort of simple to extend [17:44:47] 10scap, 13Patch-For-Review, 03Scap3 (Scap3-MediaWiki-MVP), 07WorkType-NewFunctionality: Basic scap{2,3} canary deployment process & checks - https://phabricator.wikimedia.org/T110068#2476330 (10EBernhardson) The main change here was that the old version of kibana was a webapp that talked directly to elasti... [17:45:07] * thcipriani nods [17:45:54] path seems mostly clear. heh, looks like ebernhardson just suggested the same thing :) [17:46:01] thanks both [17:46:16] great minds ... [18:03:03] hashar: When you're back, would love to get https://gerrit.wikimedia.org/r/#/c/299771/ merged. [18:09:53] 06Release-Engineering-Team, 06Developer-Relations (Jul-Sep-2016), 15User-greg: Write blog post highlighting recent Phabricator improvements - https://phabricator.wikimedia.org/T137727#2476444 (10greg) >>! In T137727#2473968, @mmodell wrote: > @greg: I'm sure your rewrite is an improvement... I was having tro... [18:10:41] 10Beta-Cluster-Infrastructure, 06Release-Engineering-Team, 10DBA, 07WorkType-Maintenance: Upgrade mariadb in deployment-prep from Precise/MariaDB 5.5 to Jessie/MariaDB 5.10 - https://phabricator.wikimedia.org/T138778#2476445 (10greg) From today's Technology Management meeting: > Cross-DC alternative to MA... [18:14:05] 06Release-Engineering-Team, 10MediaWiki-General-or-Unknown, 06Operations, 10Traffic, and 4 others: Make sure we're not relying on HTTP_PROXY headers - https://phabricator.wikimedia.org/T140658#2476469 (10demon) >>! In T140658#2476106, @cscott wrote: > This patch might have broken OCG; see T140789. Yep, th... [18:18:18] zeljkof: I've heard that you have problems connecting via SSH to cirrus-browser-bot.eqiad.wmflabs. Can I help? [18:20:24] PROBLEM - Host deployment-upload is DOWN: CRITICAL - Host Unreachable (10.68.16.189) [18:21:18] Project mediawiki-core-code-coverage build #2145: 04STILL FAILING in 3 hr 21 min: https://integration.wikimedia.org/ci/job/mediawiki-core-code-coverage/2145/ [18:26:08] gehel: sorry, it's beer time here (8pm) :) [18:26:40] zeljkof: same time here... Ping me tomorrow (or later today) if you need me... [18:26:42] (on the phone) [18:27:00] Tomorrow then, and thanks! [18:29:28] thcipriani: https://gerrit.wikimedia.org/r/#/c/299814/ for when you have some time to check :) Thanks [18:30:31] 10Beta-Cluster-Infrastructure, 06Release-Engineering-Team, 10DBA, 07WorkType-Maintenance: Upgrade mariadb in deployment-prep from Precise/MariaDB 5.5 to Jessie/MariaDB 5.10 - https://phabricator.wikimedia.org/T138778#2476557 (10jcrespo) Upgrading a database (specially if it is not production) takes no more... 
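Picking up the scap3 canary exchange above, a minimal sketch of the repository configuration being described might look like the following; the group names and target files are illustrative, not the actual ORES setup.

    # scap/scap.cfg (sketch): groups deploy in the listed order and scap
    # pauses for confirmation between them
    [global]
    server_groups: canary, default
    # file listing only the canary host
    canary_dsh_targets: targets-canary
    # file listing the remaining hosts
    dsh_targets: targets

With that in place a normal `scap deploy` hits the canary group first and waits before moving on, and the override mentioned above, `scap deploy -D'server_groups:canary' 'canary only'`, limits a run to just that group.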
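On the logstash side of that same canary-check discussion, the old facets-based query against /logstash-*/_search has no direct equivalent in Elasticsearch 2.x, but an aggregation can report the same error counts. The sketch below shows the general shape of such a check; the logstash host, index pattern and field names are assumptions rather than the real production values, and it presumes port 9200 has been opened from the deployment host as discussed above.

    # Hypothetical canary error-count check using ES 2.x aggregations
    # (facets were removed in 2.x; "aggs" is the replacement).
    import requests

    ES_URL = 'http://logstash1001.eqiad.wmnet:9200/logstash-*/_search'

    query = {
        'size': 0,
        'query': {
            'bool': {
                'must': [
                    # only events from the canary node, at error level,
                    # within the deployment window
                    {'term': {'host': 'mw1017'}},
                    {'terms': {'level': ['ERROR', 'FATAL']}},
                    {'range': {'@timestamp': {'gte': 'now-10m'}}},
                ]
            }
        },
        'aggs': {
            'by_minute': {
                'date_histogram': {'field': '@timestamp', 'interval': '1m'}
            }
        },
    }

    response = requests.post(ES_URL, json=query).json()
    print('error events on the canary in the last 10 minutes: %d'
          % response['hits']['total'])

A check like this could fail the deployment when the count rises sharply after the canary group is synced, which is the behaviour the follow-up patch thcipriani mentions is aiming for.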
[18:32:17] Amir1: lgtm, noteworthy that you can also run other checks on specific groups, so you can run more thorough post-service-restart checks on the canary nodes if you want [18:33:08] thcipriani: yeah I thought of it, I will do it soon but mostly I think people should do manual tests too since in each deployment they might introduce a new thing that has its own way of testing [18:33:48] yup, sounds like a good plan [18:56:16] 10Deployment-Systems, 10scap, 10Analytics, 10Analytics-Cluster, and 3 others: Deploy analytics-refinery with scap3 - https://phabricator.wikimedia.org/T129151#2476622 (10thcipriani) >>! In T129151#2474657, @elukey wrote: > Summary: > > 1) External scap repository config: https://gerrit.wikimedia.org/r/299... [18:59:58] 06Release-Engineering-Team (Deployment-Blockers), 05Release: MW-1.28.0-wmf.11 deployment blockers - https://phabricator.wikimedia.org/T139212#2476645 (10matmarex) [19:09:24] 06Release-Engineering-Team, 06Developer-Relations (Jul-Sep-2016), 15User-greg: Write blog post highlighting recent Phabricator improvements - https://phabricator.wikimedia.org/T137727#2476710 (10mmodell) I wasn't planning to go for anything as high-profile as the wikimedia blog, especially if it's gonna take... [19:12:17] 06Release-Engineering-Team, 06Developer-Relations (Jul-Sep-2016), 15User-greg: Write blog post highlighting recent Phabricator improvements - https://phabricator.wikimedia.org/T137727#2376793 (10Dzahn) Hey, do Phabricator blog posts have RSS feeds? We could add that to planet.wikimedia.org. Gets you a bit mo... [19:13:17] (03CR) 10Chad: [C: 032] "Updated: https://wikitech.wikimedia.org/wiki/Heterogeneous_deployment/Train_deploys#Create_the_new_branch_in_gerrit" [tools/release] - 10https://gerrit.wikimedia.org/r/298784 (owner: 10Chad) [19:14:04] (03Merged) 10jenkins-bot: make-wmf-branch: Don't use SSH, it's lame [tools/release] - 10https://gerrit.wikimedia.org/r/298784 (owner: 10Chad) [19:28:39] 10Continuous-Integration-Config, 10MediaWiki-extensions-MultimediaViewer, 06Reading-Web-Backlog, 13Patch-For-Review: Drop old "jshint" job from MediaViewer extension test/etc. pipelines - https://phabricator.wikimedia.org/T140652#2471308 (10hashar) jshint is some 2+ years old version that is snapshotted on... [20:00:09] 06Release-Engineering-Team, 06Developer-Relations (Jul-Sep-2016), 15User-greg: Write blog post highlighting recent Phabricator improvements - https://phabricator.wikimedia.org/T137727#2477086 (10Legoktm) Why not put it on the Wikimedia blog? It's a pretty cool thing that should be shared more widely IMO :) [20:07:35] (03CR) 10Krinkle: "fixme: Rendering of https://integration.wikimedia.org/cover used to work, but now displays an Apache index. It works with trailing slash, " [integration/docroot] - 10https://gerrit.wikimedia.org/r/298449 (https://phabricator.wikimedia.org/T139620) (owner: 10Hashar) [20:07:54] (03CR) 10Hashar: [C: 032] MediaViewer: Drop old jsduck/jshint templates, add composer [integration/config] - 10https://gerrit.wikimedia.org/r/299771 (https://phabricator.wikimedia.org/T140652) (owner: 10Jforrester) [20:09:03] (03Merged) 10jenkins-bot: MediaViewer: Drop old jsduck/jshint templates, add composer [integration/config] - 10https://gerrit.wikimedia.org/r/299771 (https://phabricator.wikimedia.org/T140652) (owner: 10Jforrester) [20:10:50] 06Release-Engineering-Team, 06Developer-Relations (Jul-Sep-2016), 15User-greg: Write blog post highlighting recent Phabricator improvements - https://phabricator.wikimedia.org/T137727#2477130 (10greg) >>! 
In T137727#2476716, @Dzahn wrote: > Hey, do Phabricator blog posts have RSS feeds? We could add that to... [20:23:34] James_F: done_ no more jsduck/jshint on MediaViewer [20:24:00] hashar: Thank you. :-) [20:24:17] James_F: jsduck would have to be run via rake-jessie though [20:24:22] James_F: it is a ruby gem :( [20:27:43] hashar: Eurgh. Can't we just shell out with an exec() in grunt? :-) [20:29:35] well [20:29:36] no [20:29:37] :D [20:29:56] James_F: we have an outdated jsduck version on Trusty, and I dont think there is any on Jessie :( [20:30:08] * James_F nods. [20:30:12] but yeah eventually we will have a single job doing all the package managers installs [20:30:20] and triggers the 'test' commands in parallel [20:30:21] "CI-dostuff". [20:30:43] Jan and I talked about that months ago [20:30:48] more like a basic Makefile [20:30:52] 06Release-Engineering-Team, 15User-greg: Send vendor feedback forms to HR - https://phabricator.wikimedia.org/T140662#2471745 (10greg) 05Open>03Resolved [20:31:33] James_F: or one could replace the ruby based / dated jsduck with something else that is pure JavaScript [20:32:12] hashar: Re-writing jsduck in node just to make it nicer seems a bit over-kill. [20:34:23] * paladox to many adverts on tv. [20:37:33] hashar: Does anyone do jsduck via a Rakefile in Wikimedia CI yet? Everything I can see has `npm run doc` just execute `jsduck`. [20:38:05] James_F: I dont even know what jsduck is for :( [20:38:12] yeah let me check [20:38:13] hashar: JavaScript documentation. [20:38:30] I think we got a job template that runs on Jessie [20:38:34] and install jsduck from rubygems.org [20:38:41] Ah. [20:38:44] Oh yes [20:38:52] Remember hashar you created that [20:39:00] 10Continuous-Integration-Config, 10MediaWiki-extensions-MultimediaViewer, 06Reading-Web-Backlog, 13Patch-For-Review, and 2 others: Drop old "jshint" job from MediaViewer extension test/etc. pipelines - https://phabricator.wikimedia.org/T140652#2477341 (10Jdlrobson) [20:39:06] Since we had to workaround jsduck not being on Jessie [20:39:11] We had to use gems [20:39:18] '{name}-jsduck-jessie' job template [20:39:24] Kk. [20:39:25] Yep [20:39:53] James_F: the issue is that packaging jsduck is not trivial, it relies on a bunch of other gems and other libraries so eventually ops / us gave up [20:39:57] and moved to just rubygems :D [20:39:59] Yeah. [20:40:19] ideally we would want a rubygems wikimedia-nitpicker [20:40:30] that would have all the proper gems as dependencies and default / sane rake tasks [20:40:56] same for npm I guess. a wikimedia/nitpicker that comes with proper grunt tasks and the common set of npm packages we use for linting [20:40:58] eg banana [20:41:05] * James_F nods. [20:41:57] lets mail Product to create some shinny feature so that Fundraising team has argument as why Wikipedia & sister projects are worth giving to [20:42:13] so we can get more folks to improve the code! [20:42:28] or maybe I should fill that as tasks [20:42:36] and folks might just do them [20:42:58] I think we have enough shiny. [20:43:25] 10Continuous-Integration-Config, 10MediaWiki-extensions-MultimediaViewer, 06Reading-Web-Backlog, 13Patch-For-Review, and 2 others: Drop old "jshint" job from MediaViewer extension test/etc. pipelines - https://phabricator.wikimedia.org/T140652#2477366 (10Jdforrester-WMF) 05Open>03Resolved a:03Jdforres... [20:56:36] Does anyone here have any preferences about what to do with deployment-upload now? [20:56:37] hashar? [20:56:57] If we don't need it, kill it? 
[20:58:59] ostriches: me ??? [20:59:05] oh no [20:59:08] deployment-upload [20:59:27] Krenair: if Captcha / Maths are working fine [20:59:47] Krenair: and after a week of beta cluster being run on Swift without much complaints ... Yeah delete the instance [20:59:56] really if something is broken, that should be a corner case / side issue [20:59:57] hashar, someone else should test that [21:00:09] and we can figure out how to fix the broken part so it runs on swift [21:01:26] Krenair: I would assume Math/Mathoid to already use swift [21:01:36] and for captcha, I honestly have no idea how to have it point to Swift. [21:01:50] one sure thing, captchas can be regenerated [21:03:09] they were already pointed to swift [21:07:17] 10Deployment-Systems, 06Release-Engineering-Team (Long-Lived-Branches): thoroughly document the new branch cutting plan / strategy - https://phabricator.wikimedia.org/T136015#2477445 (10mmodell) [21:07:20] 10Deployment-Systems, 06Release-Engineering-Team (Long-Lived-Branches), 03releng-201617-q1, 07Epic: Merge to deployed branches instead of cutting a new deployment branch every week. - https://phabricator.wikimedia.org/T89945#2477446 (10mmodell) [21:07:22] (03CR) 10Hashar: [C: 032 V: 032] Merge commit '8c250cf' into debian/jessie-wikimedia [integration/zuul] (debian/jessie-wikimedia) - 10https://gerrit.wikimedia.org/r/293269 (https://phabricator.wikimedia.org/T137279) (owner: 10Hashar) [21:07:24] 06Release-Engineering-Team, 06Operations, 10Continuous-Integration-Infrastructure (phase-out-gallium): Port Zuul package 2.1.0-95-g66c8e52 from Precise to Jessie - https://phabricator.wikimedia.org/T137279#2477447 (10hashar) 05Open>03Resolved [21:07:36] 06Release-Engineering-Team, 06Operations, 10Continuous-Integration-Infrastructure (phase-out-gallium): Port Zuul package 2.1.0-95-g66c8e52 from Precise to Jessie - https://phabricator.wikimedia.org/T137279#2363409 (10hashar) Need to further bump it.. [21:24:25] 06Release-Engineering-Team, 06Developer-Relations (Jul-Sep-2016), 15User-greg: Write blog post highlighting recent Phabricator improvements - https://phabricator.wikimedia.org/T137727#2477521 (10mmodell) https://phabricator.wikimedia.org/phame/blog/feed/1/ It doesn't seem to be referenced in the html header... [21:25:11] Hey CI people, I just had a patch fail two times in a row with 21:23:24 stderr: 'fatal: HEAD not found below refs/heads!' [21:25:28] I'll try it again in half an hour or so and if it fails again I'll file a bug [21:26:06] ostriches hashar ^^ [21:26:25] Could it be that zuul on jessie needs updating. [21:28:09] RoanKattouw: I should've fixed that earlier. [21:28:14] Link to the change/job? 
[21:28:25] ostriches: https://gerrit.wikimedia.org/r/298125 [21:28:40] One of the jenkins runs is still pending but I can see it's already failed with the same error again [21:29:16] Thanks for jumping on this, I commented here before filing a bug in the hopes that somebody would recognize it [21:29:33] I totally thought I fixed this already [21:29:35] *stabs* [21:29:47] 06Release-Engineering-Team, 06Developer-Relations (Jul-Sep-2016), 15User-greg: Write blog post highlighting recent Phabricator improvements - https://phabricator.wikimedia.org/T137727#2477540 (10Dzahn) :) ->> https://gerrit.wikimedia.org/r/#/c/299867/1/modules/planet/templates/feeds/en_config.erb [21:31:02] (03PS1) 10Hashar: Merge branch 'debian/precise-wikimedia' into debian/jessie-wikimedia [integration/zuul] (debian/jessie-wikimedia) - 10https://gerrit.wikimedia.org/r/299869 [21:31:04] RoanKattouw: So what happened is I pruned a ton of ancient wmf/* branches and tags that we didn't use. [21:31:24] And so some zuul repos complained when their origins/wmf/* went missing [21:31:32] But I did `git fetch -p` on everything [21:31:36] So it shouldddd have fixed. [21:31:38] But clearly not! [21:31:41] hashar :) [21:32:12] ostriches i think zuul may save in cache. [21:33:24] In what cache? [21:33:31] paladox: a quick and dirty zuul package for Jessie https://people.wikimedia.org/~hashar/debs/zuul_2.1.0-151-g30a433b-wmf4jessie1/ [21:33:35] https://people.wikimedia.org/~hashar/debs/zuul_2.1.0-151-g30a433b-wmf4jessie1/zuul_2.1.0-151-g30a433b-wmf4jessie1_amd64.deb [21:33:37] Oh thanks [21:33:38] :) [21:33:46] paladox: almost guaranteed to NOT work [21:33:49] Oh [21:33:52] hashar i can test it [21:33:56] now [21:33:59] I have just merged the precise-wikimedia branch to jessie-wikimedia branch [21:34:03] :) [21:34:09] fixed a few conflicts in debian/control and debian/changelog [21:34:10] then built it [21:34:16] I havent even checked the build output [21:34:21] Maybe i will pickup the pbr error [21:34:32] which pbr error? [21:34:40] Well i found pbr 1.* [21:34:46] caused it to stop working [21:34:53] with the zuul commands [21:34:53] bah [21:35:24] Yep, ive logged in now. [21:35:30] and wget your deb :) [21:35:37] (03CR) 10Hashar: "https://people.wikimedia.org/~hashar/debs/zuul_2.1.0-151-g30a433b-wmf4jessie1/" [integration/zuul] (debian/jessie-wikimedia) - 10https://gerrit.wikimedia.org/r/299869 (owner: 10Hashar) [21:36:07] hashar: Thoughts on https://integration.wikimedia.org/ci/job/mediawiki-extensions-hhvm/69342/console ? [21:36:21] I already `git fetch --purge` all the bad refs, but it still happens. [21:36:26] (this is on scandium) [21:36:36] hashar [21:36:37] https://people.wikimedia.org/~hashar/debs/zuul_2.1.0-151-g30a433b-wmf4jessie1/zuul_2.1.0-151-g30a433b-wmf4jessie1_amd64.deb [21:36:44] Unpacking zuul (2.1.0-151-g30a433b-wmf4jessie1) over (2.1.0-151-g30a433b-wmf1jessie1) ... [21:36:44] dpkg: dependency problems prevent configuration of zuul: [21:36:44] zuul depends on python-gear (>= 0.5.4); however: [21:36:44] Package python-gear is not installed. [21:36:44] dpkg: error processing package zuul (--install): [21:36:45] dependency problems - leaving unconfigured [21:36:47] Processing triggers for systemd (215-17+deb8u4) ... [21:36:49] Errors were encountered while processing: [21:36:51] zuul [21:36:54] paladox: I am going to polish up the jessie version so it is suitable to run a zuul-server and will test that on a labs instance against gerrit-new [21:36:55] it never copy's what i wont it to copy. 
[21:37:12] Ok, thanks, ive installed it on my test install [21:37:27] It seems to be getting Package python-gear is not installed. [21:37:42] hashar are you going to polish it up tonight or tomarror :) [21:37:51] ostriches: zuul-cloner does execute a "git remote update --prune" or something like "git remote prune origin" [21:38:09] ostriches: then 00:00:16.804 stderr: 'fatal: HEAD not found below refs/heads!' [21:38:11] that is a mystery [21:38:12] Well that's clearly not helping. [21:38:18] looks like the HEAD symbolic ref is gone somehow [21:38:22] It is not. [21:38:42] Wait.... [21:38:48] It went away on scandium? [21:38:51] How on earth would that happen [21:39:02] maybe the copy of mediawiki/extensions/MobileApp on that slave / workspace is borked somehow ? [21:39:43] Same on all repos. [21:39:46] zuul@scandium /srv/ssd/zuul/git/mediawiki/extensions/CirrusSearch ((17425d2...))$ git symbolic-ref HEAD [21:39:47] fatal: ref HEAD is not a symbolic ref [21:39:55] Why on earth would HEADs disappear? [21:40:50] hashar i reverted back to the package i created zuul_2.1.0-151-g30a433b-wmf1jessie1_amd64.deb [21:40:59] The weirder part (to me) is why it's trying to prune refs that are already gone. [21:41:06] eg: git branch -d -r origin/wmf/1.23wmf14 origin/wmf/1.23wmf15 origin/wmf/1.23wmf16 origin/wmf/1.23wmf17 origin/wmf/1.23wmf18 origin/wmf/1.23wmf19 origin/wmf/1.23wmf20 origin/wmf/1.23wmf21 origin/wmf/1.23wmf22 origin/wmf/1.24wmf1 origin/wmf/1.24wmf10 origin/wmf/1.24wmf11 origin/wmf/1.24wmf12 origin/wmf/1.24wmf13 origin/wmf/1.24wmf14 origin/wmf/1.24wmf15 [21:41:06] origin/wmf/1.24wmf16 origin/wmf/1.24wmf17 origin/wmf/1.24wmf18 origin/wmf/1.24wmf19 origin/wmf/1.24wmf2 origin/wmf/1.24wmf20 origin/wmf/1.24wmf21 origin/wmf/1.24wmf22 origin/wmf/1.24wmf3 origin/wmf/1.24wmf4 origin/wmf/1.24wmf5 origin/wmf/1.24wmf6 origin/wmf/1.24wmf7 origin/wmf/1.24wmf8 origin/wmf/1.24wmf9 origin/wmf/1.25wmf1 origin/wmf/1.25wmf10 [21:41:06] origin/wmf/1.25wmf11 origin/wmf/1.25wmf12 origin/wmf/1.25wmf13 origin/wmf/1.25wmf14 origin/wmf/1.25wmf15 origin/wmf/1.25wmf16 origin/wmf/1.25wmf17 origin/wmf/1.25wmf18 origin/wmf/1.25wmf19 origin/wmf/1.25wmf2 origin/wmf/1.25wmf20 origin/wmf/1.25wmf21 origin/wmf/1.25wmf22 origin/wmf/1.25wmf23 origin/wmf/1.25wmf24 origin/wmf/1.25wmf3 origin/wmf/1.25wmf4 [21:41:06] origin/wmf/1.25wmf5 origin/wmf/1.25wmf6 origin/wmf/1.25wmf7 origin/wmf/1.25wmf8 origin/wmf/1.25wmf9 origin/wmf/1.26wmf1 origin/wmf/1.26wmf10 origin/wmf/1.26wmf11 origin/wmf/1.26wmf12 origin/wmf/1.26wmf13 origin/wmf/1.26wmf14 origin/wmf/1.26wmf15 origin/wmf/1.26wmf16 origin/wmf/1.26wmf17 origin/wmf/1.26wmf18 origin/wmf/1.26wmf19 origin/wmf/1.26wmf2 [21:41:06] origin/wmf/1.26wmf20 origin/wmf/1.26wmf21 origin/wmf/1.26wmf22 origin/wmf/1.26wmf23 origin/wmf/1.26wmf24 origin/wmf/1.26wmf3 origin/wmf/1.26wmf4 origin/wmf/1.26wmf5 origin/wmf/1.26wmf6 origin/wmf/1.26wmf7 origin/wmf/1.26wmf8 origin/wmf/1.26wmf9 origin/wmf/1.27.0-wmf.1 origin/wmf/1.27.0-wmf.10 origin/wmf/1.27.0-wmf.11 origin/wmf/1.27.0-wmf.12 [21:41:07] origin/wmf/1.27.0-wmf.13 origin/wmf/1.27.0-wmf.14 origin/wmf/1.27.0-wmf.15 origin/wmf/1.27.0-wmf.16 origin/wmf/1.27.0-wmf.17 origin/wmf/1.27.0-wmf.18 origin/wmf/1.27.0-wmf.19 origin/wmf/1.27.0-wmf.2 origin/wmf/1.27.0-wmf.20 origin/wmf/1.27.0-wmf.21 origin/wmf/1.27.0-wmf.22 origin/wmf/1.27.0-wmf.23 origin/wmf/1.27.0-wmf.3 origin/wmf/1.27.0-wmf.4 [21:41:07] origin/wmf/1.27.0-wmf.5 origin/wmf/1.27.0-wmf.6 origin/wmf/1.27.0-wmf.7 origin/wmf/1.27.0-wmf.8 origin/wmf/1.27.0-wmf.9 origin/wmf/1.27.0-wmf12 [21:41:07] Im not sure 
why it is hitting the 'python-gear is not installed' error [21:41:08] oh [21:41:26] Where did you get that list? Silly zuul. [21:41:39] ostriches delete all the repos. They will be recloned by zuul when tests are run on them [21:41:43] the lame way to fix it would be to ditch it out on integration-slave-trusty-1016: rm -fR /mnt/jenkins-workspace/workspace/mediawiki-extensions-hhvm/src/extensions/MobileApp [21:42:00] ostriches: possibly `git ls-remote origin` ? [21:42:33] hashar thank you for working on the jessie package :) [21:42:52] paladox: python-gear is more recent on Jessie [21:42:59] oh [21:43:05] paladox: so it is not embedded in the package I have built. Gotta be installed manually [21:43:09] I just ran a git fetch -p and they all went byebye :p [21:43:11] oh [21:43:15] paladox: usually I do: dpkg -i zuulxxxx.deb [21:43:20] Oh yep, I did that [21:43:27] paladox: then it complains about missing deps and I resolve them via: apt-get install [21:43:33] Oh [21:43:42] But can't we install it through pip [21:43:44] it is because dpkg has no way to fix the missing dependency / install other packages [21:43:52] pip is forbidden on production [21:43:55] Oh [21:44:08] can we add it to the requirements.txt file [21:44:20] RoanKattouw: Should be fixed now for you, I hope. [21:44:25] So I apt-get install python-gear [21:44:26] ostriches: maybe the refs were not gone on that specific (slave, workspace) [21:44:44] ostriches: on the integration-slave etc slaves, the workspaces are kept between builds [21:44:56] I've done apt-get install python-gear now [21:44:59] paladox: yeah or apt-get install python-gear indeed [21:45:01] I will install your package [21:45:03] thanks [21:45:08] try it out :) [21:45:12] Yep [21:45:16] paladox: will test it myself tomorrow [21:45:16] I will do that now [21:45:17] :) [21:45:18] Ok [21:45:31] my goal is to have a zuul package for jessie that is suitable to run as a zuul server in prod [21:45:37] Oh :) [21:45:39] yay [21:45:39] ideally by end of tomorrow evening [21:45:43] Oh yay [21:46:10] Ok I'm applying it now [21:46:22] yay it installed [21:46:36] https://gerrit-zuul.wmflabs.org/ [21:46:42] so far so good ^^ it has turned on now [21:46:54] let's recheck one of my patches [21:47:26] yay it works [21:47:33] http://gerrit-jenkins.wmflabs.org/job/composer-gerrit-test/42/ [21:47:37] hashar ^^ [21:48:28] paladox: bah python-gear is outdated on Jessie :) gotta embed it [21:48:36] Oh [21:48:57] hashar thank you for working on it :) :) [21:49:20] And what do you mean by embed it [21:49:22] hashar ^^ [21:49:39] paladox: instead of having zuul depend on python-gear [21:49:47] the zuul .deb package has a copy of python-gear [21:49:50] Oh yep, oh [21:49:51] ) [21:49:53] :) [21:49:56] it is really a terrible process [21:50:01] Yep [21:50:08] They should do it how Windows does it [21:50:37] or better yet turn http://gerrit-jenkins.wmflabs.org/job/composer-gerrit-test/42/ into a linux installer [21:50:39] (03PS2) 10Hashar: Merge branch 'debian/precise-wikimedia' into debian/jessie-wikimedia [integration/zuul] (debian/jessie-wikimedia) - 10https://gerrit.wikimedia.org/r/299869 [21:50:54] paladox: see diff of PS1..PS2 on https://gerrit.wikimedia.org/r/299869 [21:50:56] ostriches: Looks like it's fixed now, thanks.
The change isn't merged yet but the affected job succeeded [21:51:02] Ok thanks [21:51:26] I think you may have to put a space between python-gear and # [21:52:18] paladox: I have updated https://people.wikimedia.org/~hashar/debs/zuul_2.1.0-151-g30a433b-wmf4jessie1/zuul_2.1.0-151-g30a433b-wmf4jessie1_amd64.deb [21:52:26] Thank you [21:52:29] I will apply that now [21:52:30] :) [21:53:13] ok applying it now [21:53:34] yay it installed, now let's recheck a patch [21:54:05] yay it works [21:54:05] http://gerrit-jenkins.wmflabs.org/job/composer-gerrit-test/44/ [21:54:08] hashar ^^ [21:54:09] RoanKattouw: Awesome yay! [21:54:22] (I went ahead and cleaned up similar scenarios on 1016, 1017, 1013, 1014) [21:54:58] ostriches I'm wondering whether this change would fix the issue https://github.com/openstack-infra/zuul/commit/d085d2d597aba3f7fadf535c846c9b0513991911 for you [21:55:08] https://github.com/openstack-infra/zuul/commit/6c54b531a36644d53ae054d3fdf846fb2e538677 [21:56:24] It seems to say switching to fetch may get us current branch information. [21:56:28] So that may fix it. [21:56:55] I guess it depends on what origin.fetch() vs. origin.update() do :) [21:56:57] I only know git :p [21:59:09] Oh [21:59:17] No, they're git commands [21:59:30] ostriches this http://stackoverflow.com/questions/2688251/what-is-the-difference-between-git-fetch-origin-and-git-remote-update-origin [21:59:40] is what it refers to in the commit on why they're switching [21:59:52] paladox: did you get the zuul-merger instance upgraded as well? [22:00:06] hashar you mean jenkins-slave-ci [22:00:09] ? [22:00:16] I upgraded gerrit-test [22:00:29] well I guess that is where the zuul-merger is :) [22:00:36] Ok [22:00:40] I will do that now [22:00:54] I will do it now [22:00:54] :0 [22:00:56] :) [22:01:36] * paladox sshs into jenkins-slave-01, PC slow due to the very hot weather the UK is having. [22:02:03] ostriches: what Paladox said :] [22:02:19] Yay, so what I said is correct. [22:02:24] that doing that will fix the issue [22:02:25] :) [22:02:28] ? [22:02:36] Ok I'm applying it now [22:02:36] :) [22:02:49] Ok I've applied it now [22:02:54] now time to recheck a patch [22:02:55] * ostriches reads GitPython source to see what these functions actually do :) [22:02:55] :0 [22:02:56] :) [22:03:33] hashar yay http://gerrit-jenkins.wmflabs.org/job/composer-gerrit-test/45/ [22:03:37] but it seems at the start [22:03:45] there was a second where it was in the check pipeline [22:03:54] then it went into the test pipeline a second later. [22:04:38] ostriches, they do the same thing, just one allows you to get more things like current branch information or tags [22:05:23] Hehe, their fix doesn't do what I want :p [22:05:30] And it basically does nothing in their case either. [22:05:35] origin.update() ==> git remote update [22:05:40] I want `git fetch --tags --prune` :) [22:05:41] origin.fetch() ==> git fetch [22:05:47] origin.fetch(tags=true) ==> git fetch --tags [22:05:56] hashar: That's what one can assume, but until you look at the source, who can be sure? :P [22:06:13] ostriches they say that here https://github.com/openstack-infra/zuul/commit/6c54b531a36644d53ae054d3fdf846fb2e538677#diff-d8d0a00bae440342b1584106cb214cce [22:06:16] in the comment msg [22:06:17] "git remote update" is exactly the same as "git fetch --all" (same code path, one is an alias of the other) [22:06:22] so yeah corner case :( [22:06:56] ostriches: GitPython is a bit of a crazy module.
It was a single man effort that has been left idling for a couple eyars at least [22:06:57] Oh [22:07:20] paladox: Nobody says anything about pruning in that commit message :) [22:07:20] but now it is I think actively maintained by a team https://github.com/gitpython-developers/GitPython [22:07:35] https://github.com/gitpython-developers/GitPython/graphs/contributors [22:07:38] ostriches no but they say something about getting current branch information [22:07:45] you can see the idling area in 2012 / 2013 [22:07:53] including tags [22:08:01] Since we sometimes want correct tag information as well as correct [22:08:01] branch information for our repos, switch from update to fetch, with [22:08:01] the '--tags' argument. [22:08:09] ostriches ^^ thats a reference [22:08:09] ostriches: upstream is definitely willing to take patches :] [22:08:22] paladox: Yeah, it grabs latest new branches and tags. [22:08:28] But it won't prune by default. [22:08:31] Hence --prune [22:08:35] (which is my problem :)) [22:08:49] ostriches oh yep, but i guess we can submit that patch to openstack [22:08:52] and apply it here [22:08:54] to test [22:09:47] I think I have an Openstack account, it's been a minute. [22:09:49] lol [22:10:07] Hmmm UbuntuOne/Launchpad. [22:10:10] I can't remember my creds. [22:10:24] Oh [22:14:01] I could've sworn I signed a CLA with them before. [22:15:24] Oh [22:15:53] ostriches you can always upload the patch to the zuul repo we have and then test it, then submit it to openstack if you want [22:17:05] It's kinda hard to hit the edge case since I already cleaned things up [22:19:52] oh [22:20:38] hashar yay it works, you built a correct package. [22:20:39] :) [22:20:45] tested against gerrit 2.12.3 [22:20:50] and gerrit 2.8.1 [22:21:23] ostriches and hashar plus i got multi connections working as long as they are in a different pipelin [22:21:52] * paladox carn't wait until ios 10 public beta number 2 is released. [22:23:15] hashar did you hear apply pay launched in france today (http://9to5mac.com/2016/07/18/apple-pay-officially-launches-in-france/) [22:24:32] ostriches: as a wmf employee there is a corporate cla as well IIRC you want to check with legal [22:25:17] :) [22:25:20] (and no I cant upload a patch on your behalf, that breaks the CLA :D ) [22:25:36] paladox: awesome thank you for the test! [22:25:43] Your welcome :) [22:25:52] that labs setup is going to be quite useful [22:25:58] Yep [22:26:11] jenkins 2.* seems to be really good [22:26:19] as for Apple Pay [22:26:31] Yep [22:26:31] well Apple + my money wallet --> no no no [22:26:40] Haha. [22:26:57] Apple pay is not you paying apple actually [22:27:05] nothing is save on apple servers either [22:27:09] I am 100% sure they have a ton of creative ways to vacuum clean my deposit without me even figuring it out [22:27:22] actually did you know your credit card is not saved on your phone. [22:27:35] There is a hash key generated that can be destroyed anytime [22:27:42] ha [22:28:39] I have (****) pounds in my itunes account numbers are hidden to prevent any bot from hacking in [22:28:51] Let me find the number and say it in french [22:29:12] hashar deux cent [22:29:19] Apple Pay is scary [22:29:29] Oh yep. [22:29:41] it basically replace your visa/mastercard [22:29:47] well the physicall device [22:29:48] Yep [22:29:52] well the physicall card [22:29:59] But no one can still your money not even apple. 
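To tie the GitPython discussion above (origin.update() versus origin.fetch()) to concrete calls, here is a small illustrative sketch, not the real Zuul merger code; the repository path and branch name are just examples.

    # GitPython equivalents of the git commands being compared above.
    import git

    repo = git.Repo('/srv/ssd/zuul/git/mediawiki/extensions/CirrusSearch')
    origin = repo.remotes.origin

    # origin.update() runs "git remote update", i.e. "git fetch --all":
    # it picks up new refs but never drops remote branches deleted on the server.
    origin.update()

    # The upstream commit switches to fetch with --tags; GitPython passes extra
    # keyword arguments through as command-line flags, so adding prune=True
    # gives the "git fetch --tags --prune" that is wanted here.
    origin.fetch(tags=True, prune=True)

    # The symptom seen on scandium was HEAD no longer being a symbolic ref,
    # which is what produces "fatal: HEAD not found below refs/heads!".
    # Pointing it back at a real branch clears that state.
    if repo.head.is_detached:
        repo.git.symbolic_ref('HEAD', 'refs/heads/master')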
[22:30:03] but in the end it is still paying with your visa/mastercard number [22:30:06] so really [22:30:08] Due it stored on a special secure chip [22:30:29] that is something that has essentially no cost to them but they share a 0.02 fee ? eeeeeek [22:30:43] The banks have to pay for apple to support them [22:31:05] But i have deux cent in my itunes account. [22:31:23] * paladox uses a special translator not saying so bots carn't figure it out. [22:31:34] so the thing is [22:31:39] Yep [22:31:49] in France our debit cards have a contactless chip already [22:31:56] Oh, they do that here [22:32:11] so I am not sure I see the point of Apple Pay. But the Jobs believers must have a good reason [22:32:14] but what is so funny they want customers to use the less secure contactless cards [22:32:41] yeh since apple is more secure. Nothing leavs your device from the wallet app i doint think [22:32:50] I have my subway card in my wallet. [22:33:00] including having cinemar tickets, so handy. [22:33:03] yeah there are some more layers of security [22:33:12] Yep [22:33:15] which is really just [22:33:23] Actually, I don't think that's the right code path to fix. [22:33:23] Yep [22:33:27] oh [22:33:29] I'm just gonna file a bug upstream [22:33:32] ok [22:33:36] thanks [22:33:37] replacing the stupid 90's (card #  + 4 numbers pin code) [22:33:51] with a random one time use card #  + a fingerprint [22:33:51] Oh in america they just begun using pin numbers [22:33:55] oh [22:34:04] I use my figerprint on my iphone [22:34:08] it is so easy [22:34:15] plus no one know my password then. [22:34:48] My iphone has the find my iphone turned on, so if anyone steals it i will just turn on the find my iphone which makes a loud noise, then i will disable it. [22:34:56] true, chips on ATM cards are new here. the local Americans all complain that it takes longer at the store to pay now [22:35:08] which isnt wrong.. but cash is fastest [22:35:39] Oh ha, no it dosent. All the brits and french and not sure other countrys do it are use to it. It's been used since 19** something [22:35:49] mutante it is the more secure option [22:36:02] and will bring down stolen credit card crimes [22:36:15] well, if you compare swiping the magnetic strip to inserting the chip [22:36:32] it does take a couple seconds [22:36:36] Yep [22:36:39] or their POS sucks [22:36:42] mutante: come back to the modern side of the world :] [22:36:45] or their connection to the bank [22:36:53] But it means less likly someone is going to steal your credit card [22:37:04] hashar: lol, i actually write checks now [22:37:18] mutante why not use contactless, that swipping. [22:37:26] mutante: US still has checks ? [22:37:34] mutante i doint think checks are in use here any more [22:37:43] hashar: yea, i just wrote one to my own child [22:37:45] you can still get them but unlikly places accept checks [22:37:52] oh [22:37:59] hashar: also common to pay your rent with check still [22:38:07] funny [22:38:11] now they switched to rentcafe.com [22:38:18] Oh, but it takes along time to clear a check [22:38:24] which is a 3rd party, only for the rent payment [22:38:40] because general payment from account to account is still not a thing [22:39:03] * paladox giving french ago salut hasher [22:39:27] paladox: yea, it sucks too if you write the check.. 
you dont know when they will post it [22:39:33] Yep [22:39:52] that is the old fasion way of getting money [22:40:48] Mon compte itunes a deux cents livres [22:40:52] hashar ^^ french [22:40:56] :) [22:41:00] mutante ^^ [22:41:14] mutante: rentacafe is quite nice. I am going to relocate to Charlotte, NC :) [22:41:22] ha [22:41:55] But arent bussnesses pulling out of nc [22:42:01] hashar: oooh.. really? with the whole family [22:42:01] due to the strict law they have [22:42:24] mutante: na I am kidding :] Just that house rental prices looks reasonable there [22:42:32] hashar yep [22:42:39] you can own alot of land [22:42:43] in florida [22:42:51] hehe, i did not even use the search feature [22:42:55] which is cheaper then buying a house here [22:42:59] i just login as resident [22:43:01] the land includes big houses [22:43:06] and then pay my rent there [22:43:12] og [22:43:13] oh [22:43:26] paladox: yea, land in florida.. haha [22:43:33] ha [22:43:46] It is really hot in florida. [22:43:50] I have set up a monthly recurring wire transfer to my landlord. Done ! [22:43:53] timeshare opportunities [22:43:57] Ha [22:44:11] My gran and grandad own there house [22:44:28] they doint pay rent, or morgage. but pay taxes on the house [22:44:48] hashar: yea, so i go to online banking.. and there is a service called "bill pay", you can setup automatic payment to people every month. yea.. what does it mean? the bank sends the paper check in the mail _for you_ :) [22:44:49] they also have a deal that from 1am to 7am they get discounted electricity. [22:45:49] hashar im guessing we will do more zuul things tomarror, me trying english -> french :) [22:46:00] hashar im devinant nous allons faire des choses plus Zuul tomarror , moi essayant anglais - > français :) [22:46:05] mutante ^^ [22:46:57] paladox: i am guessing "ce truc zuul" [22:47:06] whenever it's a "thing" it becomes a truc :p [22:47:10] that's what i noticed [22:47:30] out for bed / night etc :) [22:47:38] sorry gotta sleep really [22:47:41] Hamutante this thing Zuul [22:47:42] Ok [22:47:43] good night hashar [22:47:43] hashar, paladox: https://storyboard.openstack.org/#!/story/2000678 [22:47:49] ostriches thanks [22:47:52] (not really urgent, but filed for them) [22:48:12] * paladox opens up microsoft edge [22:48:20] * paladox internet explorer dosent work with that website [22:48:34] ostriches ^^ [22:48:50] paladox: "thingamajig" https://en.wiktionary.org/wiki/truc#French [22:48:59] Oh [22:49:29] * paladox uses google translator [22:51:41] Mutant Salut, moi aime la nouvelle mise à jour de Gerrit 2.12 , me demande ce que le temps aujourd'hui la date de la mise à niveau happend au par :) autruches [22:51:47] mutant = mutante [22:51:51] sorry missplet the name [22:51:59] Oh wait french is mutant [22:52:09] autruches = ostriches [22:52:51] Mon français pas très bien. [22:53:20] ostriches oh you speak french (My French very well.) [22:54:06] * paladox only knows a few words, like bonjour, obvour, and some numbers but carn't spell them [22:54:12] ostriches ^^ [22:55:35] Bonjour. [22:55:55] That's one of the only words I can say in French [22:57:27] paladox: I speak french poorly, heh. "My french is not very good/well" [22:57:38] I know enough to make myself look like an idiot :p [22:58:07] Oh [22:58:17] ha [22:58:24] I carn't speak french correctly either. [22:59:28] The only two languges i know are english (british) and english (america) due to me being an american citizen since i was born there, But also am british, i hold two passport :). 
Im british and american. [22:59:31] ostriches ^^ [22:59:57] and yes there both actually a lanaguge just some words are spelt differently or mean something differently. [23:00:29] Like cookie in american means biscuit and in british actually means a biscuit type not a whole biscuit . [23:00:50] Plus you guys spell color wrong :p [23:00:51] lolol [23:01:17] ha [23:01:38] yeh we spell it colour but seems i always spell it color [23:02:13] i mix up my english, mum is spelt mom in american. [23:02:14] I'm with you guys on theatre though, way nicer than theater. [23:02:20] ha [23:03:46] lucky thing though i never knew until last year though, president lincoln family came from northampton, with there house being 30 mins to an hour from where i live [23:04:13] http://www.sulgravemanor.org.uk/ [23:04:20] ostriches ^^ [23:05:49] oh wrong president [23:05:56] president george washington [23:06:03] ostriches ^^ [23:08:01] https://www.google.co.uk/maps/dir/Northampton/Sulgrave/@52.1617546,-1.1012397,31531m/data=!3m1!1e3!4m13!4m12!1m5!1m1!1s0x487704236e4aa273:0xcdf495d0d9e86209!2m2!1d-0.902656!2d52.240477!1m5!1m1!1s0x48772219fa70dafd:0x460afa1ed661dc73!2m2!1d-1.1866413!2d52.1037936 [23:08:18] i live really near northampton [23:08:26] ostriches ^^ [23:10:06] :) [23:10:19] (03CR) 10Paladox: [C: 031] "thanks." [integration/zuul] (debian/jessie-wikimedia) - 10https://gerrit.wikimedia.org/r/299869 (owner: 10Hashar) [23:13:00] paladox: i dont have a manor here, we have this https://www.mysteryspot.com/ [23:13:14] ha [23:14:58] we have drayton manor https://www.draytonmanor.co.uk/ [23:15:21] and alton towers https://www.altontowers.com/ [23:15:27] mutante ^^ [23:17:49] ostriches but seems i use american english the most [23:18:16] except from saying the garage which i use the british word not the american one [23:19:54] ostriches do you have a date for when the upgrade for gerrit will happen please? [23:20:43] Nope. [23:20:52] As soon as I know I'll announce. [23:21:02] ok [23:21:05] http://news.sky.com/story/donald-trump-secures-republican-nomination-10506727 [23:22:35] 06Release-Engineering-Team (Deployment-Blockers), 05Release: MW-1.28.0-wmf.10 deployment blockers - https://phabricator.wikimedia.org/T139211#2477977 (10demon) 05Open>03Resolved [23:29:13] 06Release-Engineering-Team (Deployment-Blockers), 10EventBus, 05Release, 07Wikimedia-log-errors: Regression: "Unable to deliver event: 400: 0 out of 1 events were accepted." - https://phabricator.wikimedia.org/T140848#2477982 (10demon) [23:30:30] 06Release-Engineering-Team (Deployment-Blockers), 10EventBus, 07Regression, 05Release, 07Wikimedia-log-errors: Regression: "Unable to deliver event: 400: 0 out of 1 events were accepted." - https://phabricator.wikimedia.org/T140848#2478000 (10Danny_B) [23:31:30] 06Release-Engineering-Team (Deployment-Blockers), 10EventBus, 07Regression, 05Release, 07Wikimedia-log-errors: Regression: "Unable to deliver event: 400: 0 out of 1 events were accepted." - https://phabricator.wikimedia.org/T140848#2478001 (10greg)